Online Safety Act 2023 · c. 50
Royal Assent · 26 Oct 2023
241 sections · 17 schedules
Regulator · OFCOM

Britain's first duty of care for the open internet.

A systems-level statute that turns content moderation from a choice into a legal obligation, and turns OFCOM, the communications regulator, into one of the world's most powerful online safety enforcers.

4 service types
User-to-user, search, combined, Part 5
3 categories
Cat 1 · Cat 2A · Cat 2B
£18M / 10%
Maximum fine, whichever is higher
17 schedules
Including 130+ priority offences
§ 01 · Sections 3–5

Scope decides everything.

The Act doesn't regulate "the internet" as a monolith. It regulates four kinds of service, defined by what users can do on them, and only when those services have meaningful links with the United Kingdom.

User-to-user service
Section 3(1) · Most regulated
Part 3
Search service
Section 3(4) · Includes a search engine
Part 3
Combined service
Section 4(7) · U2U + public search
Part 3
Provider pornographic content
Section 79 · Direct publication
Part 5
⚓ The "links with the UK" test · Section 4(5)–(6)
A service falls into scope where it has a significant number of UK users, where UK users form a target market, or where it is capable of being used in the UK and presents a material risk of significant harm. Geography on the company registration certificate is irrelevant. Reach is what counts.
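The three-limb structure of the test can be sketched as a small piece of boolean logic. This is an illustrative encoding, not statutory language; the field and function names are assumptions made for the sketch.

```python
# Minimal sketch of the "links with the UK" test in s.4(5)-(6) of the
# Online Safety Act 2023. Names and structure are illustrative only.
from dataclasses import dataclass

@dataclass
class Service:
    significant_uk_users: bool      # s.4(5)(a): significant number of UK users
    uk_target_market: bool          # s.4(5)(b): UK users form a target market
    usable_in_uk: bool              # s.4(6): capable of being used in the UK
    material_risk_of_harm: bool     # s.4(6): material risk of significant harm

def has_uk_links(s: Service) -> bool:
    """Any one limb suffices; place of incorporation is never consulted."""
    return (
        s.significant_uk_users
        or s.uk_target_market
        or (s.usable_in_uk and s.material_risk_of_harm)
    )

# A service incorporated abroad but targeting UK users is in scope:
print(has_uk_links(Service(False, True, True, False)))  # True
```

Note that the third limb is conjunctive: mere usability in the UK is not enough without the accompanying material risk of harm.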
§ 02 · The Three Regimes

Three pillars of duty.

Every regulated service inherits at least one of three duty regimes, and many inherit all three. They stack. The regime that applies, and the obligations that flow from it, depend on what the service does, who uses it, and how big it is.

I
Applies to · all Part 3 services

Illegal content

The baseline regime. Every regulated user-to-user and search service must assess and mitigate the risk of illegal content, with priority offences from Schedules 5–7 receiving proactive treatment.

  • Suitable and sufficient illegal-content risk assessment (s.9)
  • Proportionate systems and processes to remove priority illegal content (s.10)
  • Swift takedown when illegal content is detected (s.10)
  • Mandatory CSEA reporting to the National Crime Agency (s.66)
  • User-facing reporting and complaints procedures (ss.20–21)
Section 10 · in force 17 March 2025
II
Applies to · services likely accessed by children

Child safety

Layered on top of the illegal-content regime where a service is likely to be accessed by children. The Act draws a hard line around "primary priority" content: pornography, suicide, self-injury and eating disorder material must be kept away from minors entirely.

  • Children's risk assessment (s.11)
  • Highly effective age assurance for primary priority content (s.12)
  • Mitigation of priority content harmful to children (s.62)
  • Children's access assessment to determine applicability (s.36)
  • Apply terms of service consistently (s.71)
Section 12 · in force 25 July 2025
III
Applies to · Category 1 and 2A services

Fraudulent advertising & transparency

The largest services carry obligations the rest don't. Paid-for fraudulent advertising must be prevented and removed; transparency reports become mandatory; user empowerment, news content, and content of democratic importance must be protected.

  • Prevent paid-for fraudulent advertising (ss.38–39)
  • Annual transparency reports (s.77)
  • User empowerment tools for adults (s.15)
  • Protection of journalistic and news publisher content (ss.18–19)
  • Optional identity verification for adult users (s.64)
Sections 38–39 · phased to mid-2027
§ 03 · Schedules 5, 6 & 7

The priority-offence map.

"Illegal content" is anchored to existing UK criminal law. Three schedules pick out the offences that demand proactive treatment, not just reactive takedown. Click any category to inspect the offences that pull a service into the proactive regime.

SCH.5
Terrorism
SCH.6
Child sexual exploitation & abuse
SCH.7
Suicide & self-harm
SCH.7
Public order & harassment
SCH.7
Drugs & psychoactive substances
SCH.7
Firearms & weapons
SCH.7
Modern slavery & trafficking
SCH.7
Sexual exploitation
SCH.7
Intimate & extreme imagery
SCH.7
Fraud & financial crime
SCH.7
Foreign interference
SCH.7
Immigration offences
§ 04 · Schedule 11 + SI 2025

The categorisation thresholds.

Beyond the universal duties, the Act places extra obligations on the services that shape the public square. The thresholds, set by secondary legislation, are size-and-shape tests: how many UK users you have, and what you let them do.

Category 1

The largest U2U services

34M
monthly active UK users
Plus a content recommender system.
— or —
7M
monthly active UK users
Plus a recommender system and the ability for users to forward or share content.
Category 2A

The largest search engines

7M
monthly active UK users
Of a search engine that is not a vertical search engine (i.e. one limited to a specific topic, theme, or genre).
Brings the additional fraud-advertising and transparency-reporting duties that apply to the very biggest gateway services on the open web.
Category 2B

U2U services with messaging

3M
monthly active UK users
Plus the functionality for users to send direct messages to other users of the same service.
Captures private-messaging-heavy platforms with significant UK reach. Triggers transparency reporting and category-specific record-keeping.
Six-month average
Reg. 6 · SI 2025
User numbers are calculated as the mean number of monthly active UK users across the prior six-month period.
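The averaging rule and the threshold conditions above can be sketched together as a small classifier. The threshold numbers are taken from the cards on this page; the function signatures, field names, and the treatment of exact-boundary values are assumptions for illustration, not the wording of Schedule 11 or the 2025 regulations.

```python
# Hedged sketch of the categorisation tests as summarised on this page.
# Boundary handling (>= vs >) is an assumption of the sketch.
from statistics import mean

def monthly_active_uk_users(last_six_months: list[float]) -> float:
    """Reg. 6: the mean of monthly active UK users over the prior six months."""
    assert len(last_six_months) == 6
    return mean(last_six_months)

def categorise(kind: str, maus: float, *, recommender: bool = False,
               forwarding: bool = False, direct_messages: bool = False,
               vertical_search: bool = False) -> set[str]:
    """A service can satisfy more than one test, so results are a set."""
    cats = set()
    if kind == "u2u":
        if (maus >= 34_000_000 and recommender) or \
           (maus >= 7_000_000 and recommender and forwarding):
            cats.add("Category 1")
        if maus >= 3_000_000 and direct_messages:
            cats.add("Category 2B")
    if kind == "search" and maus >= 7_000_000 and not vertical_search:
        cats.add("Category 2A")
    return cats

maus = monthly_active_uk_users([30e6, 32e6, 35e6, 36e6, 35e6, 36e6])
print(categorise("u2u", maus, recommender=True, direct_messages=True))
```

The six-month mean matters at the margin: a platform that spikes past a threshold for one month may still average below it across the window.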
OFCOM register
Section 95
OFCOM maintains a public register of categorised services. Inclusion on the register is what triggers the category-specific duties.
Emerging services
Section 97
A separate watchlist of services close to Category 1 thresholds, so onboarding to additional duties is not abrupt.
Register publication
July 2026 (planned)
The Wikimedia Foundation's High Court challenge to the categorisation rules was dismissed; OFCOM is expected to publish the register in July 2026.
§ 05 · The Duty Stack

Who owes what.

Duties under the Act compound. Every Part 3 service inherits the baseline; services likely accessed by children inherit a second layer; categorised services inherit a third. Pornography providers under Part 5 are governed by a parallel, narrower regime focused entirely on age.

Baseline · all Part 3

The duties every regulated U2U and search provider owes, regardless of size, audience, or category.
  • Illegal-content risk assessment (s.9, s.26)
  • Safety duties about illegal content (s.10, s.27)
  • Content reporting (s.20, s.31)
  • Complaints procedures (s.21, s.32)
  • Freedom of expression & privacy (s.22, s.33)
  • Record-keeping & review (s.23, s.34)
  • CSEA reporting to the NCA (s.66)

Children · likely accessed

Layered duties triggered when the children's-access test is met (s.37).
  • Children's risk assessment (s.11, s.28)
  • Safety duties protecting children (s.12, s.29)
  • Highly effective age assurance for primary priority content
  • Mitigation of priority content harmful to children
  • Reporting of content harmful to children (s.30)

Category 1 · largest U2U

Additional duties for the most influential user-to-user services.
  • User empowerment assessments & tools (ss.14–15)
  • Content of democratic importance (s.17)
  • News publisher content (s.18)
  • Journalistic content (s.19)
  • Fraudulent advertising (s.38)
  • Identity verification option (s.64)
  • Transparency reports (s.77)

Part 5 · provider pornography

Services that publish their own pornographic content, governed by a single, narrow regime: age.
  • Use highly effective age verification or age estimation (s.81)
  • Ensure children are not normally able to encounter the content
  • Maintain records of compliance steps
  • OFCOM guidance on means of compliance (s.82)
⚡ Stacking, not switching
The duty layers compound. A Category 1 service likely to be accessed by children carries the baseline duties, the children's duties, and the Category 1 duties simultaneously. There is no opt-out and no offsetting. Each new layer is additive.
§ 06 · Part 10

The new speech offences.

Beyond duties on platforms, the Act creates new criminal offences for the people who use them. These sit alongside, and in some cases replace, older provisions like the Malicious Communications Act and the controversial section 127 of the Communications Act 2003.

s.179

False communications

Sending a message conveying information the sender knows to be false, intending to cause non-trivial psychological or physical harm, without reasonable excuse. The "knows to be false" bar is deliberately high.

Up to 51 weeks · summary only
s.181

Threatening communications

Sending a message conveying a threat of death or serious harm, intending the recipient (or any person) to fear the threat would be carried out. Recklessness as to fear is sufficient.

Up to 5 years on indictment
s.183

Flashing images (epilepsy trolling)

Sending or showing flashing images electronically with intent to cause harm to a person with epilepsy. Codifies what was previously prosecuted under more general provisions.

Up to 5 years on indictment
s.184

Encouraging serious self-harm

Communicating, publishing or displaying content that encourages or assists serious self-harm by another person, intending to do so. A new, free-standing offence.

Up to 5 years on indictment
s.187

Cyberflashing

Sending or giving a photograph or film of any person's genitals, intending to cause alarm, distress or humiliation, or for the purposes of obtaining sexual gratification while reckless as to such effects. Inserted into the Sexual Offences Act 2003.

Up to 2 years on indictment
s.188

Sharing intimate images

Sharing, or threatening to share, an intimate photograph or film without consent. Replaces and broadens the older "revenge porn" offence; the threat-to-share variant is new.

Up to 6 months · summary (sharing); up to 2 years (threats)
◇ Repeals · sections 189–190
The Act repeals section 1 of the Malicious Communications Act 1988 and section 127(2)(a)–(b) of the Communications Act 2003 in respect of the conduct now caught by sections 179 and 181. Free-speech advocates secured a narrower, intent-based test in exchange for losing the broader "grossly offensive" offence.
§ 07 · OFCOM's Roadmap

The phased switch-on.

The Act received Royal Assent in 2023, but it switched on in stages, each gated by OFCOM publishing the codes of practice that tell platforms what "compliance" actually means. We are now mid-rollout.

26 OCT 2023
Royal Assent
The Act becomes law. The legal clock begins; commencement orders follow.
17 MAR 2025
Illegal-harm duties live
Following OFCOM's December 2024 codes, the illegal-content regime becomes enforceable across all Part 3 services.
25 JUL 2025
Child safety duties live
Highly effective age assurance becomes mandatory for primary priority content. Pornography platforms onboard age checks.
EARLY 2026
Categorisation begins
OFCOM invites representations from services it believes meet Category 1, 2A, or 2B thresholds. Final guidance on women & girls online published Nov 2025.
● You are here
JUL 2026
Register published
OFCOM publishes the register of categorised services and consults on the Category-specific additional duties.
MID 2027
Full application
Final policy statements on categorised duties; first transparency reports due in summer 2027.
§ 08 · Schedule 13

One headline number, several teeth.

The Act's financial ceiling is set in Schedule 13. But OFCOM's enforcement toolkit goes well beyond money: business-disruption orders can sever a service from its UK customers, and criminal liability reaches all the way up to senior managers.

Headline maximum
£18M or 10% of qualifying worldwide revenue, whichever is greater
Failure to comply with a confirmation decision
Imposed where a provider fails to take the steps OFCOM requires to remedy a breach of an enforceable requirement. The percentage applies to the corporate group, not just the legal entity that operates the service.
Daily rate
£100k per day, or a proportion of QWR, for ongoing non-compliance
Continuing breach
A separate, accumulating penalty intended to make non-compliance economically irrational. Calculated from the date specified in the confirmation decision until compliance is achieved.
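The arithmetic of the two ceilings is simple enough to state directly. A hedged sketch, using the figures quoted on this page; the function names are illustrative and the daily-rate figure is the headline number rather than the full QWR-proportion alternative.

```python
# Sketch of the Schedule 13 penalty ceilings as summarised above.
def max_penalty(qualifying_worldwide_revenue: float) -> float:
    """One-off cap: the greater of £18M and 10% of group-level QWR."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

def continuing_breach(days: int, daily_rate: float = 100_000) -> float:
    """Accumulating penalty from the date set in the confirmation decision."""
    return days * daily_rate

# A group with £2bn QWR faces a ceiling of £200M, not £18M:
print(max_penalty(2_000_000_000))   # 200000000.0
# A month of continued non-compliance adds £3M on top:
print(continuing_breach(30))        # 3000000
```

Because the percentage applies to group revenue, the £18M floor only bites for groups with QWR under £180M; above that, the 10% limb dominates.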
Information offences
£18M or 10% QWR · Sch. 13 para. 6
Failure to respond, or providing false information
Enforceable against the entity. Senior managers can also face up to two years' imprisonment personally for information offences (s.110), and for non-compliance with confirmation decisions (s.138).
Section 144
Service restriction order
A court-granted order requiring ancillary services (payment processors, ad networks) to stop facilitating a non-compliant service.
Section 146
Access restriction order
A court-granted order requiring ISPs and app stores to block UK access to a non-compliant service. The "kill switch".
Section 121
Technology notice
OFCOM may require accredited technology to detect terrorism or CSEA content. The much-debated "scanning" power.
Section 103
Senior manager named
OFCOM can require a service to designate a senior individual personally accountable for compliance with information notices.
◇ Real-world enforcement · early actions
OFCOM has not been timid. By late 2025 it had launched five enforcement programmes and opened 21 investigations. It has issued multi-million-pound fines to adult services that failed to implement age verification, and procedural fines for ignoring information requests, signalling that the procedural ceiling and the substantive ceiling are the same number.
§ 09 · Self-Assessment

Where does your service land?

A guided walk through the Act's logic. Answer up to six questions to identify which regime applies and the duties that flow from it. This is a heuristic, not legal advice — sections 3–5, 95 and Schedule 11 govern the actual classification.
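The walkthrough's logic can be compressed into a short decision function. This is a heuristic mirror of this page's §§ 01–05, with the questions collapsed to five booleans; the parameter names and regime labels are illustrative assumptions, not the tests in sections 3–5, 95 or Schedule 11.

```python
# Hedged sketch of the self-assessment flow described above.
def regimes(uk_links: bool, u2u_or_search: bool, publishes_porn: bool,
            likely_accessed_by_children: bool, categorised: bool) -> list[str]:
    if not uk_links:
        return ["Out of scope"]               # no links with the UK (s.4)
    if publishes_porn and not u2u_or_search:
        return ["Part 5: age assurance duties"]  # provider pornographic content
    layers = []
    if u2u_or_search:                          # Part 3 service
        layers.append("Baseline illegal-content duties")
        if likely_accessed_by_children:        # children's access test
            layers.append("Child safety duties")
        if categorised:                        # on the OFCOM register
            layers.append("Category-specific duties")
    return layers or ["Out of scope"]

# A categorised U2U service likely accessed by children carries all three:
print(regimes(True, True, False, True, True))
```

The list output reflects the stacking principle from § 05: each answered "yes" adds a layer; none of them switches an earlier layer off.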