A systems-level statute that turns content moderation from a choice into a legal obligation, and turns OFCOM, the UK communications regulator, into the world's most powerful online safety enforcer.
The Act doesn't regulate "the internet" as a monolith. It regulates four kinds of service, defined by what users can do on them, and only when those services have meaningful links with the United Kingdom. Tap any service to inspect.
Every regulated service inherits at least one of three duty regimes, and many inherit all three. They stack. The regime that applies, and the obligations that flow from it, depend on what the service does, who uses it, and how big it is.
The baseline regime. Every regulated user-to-user and search service must assess and mitigate the risk of illegal content, with priority offences from Schedules 5–7 receiving proactive treatment.
Layered on top of the illegal-content regime where a service is likely to be accessed by children. The Act draws a hard line around "primary priority" content: pornography, suicide, self-injury and eating disorder material must be kept away from minors entirely.
The largest services carry obligations the rest don't. Paid-for fraudulent advertising must be prevented and removed; transparency reports become mandatory; user empowerment tools must be offered; news publisher content, journalistic content, and content of democratic importance must be protected.
"Illegal content" is anchored to existing UK criminal law. Three schedules pick out the offences that demand proactive treatment, not just reactive takedown. Click any category to inspect the offences that pull a service into the proactive regime.
Beyond the universal duties, the Act places extra obligations on the services that shape the public square. The thresholds, set by secondary legislation, are size-and-shape tests: how many UK users you have, and what you let them do.
Duties under the Act compound. Every Part 3 service inherits the baseline; services likely accessed by children inherit a second layer; categorised services inherit a third. Pornography providers under Part 5 are governed by a parallel, narrower regime focused entirely on age.
Beyond duties on platforms, the Act creates new criminal offences for the people who use them. These sit alongside, and in some cases replace, older provisions like the Malicious Communications Act and the controversial section 127 of the Communications Act 2003.
Sending a message conveying information the sender knows to be false, intending to cause non-trivial psychological or physical harm, without reasonable excuse. The "knows to be false" bar is deliberately high.
Sending a message conveying a threat of death or serious harm, intending the recipient (or any person) to fear the threat would be carried out. Recklessness as to fear is sufficient.
Sending or showing flashing images electronically with intent to cause harm to a person with epilepsy. Codifies what was previously prosecuted under more general provisions.
Communicating, publishing or displaying content that encourages or assists serious self-harm by another person, intending to do so. A new, free-standing offence.
Sending or giving a photograph or film of any person's genitals, intending to cause alarm, distress or humiliation, or for the purposes of obtaining sexual gratification while reckless as to such effects. Inserted into the Sexual Offences Act 2003.
Sharing, or threatening to share, an intimate photograph or film without consent. Replaces and broadens the older "revenge porn" offence; the base offence no longer requires proof of intent to cause distress.
The Act received Royal Assent in 2023, but it switched on in stages, each gated by OFCOM publishing the codes of practice that tell platforms what "compliance" actually means. We are now mid-rollout.
The Act's financial ceiling, set in Schedule 13, is the greater of £18 million or 10% of qualifying worldwide revenue. But OFCOM's enforcement toolkit goes well beyond money: business-disruption orders can sever a service from its UK customers, and criminal liability reaches all the way up to senior managers.
A guided walk through the Act's logic. Answer up to six questions to identify which regime applies and the duties that flow from it. This is a heuristic, not legal advice — sections 3–5, 95 and Schedule 11 govern the actual classification.
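The guided walk is, at bottom, a decision tree over a handful of yes/no questions. A minimal sketch of that stacking logic, with invented field names and deliberately simplified tests (the statutory classification in sections 3–5, 95 and Schedule 11 is far more nuanced than this):

```python
from dataclasses import dataclass


@dataclass
class Service:
    # Each field is a simplified stand-in for a statutory test.
    user_to_user: bool = False       # users can encounter user-generated content
    search: bool = False             # service is, or includes, a search engine
    publishes_porn: bool = False     # provider-published pornography (Part 5)
    uk_links: bool = True            # "links with the United Kingdom" test
    likely_child_access: bool = False
    categorised: bool = False        # meets a Schedule 11 size-and-shape threshold


def duty_regimes(s: Service) -> list[str]:
    """Heuristic sketch of which duty regimes stack on a service.

    Not legal advice: the real classification is governed by
    sections 3-5, 95 and Schedule 11 of the Act.
    """
    regimes: list[str] = []
    if not s.uk_links:
        return regimes  # outside the Act entirely
    if s.user_to_user or s.search:
        regimes.append("illegal content (baseline)")   # Part 3 baseline
        if s.likely_child_access:
            regimes.append("protection of children")    # second layer
        if s.categorised:
            regimes.append("categorised-service duties")  # third layer
    if s.publishes_porn and not (s.user_to_user or s.search):
        regimes.append("Part 5 age assurance")  # parallel, narrower regime
    return regimes
```

For example, `duty_regimes(Service(user_to_user=True, likely_child_access=True))` returns the baseline plus the children's layer, mirroring how the duties compound rather than substitute for one another.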