Regulatory compliance report on OnlyFans under the UK's Online Safety Act. Applicability analysis, obligation mapping, compliance checklist, and recommended actions.
The Act applies. OnlyFans is a "user-to-user service" (Creators upload content encountered by Fans) with "links with the United Kingdom" (it is UK-incorporated with a significant UK user base). No Schedule 1 exemption applies. OFCOM has already exercised jurisdiction under the predecessor video-sharing platform (VSP) regime, including a GBP 1.05 million fine in March 2025.
With 400M monthly Fans and 4.5M Creators, OnlyFans exceeds any plausible threshold for Category 1 designation under Schedule 11. OFCOM has not yet published the formal register, but the scale and risk profile leave no realistic doubt. The business should prepare for the full suite of Category 1 duties now.
Part 5 does not apply to OnlyFans. All pornographic content on the platform is user-generated (uploaded by Creators), which is explicitly excluded from Part 5 by Section 79(7). The platform does not algorithmically surface or recommend explicit content to users; discovery requires exact username search. This is an important distinction: the pornographic content duties under Part 3 (user-generated content) still apply in full, but the standalone Part 5 regime (designed for services that publish their own pornographic content) does not.
The principal obligations relevant to OnlyFans, structured from highest to lowest regulatory priority.
s.11-12, 35-37. OFCOM does not accept that an 18+ policy alone prevents children from accessing a service; a formal children's access assessment is required, and age verification must be "highly effective." The challenge age was misconfigured for over three years (set to 20 instead of the intended 23) and currently stands at 21.
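The configuration drift described above, a challenge age silently set below policy for over three years, is exactly the class of error a scheduled automated audit catches. A minimal sketch follows; `AgeGateConfig` and `POLICY_CHALLENGE_AGE` are illustrative names, not OnlyFans' actual configuration keys:

```python
# Sketch of an automated configuration audit for an age-assurance
# "challenge age" setting. All names here are hypothetical; the real
# system's configuration schema is not public.
from dataclasses import dataclass

POLICY_CHALLENGE_AGE = 23  # intended buffer above the statutory 18+ floor

@dataclass
class AgeGateConfig:
    min_age: int        # hard floor for account creation
    challenge_age: int  # estimated-age threshold that triggers ID checks

def audit_age_gate(cfg: AgeGateConfig) -> list[str]:
    """Return a list of findings; an empty list means the gate passes."""
    findings = []
    if cfg.min_age < 18:
        findings.append(f"min_age {cfg.min_age} below statutory 18")
    if cfg.challenge_age < POLICY_CHALLENGE_AGE:
        findings.append(
            f"challenge_age {cfg.challenge_age} below policy value "
            f"{POLICY_CHALLENGE_AGE} (narrower buffer than intended)"
        )
    return findings

# The misconfiguration described above would be flagged on every run,
# while a compliant configuration produces no findings.
print(audit_age_gate(AgeGateConfig(min_age=18, challenge_age=20)))
print(audit_age_gate(AgeGateConfig(min_age=18, challenge_age=23)))
```

Run on a schedule (and on every configuration change), a check of this shape turns a multi-year silent drift into a same-day alert.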
s.9-10. Formal assessment required covering all priority illegal content. CSEA and NCII are highest-risk categories. DMs and livestreaming are highest-risk functionalities. Algorithmic risk is lower than comparable platforms (no content recommendation for explicit material).
s.66-70. As a UK provider, must report all detected CSEA content to the NCA once s.66 commences. Currently reports to NCMEC (US). Parallel NCA pipeline needed. False information is a criminal offence (up to 2 years).
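Once s.66 commences, the NCA pipeline should not be bolted on as a separate process; a dual-dispatch layer that files the same canonical record with both bodies avoids the two pipelines diverging. A sketch under assumed names (`Report`, `ReportingPipeline`); real submissions would go through authenticated regulator APIs rather than in-memory outboxes:

```python
# Sketch of a dual-dispatch layer filing each detected CSEA report with
# both NCMEC (existing process) and the NCA (once s.66 commences).
# Endpoint names and the report schema are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Report:
    report_id: str
    content_hash: str
    detected_at: str  # ISO 8601 timestamp

@dataclass
class ReportingPipeline:
    # In production these would be authenticated API clients; simple
    # in-memory sinks keep the routing logic testable here.
    ncmec_outbox: list = field(default_factory=list)
    nca_outbox: list = field(default_factory=list)

    def file(self, report: Report) -> None:
        # Both bodies receive the same canonical record; any divergence
        # between the two outboxes is itself a compliance red flag.
        self.ncmec_outbox.append(report)
        self.nca_outbox.append(report)

pipeline = ReportingPipeline()
pipeline.file(Report("r-001", "sha256:ab12...", "2025-06-01T12:00:00Z"))
assert pipeline.ncmec_outbox == pipeline.nca_outbox
```

Keeping one source of truth also simplifies the accuracy obligation: a single record is reconciled once, not twice.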
s.10(5), 71-72. Must separately address terrorism, CSEA, and other priority illegal content. Current Complaints Policy excludes content moderation decisions, directly conflicting with the OSA.
s.100-103. All information provided to OFCOM must be accurate, and a named senior manager may face personal criminal liability for failures, with penalties of up to 2 years' imprisonment. Given the GBP 1.05m fine, this is a high-sensitivity area.
s.77-78. OFCOM will specify exact information requirements per Schedule 8. Internal systems must produce data accurately and on demand. Pre-submission audit process recommended.
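A pre-submission audit can be as simple as refusing to release any Schedule 8 figure until two independent systems of record agree on it. An illustrative sketch; the metric names are placeholders, since OFCOM's actual data points will be specified in its notices:

```python
# Sketch of a pre-submission reconciliation: transparency-report figures
# are released only when two independent internal sources agree. Metric
# names are illustrative placeholders.
def reconcile(source_a: dict[str, int], source_b: dict[str, int]) -> list[str]:
    """Return the metrics on which the two systems of record disagree."""
    mismatches = []
    for metric in sorted(set(source_a) | set(source_b)):
        if source_a.get(metric) != source_b.get(metric):
            mismatches.append(metric)
    return mismatches

moderation_db = {"csea_reports": 120, "takedowns": 4_310}
analytics_warehouse = {"csea_reports": 120, "takedowns": 4_295}

blockers = reconcile(moderation_db, analytics_warehouse)
assert blockers == ["takedowns"]  # do not submit until resolved
```

A hard gate of this kind directly mitigates the s.100-103 accuracy exposure: no figure reaches OFCOM without two systems independently producing it.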
s.22. The UK ID verification regime engages Article 8 ECHR privacy rights. A documented impact assessment is required, particularly given the invasive nature of the verification process.
s.75. Legally required for categorised services. Given 18+ age restriction, practical relevance is low. A proportionate policy is recommended rather than extensive infrastructure.
The OSA's reach extends beyond platforms. OFCOM has powers over ancillary service providers (payment processors, hosting, app stores). Non-compliant services can face service restriction orders and access restriction orders. OnlyFans' partners may seek compliance assurances as part of their own risk management.
| Action | Lead Team(s) | Timeframe |
|---|---|---|
| Audit age verification configuration against OFCOM guidance | Trust & Safety, Engineering | 0-3 mo |
| Establish OFCOM information-request protocol with mandatory internal verification | Legal, Compliance | 0-3 mo |
| Commission formal illegal content risk assessment (s.9) | Trust & Safety, Legal, Product | 0-3 mo |
| Complete children's access assessment (s.35) and commence children's risk assessment | Trust & Safety, Legal | 0-3 mo |
| Restructure Terms of Service and Complaints/Appeals policies | Legal, Policy | 3-6 mo |
| Design NCA reporting pipeline alongside NCMEC process | Trust & Safety, Legal, Engineering | 3-6 mo |
| Document ECHR impact assessment for age verification | Legal, Privacy/DPO | 3-6 mo |
| Develop proportionate deceased child users policy (s.75) | Legal, Customer Support | 3-6 mo |
| Build transparency reporting infrastructure (Schedule 8) | Data/Analytics, Compliance | 6-12 mo |
| Gap analysis against OFCOM codes of practice | Compliance, all teams | 6-12 mo |
| OSA-specific training (criminal liability emphasis) | HR, Legal, Compliance | 6-12 mo |
The OSA requires three assessments: a children's access assessment, an illegal content risk assessment, and a children's risk assessment.
Cross-functional working group (Trust & Safety, Legal, Product, Data, Privacy) with executive sponsor. Consider external specialists given enforcement history.
Map OnlyFans' characteristics: subscription model, no algorithmic recommendation of explicit content, DM/livestreaming, creator ID verification, 400M+ users.
Assess each statutory factor: user base, risk per priority illegal content category, functionality risk (DMs, livestreaming), usage patterns, severity, and existing mitigations.
For an 18+ platform, the critical element is demonstrating age gate effectiveness. If highly effective, residual risk is low. Focus on robustness of age verification.
Written records. Reassessment triggers (new features, OFCOM profile changes, annual minimum). Integrate into product development lifecycle.
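The reassessment triggers above can be encoded directly into the release process, so that a risk-relevant feature change or a stale assessment blocks launch until a new assessment is commissioned. A sketch with illustrative functionality labels and field names:

```python
# Sketch of a reassessment-trigger check wired into product releases:
# a change touching risk-relevant functionality, or an assessment older
# than the annual minimum, flags a new risk assessment. Labels and the
# 12-month floor reflect the triggers described above; names are
# illustrative.
from datetime import date, timedelta

RISK_RELEVANT = {"direct_messages", "livestreaming", "search", "recommendation"}
ANNUAL_MINIMUM = timedelta(days=365)

def reassessment_required(changed_features: set[str],
                          last_assessed: date,
                          today: date) -> bool:
    touches_risk = bool(changed_features & RISK_RELEVANT)
    stale = (today - last_assessed) > ANNUAL_MINIMUM
    return touches_risk or stale

# A livestreaming change triggers reassessment; a cosmetic change does not.
assert reassessment_required({"livestreaming"}, date(2025, 1, 1), date(2025, 2, 1))
assert not reassessment_required({"ui_theme"}, date(2025, 1, 1), date(2025, 2, 1))
# Even with no relevant change, a stale assessment triggers one.
assert reassessment_required(set(), date(2024, 1, 1), date(2025, 6, 1))
```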
Pre-populated from publicly available information; all items should be independently verified against internal records. Status key: ✓ met, ⚠ partial or at risk, ✗ not met, ? unverified.
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| Service Scope | |||
| ✓ | User-to-user service (s.3) | s.3 | Creators upload content encountered by Fans. |
| ✓ | Links with the UK | s.4(5) | UK-incorporated. UK is the home market. |
| ✓ | No Schedule 1 exemption | Sch.1 | No exempt category applies. |
| ✓ | Part 5 not applicable (all pornographic content is user-generated, excluded by s.79(7)) | s.79(7) | Part 3 duties still apply to user-generated pornographic content in full. |
| Categorisation | |||
| ✓ | Category 1 service | s.94-95 | Category 1 given scale (400M Fans, 4.5M Creators). Formal register pending but designation is clear. |
| Children's Access | |||
| ⚠ | Children's access assessment | s.35-37 | 18+ by policy. OFCOM does not treat this as conclusive. |
| ? | Assessment documented | s.36 | VERIFY: Request internal documentation. |
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| ? | Completed illegal content risk assessment | s.9 | VERIFY: No public evidence of completion. |
| ⚠ | Measures to prevent encountering priority illegal content | s.10(2) | Hash-matching for CSEA. Terrorism detection unverified. |
| ✓ | Systems to minimise illegal content duration | s.10(3) | Automated detection + human moderation. |
| ⚠ | Terms separately address terrorism, CSEA, other priority content | s.10(5) | AUP prohibits broadly. Not broken out by statutory category. |
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| ⚠ | Age verification prevents children encountering primary priority content | s.12(3)-(6) | Challenge age was set to 20 (not 23) from 2021 to 2025; now 21. |
| ⚠ | Age verification is "highly effective" | s.12(6) | RISK AREA: Challenge age 21 is narrower buffer than 23. |
| ✓ | Terms indicate 18+ only | s.12(5) | Clear 18+ policy in ToS and AUP. |
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| ✓ | CSEA detection systems (hashing) | s.66 | Hash-matching against established databases. |
| ⚠ | Reports all detected CSEA to NCA | s.66(1) | Currently NCMEC (US). Direct NCA pipeline needed. |
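For context on the hash-matching row above: detection works by comparing a digest of uploaded media against a database of known-CSEA digests. Production systems use perceptual hashes (e.g. PhotoDNA) that survive re-encoding and resizing; the sketch below substitutes plain SHA-256 purely to stay self-contained, and the database entries are placeholders:

```python
# Simplified illustration of hash-matching against a known-hash database.
# Real deployments use perceptual hashing (e.g. PhotoDNA), not SHA-256;
# cryptographic hashing is used here only to keep the sketch runnable.
import hashlib

KNOWN_HASHES = {
    # Illustrative placeholder digest, not a real database entry.
    hashlib.sha256(b"known-bad-sample").hexdigest(),
}

def matches_known_database(media_bytes: bytes) -> bool:
    """True if the upload's digest appears in the known-hash set."""
    return hashlib.sha256(media_bytes).hexdigest() in KNOWN_HASHES

assert matches_known_database(b"known-bad-sample") is True
assert matches_known_database(b"benign upload") is False
```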
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| ✗ | Information to OFCOM is accurate | s.102(8) | CRITICAL: Inaccurate data twice. GBP 1.05M fine. |
| ✗ | Timely notification of inaccuracies | s.102(8) | 18 days to report challenge age error. |
| ✗ | Staff awareness of criminal liability | s.109-112 | Accuracy failures suggest gaps. Mandatory training needed. |
| Status | Requirement | Ref | Notes |
|---|---|---|---|
| ✗ | Information to OFCOM complete and accurate | s.77(4) | CRITICAL: Fine for inaccurate data. |
| ⚠ | Complaints cover moderation decisions | s.21 | Policy explicitly excludes moderation. Conflicts with OSA. |
| ? | Deceased child users policy | s.75 | New requirement. Low practical relevance for 18+ platform. |
| Area | Rating | Key Issues |
|---|---|---|
| Service Classification | Met | Clearly regulated user-to-user service. Category 1. Part 5 does not apply (content is user-generated). |
| Illegal Content | Unverified | Measures exist but documentation unverified. Algorithmic risk is low. |
| Children's Safety | Partial | Age verification exists but "highly effective" standard questionable. |
| CSEA Reporting | Partial | Hashing in place. NCA direct reporting needs verification. |
| Complaints | Partial | Policy excludes moderation decisions, conflicting with OSA. |
| Information Accuracy | Not Met | CRITICAL: GBP 1.05M fine. Internal QA failed. |
| Age Assurance | Partial | Misconfigured 3+ years. Challenge age (21) may be insufficient. |
| Enforcement Readiness | Partial | Direct experience. Criminal liability awareness needs strengthening. |