Europe's rewrite of the rules of the open internet. The DSA layers obligations onto online intermediaries by reach and impact: the bigger your platform, the heavier your duties of transparency, accountability and risk control. A guide for founders, counsel, and anyone trying to read the regulation without reading 102 pages of it.
Unlike the AI Act's risk pyramid, the DSA stacks. Every intermediary inherits a baseline of obligations; hosts add more; online platforms add more again; very large platforms shoulder the full weight. Click any layer to see what attaches at that level.
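The stacking model can be sketched as cumulative sets. This is an illustrative toy, not the DSA's own taxonomy: the duty names below are loose paraphrases of a few representative obligations per tier, chosen for the example.

```python
# Illustrative sketch: each tier inherits every duty of the tiers below it.
# Duty names are paraphrases, not the DSA's legal terms.
BASELINE = {"points of contact", "terms-and-conditions transparency",
            "transparency reporting"}
HOSTING_EXTRA = {"notice-and-action mechanism", "statements of reasons"}
PLATFORM_EXTRA = {"internal complaint handling", "trusted flaggers",
                  "ad transparency", "protection of minors"}
VLOP_EXTRA = {"systemic risk assessment", "independent audits",
              "researcher data access"}

TIERS = ["intermediary", "hosting", "online platform", "vlop"]
EXTRAS = {"intermediary": BASELINE, "hosting": HOSTING_EXTRA,
          "online platform": PLATFORM_EXTRA, "vlop": VLOP_EXTRA}

def obligations(tier: str) -> set[str]:
    """Union of this tier's duties with those of every tier beneath it."""
    duties: set[str] = set()
    for t in TIERS[: TIERS.index(tier) + 1]:
        duties |= EXTRAS[t]
    return duties
```

Because the sets only ever grow, `obligations("vlop")` is a strict superset of every lower tier, which is exactly the "stacking" the diagram shows.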
The DSA's obligations attach to defined service types. Get the type wrong and the wrong rulebook applies. The taxonomy borrows from the e-Commerce Directive and refines it for the platform economy.
Transmits information without selection or modification. Internet access providers, content delivery networks, public WiFi.
Automatic, intermediate, temporary storage performed for the sole purpose of making onward transmission more efficient. Proxy servers, cache servers.
Stores information at the request of a recipient. Cloud and web hosting; everything that holds user content sits on top of this.
A subset of hosting that disseminates information to the public at the request of users. Marketplaces, app stores, social networks, collaborative-economy platforms.
Online platforms or search engines reaching at least 45 million monthly active recipients in the Union, roughly 10% of the EU population. Designated by Commission decision.
A single grid that answers most first-order DSA questions: pick your tier across the top, run down the rows, and you can see the obligations that attach. Filled dots mean the duty applies; outlined dots mean it does not. Read this once and the architecture of the regulation becomes legible.
A single threshold, average monthly active recipients in the EU, separates the ordinary online platform from the systemic one. Cross it, and a different DSA applies: direct Commission supervision, mandatory risk assessment, audits, data access, the works.
Article 33(1): A platform or search engine is designated a VLOP or VLOSE when it has an average of ≥ 45 million active recipients of the service per month in the Union, calculated over the preceding six months.
Once designated by Commission decision, the provider has four months to comply with Section 5 obligations. The threshold is dynamic: falling below it for a full year removes the designation; the Commission also reviews the figure as the EU population shifts.
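The threshold mechanics above reduce to two simple checks, sketched here under one assumption: that "average over the preceding six months" means the arithmetic mean of six monthly figures (the Commission's methodology is more detailed than this toy).

```python
from statistics import mean

# Article 33(1): 45 million average monthly active recipients in the Union.
VLOP_THRESHOLD = 45_000_000

def meets_vlop_threshold(last_six_months: list[int]) -> bool:
    """True if the six-month average of monthly active EU recipients
    is at or above the 45 million mark (simplified arithmetic mean)."""
    assert len(last_six_months) == 6
    return mean(last_six_months) >= VLOP_THRESHOLD

def designation_lapses(last_twelve_months: list[int]) -> bool:
    """Designation is removed only after a full year below the threshold;
    a single month back above it resets the clock."""
    return all(m < VLOP_THRESHOLD for m in last_twelve_months)
```

Note the asymmetry: crossing the threshold is measured over six months, but shedding the designation takes a full year below it.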
VLOPs and VLOSEs must, at least annually, identify, analyse and assess any systemic risks stemming from the design or functioning of their service. The Act enumerates four categories, and the assessment must look at how recommender systems, algorithmic systems, content-moderation systems, applicable T&Cs, and ad systems each contribute.
Including child sexual abuse material, illegal hate speech, terrorist content, IP-infringing material, illegal goods on marketplaces. Spread is the focus, not isolated occurrence.
Charter rights: human dignity, private and family life, personal-data protection, freedom of expression and information, non-discrimination, child rights, consumer protection.
Disinformation campaigns, coordinated inauthentic behaviour, manipulation of public debate or election outcomes, threats to public security.
Including gender-based violence, the protection of public health and of minors, and serious negative consequences to a person's physical and mental well-being.
The DSA's enforcement architecture is dual-track. National regulators handle most of the regulation; the European Commission handles the largest platforms directly. A coordination body sits between them. This is unusual in EU law, and the reason VLOPs negotiate with Brussels rather than capitals.
Exclusive competence over Section 5 obligations of designated very large platforms and search engines. Investigations, requests for information, on-site inspections, interviews, interim measures, binding commitments, non-compliance decisions, periodic penalty payments.
Independent advisory group composed of all national Digital Services Coordinators, chaired by the Commission. Advises on consistent application; coordinates joint investigations; issues opinions, recommendations and guidance; supports cross-border dispute resolution.
One independent authority per Member State. Receives complaints, coordinates with sectoral regulators, supervises providers established in its territory, designates trusted flaggers and out-of-court dispute bodies, has investigative and enforcement powers.
Member States may designate additional authorities (data protection, audiovisual, consumer) under DSC coordination. Cross-border cooperation, mutual assistance, and joint investigations are required by the regulation.
The DSA was adopted faster than the AI Act and applied earlier. The first designated VLOPs felt the regime from late August 2023; the rest of the platform economy followed on 17 February 2024.
Penalties are calibrated to the seriousness of the breach, with fines of up to 6% of worldwide annual turnover. As with the GDPR and the AI Act, turnover-based percentages make the DSA materially significant for global businesses, not merely those headquartered in Europe.
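The headline exposure is easy to compute. A minimal sketch, assuming the two ceilings the DSA sets: fines of up to 6% of worldwide annual turnover, and periodic penalty payments of up to 5% of average daily worldwide turnover (the 365-day year below is a simplification).

```python
MAX_FINE_RATE = 0.06       # ceiling for fines: 6% of worldwide annual turnover
MAX_PERIODIC_RATE = 0.05   # ceiling for periodic penalties: 5% of avg daily turnover

def max_fine(worldwide_annual_turnover: float) -> float:
    """Upper bound on a DSA fine for a given annual turnover."""
    return MAX_FINE_RATE * worldwide_annual_turnover

def max_daily_penalty(worldwide_annual_turnover: float) -> float:
    """Upper bound on a per-day periodic penalty payment.
    Average daily turnover is approximated as annual / 365."""
    avg_daily_turnover = worldwide_annual_turnover / 365
    return MAX_PERIODIC_RATE * avg_daily_turnover
```

For a business with EUR 1 billion in worldwide turnover, the fine ceiling alone is EUR 60 million, which is why the percentages, not the absolute numbers, are what general counsel watch.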
A guided walk through the DSA's logic. Up to five questions to identify which tier applies and surface the obligations that follow. A heuristic, not legal advice. Articles 3, 19 and 33 govern the actual classification.
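The walk's branching logic can be mirrored in a few lines. A heuristic sketch only, same caveat as above: the question wording and return labels are this example's own, and Articles 3, 19 and 33 govern the real classification.

```python
def classify(stores_content: bool,
             at_user_request: bool,
             disseminates_to_public: bool,
             avg_monthly_recipients_eu: int) -> str:
    """Heuristic DSA tier classifier mirroring the guided questions.
    Not legal advice; simplifies the statutory definitions."""
    if not (stores_content and at_user_request):
        # Mere conduit and caching sit below hosting in the stack.
        return "mere conduit / caching (baseline intermediary duties)"
    if not disseminates_to_public:
        return "hosting service"
    if avg_monthly_recipients_eu >= 45_000_000:
        return "very large online platform (VLOP)"
    return "online platform"
```

Each question eliminates a tier, so the function reaches a leaf in at most four checks, the same shape as the guided walk.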