In our digital era, governments and regulators are actively working to bring order to our online lives and transition the internet into a more regulated landscape.
Both the European Union’s Digital Services Act (DSA) and the United Kingdom’s Online Safety Act (OSA) aim to strike a balance between fostering innovation and safeguarding the internet for generations to come.
The UK's Online Safety Act is fresh off the presses: it has received Royal Assent, and platforms face a mid-2024 deadline for compliance.
While both the OSA and DSA aim to create safer digital spaces, the two laws are not carbon copies of each other. They differ in scope, specificity, and the obligations imposed on digital platforms.
Unlike its EU counterpart, the UK's Act zooms in on specific legal definitions and puts regulatory power in Ofcom's hands. While the DSA is broader in scope, covering additional topics such as intellectual property, dark patterns, and illegal goods, the OSA adopts a laser-focused, detailed approach, centering specifically on illegal content and the platform requirements for compliance.
Today, we will delve into the key differences between these two regulations, including definitions of illegal content, platform obligations, and protective measures for minors.
NOTE: On October 26th, 2023, the Online Safety Bill became the Online Safety Act - this article has been edited to reflect those changes.
Scope and Focus
DSA's Broad Coverage: The DSA covers a wide range of issues, including intellectual property (IP) infringement, illegal goods, dark patterns, and crisis response.
OSA's Narrow but Detailed Focus: In contrast, the OSA primarily deals with illegal content. However, within this focus, it is a significantly more detailed and granular piece of legislation.
Definitions of Illegal Content
DSA's Flexibility: The DSA defines illegal content broadly: it covers online actions that are already illegal under member state laws, EU treaties, and EU-wide legislation, as well as content or user behavior that is unlawful offline and has now moved into the online sphere.
OSA's Specificity: The OSA delineates two categories of illegal content: general offenses and "priority offenses," such as child sexual exploitation and abuse (CSEA) and terrorist content. It provides in-depth specifications for what qualifies as illegal content in each category.
Platform Obligations
DSA’s Notice and Takedown: The DSA functions under a "notice and takedown" complaint system, typically using form-based procedures that draw inspiration from both Germany's NetzDG laws and the eCommerce Directive.
OSA’s Proactive Measures: The OSA requires platforms to actively monitor content, departing from the traditional voluntary approach, with the goal of preventing users from encountering clearly illegal content. It explicitly states that, when technically feasible, companies must employ algorithms or other product features for this purpose.
Classification of Platforms
DSA’s VLOPs and VLOSEs: Under the DSA, services attain the titles of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines when they have or exceed 45 million monthly users.
OSA’s Category 1 Platforms: Conversely, the OSA has not provided explicit details on its approach to assessing company categories other than it will rely on risk evaluations and the size of the UK user base. Once the bill is completely formalized and put into action, Ofcom will undertake these assessments.
Obligations on Smaller Platforms
The OSA classifies any "user-to-user service" as a high-risk category where proactive due diligence is required.
This classification encompasses platforms where content sharing may not necessarily occur, but if Ofcom identifies the potential for such sharing, the platform will still fall into this high-risk category.
The OSA does not create exemptions in its compliance framework based on a company's size, in contrast to the DSA, which sets lower compliance thresholds for smaller companies.
Protection of Minors
Both the DSA and OSA emphasize the importance of protective measures for minors, albeit through different approaches.
The OSA, for instance, establishes rigorous requirements, allocating entire sections of the legislation to address and build safeguards for minors, along with identifying specific content that should remain inaccessible to them.
In contrast, the DSA includes some provisions but maintains a broader scope, drawing heavily from existing EU frameworks, laws, and treaties, while emphasizing compliance with member state laws.
Final Thoughts
The EU’s Digital Services Act and the UK’s Online Safety Act share the common goal of regulating the digital world, yet they each have distinct characteristics.
The DSA takes a comprehensive approach, addressing a wide range of online user concerns, whereas the OSA displays a more specialized focus on combating high-harm illegal content. Furthermore, the OSA emphasizes the importance of proactive monitoring as opposed to the DSA’s reactive notice and takedown procedures.
Regardless of how you slice it, both laws will require Trust and Safety teams to revamp their traditional approaches to content moderation, content monitoring, content removal, and user behavior management.
These laws reflect the evolving nature of our online environment and underscore the necessity of understanding their unique regulatory nuances.
Stay tuned for ongoing updates as we track how these new requirements are reshaping the internet and their impact on businesses worldwide.
Recommended reading:
The Moderation Paradox: Free Speech vs. Misinformation & Harmful Content
The UK Online Safety Bill is here: What it Means & How to Prepare