How Gaming Platforms Will Be Affected by Online Regulations


In a previous life, I worked at Roblox, one of the leading gaming platforms for users under the age of 18 across the globe.

As we expanded our feature sets, I had a unique challenge on my plate: figuring out how to moderate complex, real-time interactions, specifically voice chat.

It’s no small feat, given the multitude of global laws and regulations that come into play, including issues like wiretapping laws and GDPR considerations, which can affect even seemingly minor actions during a live voice interaction between two users.

Adding to this mix of worldwide regulations are the Digital Services Act (DSA), the recently passed UK Online Safety Act (OSA), and the impending adjustments to Australia’s Online Safety Act.

These new laws add even more complexity to the equation, necessitating a delicate balance between protecting minors, preserving freedom of expression, and upholding existing legal frameworks.

Under these new laws, challenges like proactively moderating voice interactions on gaming platforms are no longer just the right thing to do; they are the required thing to do.

Disregarding these new laws comes at a hefty price, with significant implications for businesses.

Gaming platforms might not be the first thing that springs to mind when we think about tech regulation, but they increasingly fall within the scope of these broader compliance regimes. So, what do new online regulations mean for gaming platforms?

Let’s dive into our analysis of how the DSA and other pivotal online regulations will impact gaming platforms globally.

1. Codes of Conduct

Both the UK OSA and DSA mandate platforms to follow established or forthcoming codes of conduct, aiming to create consistent user experiences across various platforms, including those in the gaming industry.

This means:

🥊 Combating Illegal Content

The DSA mandates that all online platforms, gaming platforms included, promptly remove any content currently considered illegal, whether offline or online, across all EU member states.

This removal must be accompanied by notification and necessary actions involving all concerned parties, including the EU itself. While the DSA adopts a reactive approach, the UK OSA elevates the game by demanding proactive content removal.

It recommends a blend of automation and human intervention to strike a balance between safeguarding the UK population and preserving their freedom.

👉 To be compliant, we advise a combination of automation and human-facilitated processes.

This poses a distinct challenge for gaming platforms, where live streams, voice chat, text chat, and real-time interactions are the norm, emphasizing the crucial role of non-static proactive signals and user reports.

These regulations underscore the significance of community engagement for maintaining legal compliance, where your community becomes your eyes and ears on the ground.
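As a sketch of what blending automation with human review might look like in practice, consider a simple triage step for real-time chat. Everything here is hypothetical and illustrative (the function name, thresholds, and labels are not drawn from any specific platform); real systems tune thresholds per locale and content type, and log every decision for transparency reporting.

```python
# Hypothetical triage for real-time chat moderation: automation handles
# clear-cut cases, while borderline ones are escalated to human moderators.

def triage(message: str, toxicity_score: float) -> str:
    """Route a message based on an automated classifier score (0.0-1.0).

    Thresholds are illustrative only; production systems calibrate them
    and record each outcome for appeals and transparency reports.
    """
    if toxicity_score >= 0.9:
        return "auto_remove"   # high-confidence violation: act immediately
    if toxicity_score >= 0.5:
        return "human_review"  # borderline: queue for a moderator
    return "allow"             # low risk: no action

print(triage("example message", 0.95))  # auto_remove
print(triage("example message", 0.60))  # human_review
print(triage("example message", 0.10))  # allow
```

The design choice worth noting is the middle band: rather than forcing automation to make every call, ambiguous content is routed to humans, which is exactly the balance both regulations point toward.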

 

🗣 Diversity of Opinion

Gaming communities can be diverse and inclusive, but not all are.

The DSA calls for removing content that crosses legal boundaries in terms of harassment and hate speech while protecting freedom of expression, which may vary by country.

On the other hand, the UK OSA prioritizes proactive measures. Companies can use automation to prompt users to reconsider messages containing hate speech, fostering compliance with both the DSA and UK OSA while safeguarding free speech and creating a more inclusive gaming environment.
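A "reconsider before sending" nudge could be sketched roughly as follows. The blocklist terms, function name, and response shape are all hypothetical placeholders (real platforms use ML classifiers rather than word lists); the point is the pattern of prompting the sender instead of silently blocking.

```python
# Hypothetical pre-send nudge: instead of removing a flagged message
# outright, the platform asks the sender to confirm or edit it first.

BLOCKLIST = {"slur1", "slur2"}  # placeholder terms; real systems use classifiers

def pre_send_check(message: str) -> dict:
    """Return a nudge prompt if the message trips a hate-speech signal."""
    flagged = [w for w in message.lower().split() if w in BLOCKLIST]
    if flagged:
        return {
            "action": "nudge",
            "prompt": "This message may violate our community guidelines. "
                      "Send anyway or edit?",
        }
    return {"action": "send"}
```

Because the user keeps the final choice, this approach moderates proactively without acting as a hard censor, which is why it sits comfortably under both regimes.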

 

2. Accountability and Transparency

Both the EU’s Digital Services Act and the UK Online Safety Act introduce new transparency requirements designed to reinforce the accountability measures companies are expected to put in place.

 

📜 Transparent User Policies

Both the DSA and UK OSA place a premium on transparency.

Under these regulations, gaming platforms must provide clear explanations and avenues for appealing account-related actions.

From my experience, many players are eager to comply but often find themselves navigating uncertain expectations. Establishing a straightforward process for issue resolution is likely to enhance community trust and provide gamers with the clarity they seek.

📊 Transparency Reporting Requirements

Starting in February 2024, the DSA extends its reach beyond VLOPs to all online platforms, gaming platforms included.

Both laws require public disclosure of key information, such as monthly active users and risk assessments, with each offering its own perspective on these criteria.

Moreover, both laws outline various transparency reports that platforms are obligated to make public.

💡 For example, DSA transparency reports should encompass metrics like the number of takedown requests, actions taken, relevant orders from governmental bodies or government-sponsored authorities, and error rates of content moderation automation.
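Aggregating moderation logs into these metrics can be sketched as below. The event schema and field names are invented for illustration; the only DSA-grounded part is the set of output metrics (takedown requests, actions taken, government orders, and automation error rates), with appeal reversals used here as one possible proxy for automation errors.

```python
# Hypothetical aggregation of moderation logs into DSA-style
# transparency metrics. Event schema is illustrative, not a standard.

from collections import Counter

def transparency_report(events: list[dict]) -> dict:
    counts = Counter(e["type"] for e in events)
    auto = [e for e in events if e["type"] == "auto_action"]
    overturned = sum(1 for e in auto if e.get("overturned_on_appeal"))
    return {
        "takedown_requests": counts["takedown_request"],
        "actions_taken": counts["auto_action"] + counts["manual_action"],
        "government_orders": counts["gov_order"],
        # Share of automated actions later reversed on appeal, used as a
        # rough proxy for the automation's error rate.
        "automation_error_rate": overturned / len(auto) if auto else 0.0,
    }
```

Building this kind of aggregation early means the reporting obligation becomes a query over logs you already keep, rather than a scramble at each reporting deadline.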

⚖️ Liability Changes

The DSA brings standardized online safety laws to all 27 EU member states, with a broad focus on preventing harm and safeguarding free expression.

This is evident in the reactive content moderation approach (requiring notice before action) and non-prescriptive content moderation guidelines.

This maintains a degree of freedom for gaming communities while establishing legal boundaries, distinguishing between hate speech and controversial speech for users at large.

Conversely, the UK OSA places a sharper focus on protecting children.

For gaming platforms, this means that behaviors that were previously borderline violations of community guidelines, like potential cyberbullying, are now entirely prohibited in the UK wherever children are likely to access them.

Ofcom, the UK regulator, is empowered to fine gaming companies that allow children (defined as individuals under 18) to access content related to cyberbullying, hate speech, or the promotion of suicide and self-harm; this is a non-exhaustive list.

3. Preparing for the Future

Understanding and adapting to these regulations is no easy task, particularly for gaming companies.

Community norms may clash with the demands of the DSA and UK OSA, especially concerning child protection. The central message here is to proactively address online risks, aligning with the UK OSA's requirements, to reduce the potential impact on your community and business.

🤝 Compliance Strategy

Develop a comprehensive compliance strategy in collaboration with legal counsel. Adding great tools and expertise to the mix will also make your life easier (and cost you less).

This can seem like a big mountain to climb, but if you take it step by step, it’s easier than it looks.

📲 User Experience

The aim is not just to tick regulatory boxes but to enhance safety and user experience on your platform.

Leverage this as a chance to inform your users about your community guidelines and their expected behavior.

💡 Explore rehabilitative approaches instead of resorting to punitive takedowns when dealing with non-egregious content.

💬 Community Feedback

The emphasis on transparency and user reporting presents a chance to enhance your internal policies and moderation procedures.

This not only ensures compliance but also embeds safety into your product from the outset, potentially reducing the technical debt caused by future legislation.

Regulations such as the Digital Services Act, UK Online Safety Act, and worldwide efforts in online safety signal a substantial change for platforms hosting user-generated content, including the gaming industry.

Staying ahead means understanding these changes and adapting your strategies accordingly. The sooner you start, the better prepared you'll be for what's coming next.

Here’s some reading to help you get more acquainted with new online regulations:

Meet the Author

Jessica Dees

Jessica is the Director of Trust & Safety Policy and Operations at TrustLab.
