TrustAndSafety

DigitalAdvertising

AdTech

BrandSafety

EU’s Digital Services Act: Heavy Fines for Failing to Moderate Harmful Content

November 1, 2023

The EU’s Digital Services Act is reshaping online accountability — with massive fines for non-compliance.
As platforms scramble to moderate harmful content, brands must rethink where their ads truly belong.


What Is the Digital Services Act (DSA)?

The Digital Services Act (DSA) is a comprehensive EU regulation designed to hold online platforms accountable for the spread of illegal and harmful content — such as disinformation, hate speech, and child exploitation. Platforms must remove such content swiftly and transparently, label AI-generated content, and stop targeted advertising that relies on sensitive data (such as religion or sexual orientation) or that targets children and minors.

Failure to comply can result in fines of up to 6% of global annual revenue.

Source: Naver Encyclopedia
Companies affected: Google, Bing, YouTube, Facebook, X (formerly Twitter), Instagram, TikTok, Wikipedia, Apple, AliExpress, LinkedIn, and more


Regulating Big Tech Responsibility

The DSA applies its strictest obligations to Very Large Online Platforms (VLOPs) — those with over 45 million monthly active users in the EU. So far, 17 platforms and 2 search engines have been officially designated, including Google, Meta, X, and TikTok.

According to EU Internal Market Commissioner Thierry Breton,

“Compliance with the DSA will not only prevent penalties but also strengthen brand value and trust for these companies.”

European Commission President Ursula von der Leyen echoed this, saying:

“The DSA aims to protect children, society, and democracy through strict transparency and accountability rules.”


Enforcement Begins: DSA in Action

When misinformation and violent content spread rapidly across platforms following the Israel-Hamas conflict, the EU launched an official DSA investigation into X, questioning its ability to manage illegal content.

X responded that it had removed or labeled tens of thousands of posts and taken down hundreds of Hamas-linked accounts. Meta also reported deleting 800,000+ pieces of war-related content and establishing a special operations center for rapid content review.

Major platforms are now:

  • Disabling recommendation systems that profile users based on sensitive data

  • Adding public reporting channels for flagging illegal content

  • Filtering extremist or graphic content more aggressively

These actions are motivated by more than goodwill — DSA violations can trigger massive fines or even temporary bans from operating in the EU.


A Broader Vision: EU’s Digital Rulebook

The DSA is part of the EU’s digital governance trifecta, which also includes:

  • DMA (Digital Markets Act): Prevents anti-competitive practices by “gatekeeper” firms like Alphabet, Amazon, Apple, Meta, ByteDance, and Microsoft

  • DNA (Digital Networks Act): Aims to foster a unified digital market and promote investment and innovation in infrastructure and emerging players

Together, these laws enforce transparency, user protection, and fair competition in the EU digital ecosystem.


What About Korea?

While the EU pushes ahead with strong tech regulation, South Korea has yet to enact a comparable law to hold Big Tech accountable for algorithm transparency or content responsibility.

Civil society groups argue that Korea should move toward a comprehensive legislative framework, especially as:

  • Big Tech dominance threatens media diversity

  • Small businesses and content creators are increasingly dependent on platform decisions

  • Algorithmic news feeds raise concerns about information control

According to Oh Byung-il, head of Korea’s Progressive Network Center:
“Korea has long prioritized nurturing its domestic tech industry while overlooking critical issues like privacy and fair trade. The EU’s example shows it’s time for Korea to start serious discussions.”


Final Thoughts

From fake news to hate speech, the DSA reflects a growing global demand for platform responsibility. With major players like X, Meta, and TikTok scrambling to comply, it’s clear that user safety and algorithmic transparency are no longer optional.

In Korea and beyond, it’s time for governments and platforms alike to acknowledge their role in protecting the digital public — and for brands to ask hard questions about where their ads appear and what values they may be unintentionally endorsing.

© 2025 PYLER. All rights reserved.

pylerbiz@pyler.tech | 19th floor, 396, Seocho-daero, Seocho-gu, Seoul, Republic of Korea (06619)
