Brazil’s Supreme Court Redefines Platform Liability: Toward a Global Model of Digital Diligence

In a landmark decision, Brazil’s Supreme Federal Court (STF) declared the partial unconstitutionality of Article 19 of the Marco Civil da Internet, reshaping the country’s approach to online platform liability. The ruling introduces a layered model of diligence and responsibility that aligns Brazil with evolving global standards such as the EU’s Digital Services Act and other emerging regimes of digital accountability.

A shift from reactive to preventive responsibility

For over a decade, Article 19 of Brazil’s Internet Bill of Rights (Marco Civil da Internet) shielded platforms from civil liability for third-party content unless they failed to remove it after a specific court order. This approach, modeled on liberal free-speech principles, sought to prevent private censorship while safeguarding open expression online.

However, the Supreme Court found that this purely reactive regime failed to provide adequate protection for fundamental rights such as dignity, privacy, and democratic integrity. By ruling the provision partially unconstitutional, the Court signaled a decisive move toward a more preventive and differentiated framework of responsibility.

A layered model of diligence

The STF’s decision establishes a stratified model of platform liability based on the type of content and the degree of control the platform exercises:

  • User-generated content: The STF clarified that the traditional judicial notice-and-takedown regime of Article 19 now applies only to crimes against personal honor, such as defamation, slander, and libel, for which platforms become liable only after failing to comply with a court order. For all other third-party content, platforms have a duty to act upon extrajudicial notification and may be held liable if they fail to respond; this includes content posted by inauthentic accounts and reposts of material already judicially recognized as an honor-related offense. In practice, Article 19 now governs only honor crimes, while the general rule follows the notice-and-takedown framework of Article 21, allowing preventive action without undermining freedom of expression;
  • Sponsored and algorithmically amplified content: Platforms are presumed to have knowledge and control over content they promote or boost. Liability may therefore arise without a court order, unless the provider can demonstrate effective diligence and timely removal of unlawful material;
  • Serious and high-risk crimes: For categories such as incitement to violence, terrorism, hate speech, or child exploitation, platforms must maintain active monitoring systems, transparency reports, and local contact channels in Brazil. The ruling imposes an affirmative duty of care aligned with international standards for systemic risk management.

Global convergence in platform accountability

Brazil’s new framework joins a global movement redefining intermediary responsibility:

  • United States: Section 230 of the Communications Decency Act continues to provide broad immunity, though its limits are under increasing political and judicial scrutiny;
  • European Union: The Digital Services Act (DSA) sets out proactive obligations for risk assessment, transparency, and rapid removal of illegal content — emphasizing accountability and corporate governance;
  • United Kingdom and others: The UK’s Online Safety Act and similar initiatives in Australia, South Korea, and India introduce risk-based models of oversight and content moderation.

In this context, Brazil’s approach stands out for its constitutional foundation and the central role of the judiciary. Instead of relying solely on statutory regulation, the STF interpreted existing law through constitutional principles such as human dignity, freedom of expression, and privacy, embedding platform governance within the broader framework of fundamental rights.

Practical implications for companies and counsel

Digital service providers operating in Brazil, whether local or foreign, now face new compliance expectations. They must strengthen internal procedures for:

  • Content moderation and user notification handling;
  • Documentation and traceability of moderation decisions;
  • Transparent reporting and communication channels with users and authorities;
  • Demonstrating “effective diligence” when unlawful content is identified.

For lawyers and compliance professionals, the ruling expands advisory roles beyond litigation. Counsel will increasingly need to help clients design preventive governance frameworks, balancing freedom of expression with the duty to mitigate harm and manage systemic risks.

A constitutional laboratory for digital regulation

The STF’s decision marks a turning point in Brazil’s digital regulation landscape, introducing a principle-based model of responsibility rooted in diligence, transparency, and accountability.

Above all, the ruling signals that platforms must act with greater care and promptness to protect users from manifestly illegal content.

In doing so, Brazil positions itself as a constitutional laboratory in the global debate on platform governance — bridging the liberal tradition of the Internet Bill of Rights with the risk-based regulatory models emerging in Europe and beyond.




RIO DE JANEIRO

Av. Almirante Barroso, 139 - 7º andar, Centro
Rio de Janeiro - RJ - Brasil, CEP 20.031-005
Tel: +55 21 2524-0510
E-mail: montaury@montaury.com.br


SÃO PAULO

Av. Macuco, 726, 2º andar, Moema
São Paulo - SP - Brasil, CEP 04.523-001
Tel: +55 11 3706-2020
E-mail: montaury@montaury.com.br
