
Brasília — In a landmark decision that reshapes digital governance in Brazil, the country’s Supreme Federal Court (STF) ruled on Thursday that social media platforms can be held civilly liable for harmful content posted by users if they fail to remove it following an extrajudicial notification.
The ruling, passed by an 8–3 vote, declares Article 19 of Brazil’s Marco Civil da Internet — the nation’s main internet framework law — partially unconstitutional. Under the previous interpretation, platforms were required to take down content only after a formal court order.
The court determined that current protections under Article 19 are insufficient to safeguard fundamental rights such as honor, dignity, and personal image. The justices established a binding precedent that reinterprets the article to allow for liability after an out-of-court notice by the victim or their legal representative.
The decision is expected to lead tech companies to overhaul their moderation protocols, particularly in Brazil, one of the largest social media markets globally. Despite the immediate and potentially far-reaching implications, major platforms have not yet issued public statements.
Proactive Moderation for High-Risk Content
The STF ruling further asserts that platforms must act proactively to remove content related to hate speech, racism, pedophilia, incitement to violence, or advocacy of a coup — even without any notification.
Legal experts suggest the ruling significantly broadens the scope of platform liability and challenges global companies to comply with jurisdiction-specific standards that go beyond the prevalent court-order model.
The Court emphasized that the ruling does not affect Brazil’s electoral legislation, which remains governed by the Superior Electoral Court (TSE). It also reaffirmed the applicability of Article 21 of the Marco Civil, which allows civil liability for content posted by third parties — including inauthentic accounts.
As Brazil awaits legislative updates that may further clarify content liability standards, this judicial move is likely to become a touchstone in international debates over platform responsibility and online free speech.
What Changes with the Supreme Court Ruling
The Brazilian Supreme Federal Court (STF) has established three levels of liability for internet providers and social media platforms:
1. Proactive Removal for Serious Cases
The Court ruled that platforms must act immediately — even without a court order or prior notification — to remove content involving hate speech, racism, child exploitation, incitement to violence, or advocacy of a coup. In such cases, failure to take action may lead to direct civil liability for the platform.
2. Extrajudicial Notification as a Trigger
For other types of illegal content, platforms may be held liable if:
- they receive an extrajudicial notification,
- fail to remove the content, and
- a court later confirms that the content was indeed unlawful.
This approach relaxes the requirement for a court order, allowing for quicker removal of harmful content such as personal attacks and serious misinformation.
3. Crimes Against Honor
In cases involving offenses against honor — such as defamation — the existing rule remains in force: platforms are required to take down the content only if ordered by a court, and they will not be penalized for ignoring an extrajudicial notice alone.
This was the STF’s way of safeguarding freedom of expression.