Platform governance and regulation have been salient political issues in Brazil for years, particularly as part of Congress’ response to democratic threats posed by former President Bolsonaro. The question became even more important after the January 8, 2023, attempted insurrection in Brasília,Footnote 1 which many blame on social media.Footnote 2 Among them is the newly installed Lula administration. In a letter read at the February 2023 UNESCO “Internet for Trust” global conference, the president, now in his third (nonconsecutive) term in office, wrote that the attack on the nation’s seats of power was “the culmination of a campaign, initiated much before, and that used, as ammunition, lies and disinformation,” which “was nurtured, organized, and disseminated through several digital platforms and messaging apps.”Footnote 3 The new administration has made platform regulation a policy priority, with regulatory and administrative pushes across the board.
Concerns have also been raised about platforms’ decisions, and their investment in policy, in the period preceding the 2022 elections and the January 8 attempted insurrection,Footnote 4 particularly given the parallels between the January 8 events in Brazil and the January 6, 2021, attempted insurrection in the United States, and the apparent disparity in treatment that each case received from providers such as Meta.Footnote 5
It is clear Brazil has been a battleground where proposals for platform responsibility have been advanced – and disputed. This chapter seeks to provide an overview of existing and proposed frameworks and of recent developments that have been testing the established consensus about related law and policy in Brazil.
6.1 The Current Overarching Framework: Marco Civil da Internet
Marco Civil da Internet (Marco Civil), often referred to as the Brazilian internet bill of rights,Footnote 6 is the main legislation defining platform responsibility. While the nickname might suggest otherwise, Marco Civil is an ordinary statute: law no. 12.965/2014.Footnote 7 Its drafting process, however, was anything but ordinary, following a years-long public consultation held online by the Ministry of Justice, which itself made use of the internet to gather comments and input from numerous civil society organizations and academics.Footnote 8 Marco Civil was the result of a complex political process, which included strong opposition from telecommunication companies, broadcasters and copyright holders, and law enforcement officials.Footnote 9 It managed to overcome that formidable resistance and was picked up with renewed energy after the Snowden revelations showed President Dilma Rousseff was a target of surveillance by US agencies.Footnote 10 Partly because of this unusual constellation of engagement and participation from a wide range of sectors in Brazilian society, Marco Civil has been regarded as possessing a distinct political status.Footnote 11 Proposals for amending the statute are often met with concern that they would perturb the viable consensus codified into law by Marco Civil.Footnote 12
Under Marco Civil, platforms, like other internet application providers,Footnote 13 are liable for user-generated content only if they fail to comply with a court order directing them to make the content unavailable. This was a response to prior case law, which reformers saw as incentivizing removal of content whenever a notice was served. Introduced in the second round of public consultations on the draft bill (i.e., before the bill was introduced in Congress), this safe harbor went beyond the notice-and-takedown model that had been featured in the draft,Footnote 14 itself a departure from a more stringent liability regime favored in some casesFootnote 15 and often regarded as flowing from tort law in Brazil.Footnote 16 Under art. 19 of Marco Civil, upon service of “a specific court order,” an internet application provider is required to “take the steps” to make the content unavailable, “within the scope and technical limits of its service.” Such court orders must include “clear and specific identification” (art. 19, § 2, Marco Civil), which the Superior Court of Justice has interpreted as requiring that plaintiffs provide URLs (uniform resource locators).Footnote 17
There are two carve-outs to this judicial safe harbor in Marco Civil itself: copyright claims and nonconsensual intimate content. Claims over nonconsensual intimate content follow a notice-and-takedown model, though one provided explicitly by Marco Civil. Internet application providers are liable for harms resulting from such content if they fail to diligently make it unavailable “within the scope and technical limits of its service,” as provided by art. 21, Marco Civil, which also requires the notice to include information enabling the “specific identification” of the infringing content, in addition to verification that the request is legitimate (i.e., made, or authorized, by the victim).
Copyright claims were excluded from the general framework established in art. 19 to overcome the gridlock blocking Marco Civil from advancing in Congress because of intense lobbying from copyright holders.Footnote 18 Because Marco Civil was debated around the same time that copyright reform was a hot topic, the applicable regime was deferred to future legislation. That reform never came, however, so the interim rule included in Marco Civil applies to this day. Under art. 31, liability for copyright infringement resulting from user-generated content follows the Copyright Act (law no. 9.610/1998). Courts have still not settled on what the 1998 statute entails for online intermediaries but have leaned toward “the adoption of a notice-and-takedown regime for copyright infringement, following the model established in the United States by the Digital Millennium Copyright Act and in Europe by the e-Commerce Directive.”Footnote 19
Marco Civil seemed stable for many years, despite a challenge that had sat dormant on the Supreme Court’s docket since 2017.Footnote 20 After the events of January 8, 2023, however, the Supreme Court twice announced a date for ruling on the challenge. As of December 2023, the case is still pending, but many expect that the justices will invalidate or at least significantly reduce protections for providers. This creates pressure for Congress to pass legislation before the Court takes the question of platform governance for itself.
6.2 Liability beyond Marco Civil
Marco Civil governs civil liability for user-generated content and on its own terms does not cover other kinds of liability, such as criminal liability. Generally, failure to comply with a court order can lead to charges for noncompliance with a legal order issued by an official (including a court), which, under art. 330 of the Criminal Code, is punishable with imprisonment of up to six months. Prosecution for noncompliance with removal orders is rare. In 2012, a Google executive was arrested in connection with an order to take down a YouTube video,Footnote 21 but this was before Marco Civil went into effect. Noncompliance charges when platforms fail to provide data requested by courts are not unheard of, but also not common. A Facebook vice president was held in custody for about twenty-four hours in 2016 when WhatsApp did not provide data a court was seeking.Footnote 22
In general, criminal legislation does not specify liability for online intermediaries or platforms, but scienter requirements applicable to most criminal offenses in Brazil prevent providers from being held liable for criminal content shared by third parties. Child safety legislation does establish a regime for provider liability for child sexual abuse material (CSAM), adopting a notice-and-takedown model.Footnote 23 “In practice, this means that service providers must respond to takedown notices to avoid criminal liability under [child safety legislation].”Footnote 24
6.2.1 Electoral Law
Electoral law was also interpreted as not directly affected by Marco Civil, and indeed even after its entry into force electoral courts in Brazil sometimes held providers liable for infringements upon notice served by the affected person, party, or coalition.Footnote 25 This was the result both of courts’ interpretation of electoral legislation and of regulations issued by the electoral courts themselves.Footnote 26 More recently, however, both electoral legislation and its regulation have been adjusted to follow the judicial safe harbor found in Marco Civil.Footnote 27
6.2.1.1 The 2022 Elections and Eleventh-hour Changes
In the most recent presidential elections, while that model was kept in place, the Superior Electoral Court enacted a resolution after the first round and just ten days before the runoff, describing it as an effort to make the fight against electoral dis- and misinformation more effective. When proposing the resolution, Justice Alexandre de Moraes, who had taken office as the court’s president in late August 2022, cited a 47 percent increase in “disinformation reports sent to digital platforms” in the twelve days following the first round.Footnote 28
Under resolution no. 23714/2022 (art. 2), “platforms”Footnote 29 are required to take down content within two hours when so ordered by the Superior Electoral Court; noncompliance triggers substantial fines (up to 150,000 Brazilian Reais – over 100 times the minimum wage). The same rules apply to takedown orders against accounts and channels found to be engaged in “systematic generation of disinformation,” defined as “persistent publication of false or decontextualized information” (art. 4). Previously, orders had to be complied with within twenty-four hours, unless exceptional circumstances warranted a tighter deadline (art. 38, § 4, resolution no. 23610/2019). The president of the Superior Electoral Court was also given authority to act sua sponte. Under art. 3 of resolution no. 23714/2022, the president can issue removal orders for any content found to be identical to content previously ruled illegal by the Court. Further, the president is empowered to order the temporary ban (for up to twenty-four hours) of a platform after “repeated noncompliance” (art. 5) with orders on the basis of the resolution.
Records of proceedings based on resolution no. 23714/2022 have been sealed, so a full accounting of its impact cannot be established as of now. A few non-sealed records show that a department within the electoral court, AEED (Assessoria Especial de Enfrentamento à Desinformação, roughly the Special Unit for Combatting Disinformation), was charged with social media monitoring and with presenting the president with proposals for action under the resolution. In late October 2022, before the runoff, the department compiled a list of over 200 links to social media posts (on Facebook, Instagram, Kwai, Telegram, TikTok, Twitter, and YouTube), grouped under “narratives” (e.g., “unspecified or generalized fraud,” or “video points to 55% of the vote for one of the candidates and claims TSE mistake”), alongside engagement figures for each post and follower counts for each account. Justice de Moraes issued an order for the removal of only part of the list prepared by the department, 135 URLs in total, which either garnered more than 500 engagements or were posted by users with followings larger than 5,000. The order gives an indication of what the president interpreted as “identical content” for the purposes of the art. 3 authority to issue sua sponte removal orders: it suggests the content need not be the same material (i.e., the same video, image, or text), and perhaps need not even repeat the same allegations previously adjudicated by the Court.Footnote 30 A different report submitted by the AEED instead emphasized that the content listed for removal reproduced, in whole or in part, content adjudicated as illegal in two full-court cases it specified. Justice Moraes agreed and issued the order.Footnote 31
It is unclear how long the Court considers the resolution to remain in effect. Initially, it looked like the regulation would be in force only for the 2022 elections. After the election results were announced, and as Bolsonaro supporters blocked roads in an effort to prevent Lula from being certified as president-elect, the AEED submitted a report showing a member of Congress’ lower chamber, Carla Zambelli, using social media to commend truck drivers who blocked major highways and to reshare videos of roadblocks. This led to an order blocking her social media accounts, which was followed by a further order directed at dozens of other accounts that had, according to the Court’s finding, shared similar content supporting Zambelli. This last order included at least two accounts (with substantial followings) that had shared Zambelli’s content to express their disapproval of her conduct. One such user had been tweeting his support for the Court’s takedown decisions and his criticism of Zambelli and Bolsonaro. Another used his display name on Twitter to announce his vote for Lula. The Court eventually rescinded the order against these accounts in February 2023; the restriction was in effect for three months.Footnote 32
It is still too early to assess the impact of these eleventh-hour changes to electoral regulation, and a full accounting might not be possible. The Superior Electoral Court has rejected access to information requests filed by the media, even for general enforcement figures.Footnote 33 We know of at least four orders issued by the president of the Superior Electoral Court citing the new resolution as a basis; these orders included content and account takedowns with the two-hour deadline. There have been no reports of temporary platform bans grounded on resolution no. 23714/2022. A constitutional challenge filed by the prosecutor-general had a request for a stay denied by the Supreme Court, in a 9-2 vote; majority opinions emphasized the need to show deference to the Superior Electoral Court and the importance of curbing electoral disinformation.Footnote 34
6.3 Uncharted Waters: The Law of Content Moderation
Marco Civil did not address one of the most important questions for platform responsibility today: content moderation. Although it does provide safe harbor from civil liability arising from user-generated content, Marco Civil was not designed to regulate decisions platforms make when removing, restricting, labeling, or deciding not to act on content. Content moderation was not a primary concern in the drafting of Marco Civil; what drove the adoption of the intermediary immunity regime established in art. 19 of the statute was the risk that platforms would be forced to remove content to avoid costly damages awards being entered against them by courts not too protective of freedom of expression.Footnote 35
This makes sense given that the 2014 Marco Civil was enacted prior to the turn toward platform responsibility and away from intermediary liability as the theoretical framework and regulatory toolkit favored by policymakers since 2015.Footnote 36 As attention to the impact of content moderation has grown, however, Marco Civil has had little to say. The focus on liability has meant it “provides no scrutiny over the vast majority of online content related disputes,”Footnote 37 which are not brought before courts. Clara Keller, a Brazilian scholar, believes this shows how the celebrated internet bill of rights is insufficient, as it lacks mechanisms “to assure a responsive, transparent and human rights aware governance framework.”Footnote 38
In fact, what Marco Civil’s silence means for content moderation has been contested. Most have pointed to debates in the public consultation and to legislative history as clearly establishing that its intermediary liability regime should be read only as a rejection of a notice-and-takedown model and therefore not as preventing providers from engaging in content moderation.Footnote 39
Prior to 2021, this seemed like the settled interpretation of the law. Yet, in 2021, then-President Bolsonaro adopted provisional legislationFootnote 40 that endorsed the opposite interpretation. Enacted on the eve of Brazil’s Independence Day, when protests endorsed by the president included calls for ousting and arresting members of the Supreme Court, the legislation dragged content moderation regulation to the center of the political crisis in the country.Footnote 41 The explanatory memorandum to the provisional legislation, MP 1068/2021 (which would have amended Marco Civil), advanced the notion that under Marco Civil much of content moderation was illegal because it violated users’ free speech rights. The provisional legislation required social media platforms to limit content moderation to a specified list of illegal content, which notably did not include misinformation.Footnote 42 The provisional legislation was quickly rejected by Congress, without ever being advanced to a vote, on the same night that the Supreme Court enjoined its enforcement.Footnote 43 Yet other courts have held that Marco Civil precludes content moderation of lawful speech.
Even if the interpretation that outright bars content moderation is rejected, what rights users have against platforms when their content is removed or otherwise restricted is far from a settled question. In contrast to US First Amendment doctrine, which (at least prior to the NetChoice cases) is generally regarded as granting editorial rights to platforms,Footnote 44 and to an expansive view of the Communications Decency Act that immunizes platforms from liability for content moderation decisions generally,Footnote 45 under Brazilian law it would be unusual to claim that platforms have free speech rights trumping those of users. This is particularly true given that the Brazilian legal system is generally understood to recognize that constitutional rights apply horizontally, that is, between individuals and other private persons, not just against the state.Footnote 46 Marco Civil itself is evidence that users hold rights, potentially constitutional rights, against providers abridging their freedom of speech or violating their privacy; contractual provisions infringing on such rights are “null and void” under art. 8.Footnote 47
The scope and extent of users’ rights against platforms are not defined; one possibility would be to view those rights as equivalent to those individuals hold against the state.Footnote 48 This was arguably the view behind the provisional legislation issued by President Bolsonaro – although the provisional legislation did not conform to constitutional limits on speech regulation,Footnote 49 employed vague and imprecise concepts,Footnote 50 and would have had the effect of co-opting platforms “to stay on the government’s good side.”Footnote 51 A second possibility would be to reconcile users’ rights with platforms’ by accepting that platforms enact (and enforce) their own content policies while granting users due process rights, with judicial review of the justification offered by platforms for content moderation decisions.Footnote 52 A third possibility would look beyond fairness in content moderation and constrain the latitude platforms have to establish content policies that restrict protected speech. Court cases where users seek to have their content or accounts reinstated – and prevail on their claims – are common,Footnote 53 yet case law has so far not articulated a theory of users’ free speech rights against platforms, a question that is ordinarily left unaddressed.
6.3.1 Regulating Content Moderation with Consumer Law
Courts often rely on consumer protection law when deciding content moderation cases. Consumer protection has a wide scope in Brazilian law. Especially after the enactment of the 1990 Consumer Protection Code (CDC, after “Código de Defesa do Consumidor,” law no. 8078/1990), consumer protection had a transformative impact on the legal system, affecting much of private law, as well as other areas of the law.Footnote 54 Consumer protection modifies the application of ordinary legal concepts and provisions, working from the premise that contract law, for example, should apply differently given the highly asymmetrical character of the relationship between a consumer and a provider of (consumer) goods and services. Given that this asymmetry can also be said to be present in the relationship between users and platforms, and considering that Brazilian courts have long held that consumer protection law applies even to services nominally provided free of charge, it is no surprise that courts have turned to the CDC to adjudicate claims against platforms.
Consumer protection has recently also opened a new avenue for probing content regulation in court. Prosecutors are allowed to bring consumer protection claims, including class actions. A civil investigation often precedes the filing of such claims. In November 2021, a federal prosecutor in Brazil started an investigation into the content moderation practices of Facebook, Instagram, Telegram, TikTok, Twitter, and YouTube, citing misinformation and online violence. In the course of the investigation, the companies have been served subpoenas for information on their content policies and enforcement mechanisms, including the number of employees working with Brazilian users.Footnote 55
The theory behind the investigation seems to be that platforms are legally liable for harms resulting from insufficient measures to prevent misinformation, particularly COVID-related, and violence online, as that violates users’ consumer-protection rights to accurate information and to safety. The prosecutor states that this liability would not be covered by the Marco Civil safe harbor, given that such measures are incumbent on platforms themselves and do not implicate liability for losses resulting from user-generated content. Although, as noted, Marco Civil has been interpreted as not regulating content moderation, the prosecutor has suggested that the “lack of a minimal content moderation policy” likely “violates constitutional rights.” This reading of the law has yet to be tested in court. Invoking consumer protection legislation would mean that consumer protection agencies have authority over platform content governance. Such agencies often have no insulation from the executive; the head of the federal agency serves at the pleasure of the president.
6.3.2 School Shootings and Executive Branch Regulation
Indeed, this was the state of affairs in early 2023, when the then-justice minister announced regulation to respond to an outbreak of school shootings,Footnote 56 relying on the Consumer Protection Code. Under portaria 351/2023, the consumer protection department of the justice ministry is empowered to start administrative proceedings against “social media platforms” (a term not defined in the regulation) for failing to uphold a “general duty of safety and care with regard to the dissemination of illicit, harmful and deleterious content, regarding content that encourages attacks against the school environment or conveys apology or incitement to such crimes or their perpetrators” (art. 2). It can request that platforms provide information on their efforts to tackle such content (art. 3), as well as reports on the assessment and mitigation of systemic risks (art. 4). In “extraordinary circumstances,” the ministry can itself order platforms to adopt “crisis protocols” (art. 7). The regulation also creates a database of such illegal content and notes that hashes may be generated to help in the identification of infringing content (art. 6). The database was to be managed by the department of public safety, which was also given authority to “advise” platforms to block IP addresses to prevent the creation of new accounts from the same IP address “where previous illegal activity” was identified (art. 5, § 2).
The new regulation on school violence prompted criticism, including from the prosecutor overseeing the civil investigation into platforms.Footnote 57 In Congress, members of the opposition introduced a resolution (projeto de decreto legislativo 122/2023) that would rescind the order as ultra vires. The concern voiced by different actors is that the regulation would allow the government to order content takedowns and regulate platforms directly. That is generally disallowed under art. 19 of Marco Civil (which requires “a specific court order” for content restrictions). The ministry sidesteps that constraint by relying on consumer protection legislation as a source of administrative law authority; again, Marco Civil textually governs only civil liability for user-generated content.
The regulation raises a further concern with this mode of platform regulation: lack of transparency in platform–government engagement. Almost a year after the regulation was passed, the government still has not issued any reports or provided data on how it used its newly established regulatory authorities. Access to information requests seeking the systemic risk assessment reports prepared by the companies, and details of what information the government directed companies to provide under art. 3 of the regulation, were denied.Footnote 58 Access to case documents was also denied.
The limited information produced in response to the access to information request suggests that the government did not make full use of the authorities created by the regulation – even though that does not say much about its engagement with the platforms, which was kept confidential. The government responded that it had started proceedings against Meta, Google, Twitter, Telegram, Bytedance, and Kwai. It stated that it had requested the removal of “hundreds of publications, accounts, [and] channels” and shared with platforms “a list of hashtags and the most commonly used images by suspects so that [platforms] could attempt to automate filtering.” The government also said that it had not advised platforms to block user IP addresses under art. 5, § 2, nor had it created the hash database established by art. 6.
6.4 Scope and Jurisdictional Questions
Another important and unsettled point for platform responsibility concerns territorial and extraterritorial scope. The relevant provision of Marco Civil is the subject of an ongoing dispute. It states that any data processing operation shall comply “with Brazilian legislation and the rights to privacy, to personal data protection, and to the confidentiality of private communication and of records” (art. 11, caput, Marco Civil). A further provision specifies that this applies to providers not incorporated in Brazil if they “offer services to the Brazilian public or at least one entity in the same business conglomerate is established in Brazil” (art. 11, § 2, Marco Civil). There are two contrasting views on what this entails for overseas providers, one more expansionist, the other more restrictive.
The more expansionist interpretation of the scope of Marco Civil featured in a recent high-stakes case, which led to an order blocking access to Telegram nationwide; the order was rescinded before it was implemented. Supreme Court justice Alexandre de Moraes had been unsuccessful in serving the app with orders directing it to disable accounts, remove content from public channels, and provide data related to the accounts. The accounts belonged to supporters of President Bolsonaro who were the targets of an investigation, opened by the Supreme Court itself, looking into a possible conspiracy to defame members of the Court and organize attempts to have them forcibly removed. Authorities elsewhere had likewise failed to reach Telegram, informally or through legal service.Footnote 59 Telegram had been sent official notices from the Court through its publicly listed email accounts but never acknowledged receipt. The Court also served a Brazilian firm representing the app on intellectual property matters. When all that failed, on March 17, 2022, the Court entered an order banning the app until it complied with the outstanding orders.Footnote 60
The Court relied on a Marco Civil provision that subjects noncompliance with the personal data protection provisions of the statute to a range of sanctions, including suspension and ban of “activities implicating operations falling under art. 11.” It directed internet service providers to block connections to Telegram and directed Apple and Google to “prevent users from using the app” and remove it from their respective app stores; all those to whom the order was addressed were given five days to comply. Surprisingly, even though the order was initially sealed, it also warned that anyone attempting to “technologically circumvent” the ban in order to carry on “communicat[ing] with the Telegram [app]” would face “criminal and civil liability” and a daily fine of 100,000 reais. This seemed to refer to the use of proxy servers or virtual private networks (VPNs) to bypass blocking by internet service providers.
None of this was ever implemented because Telegram eventually responded. On his Telegram public channel, Telegram CEO Pavel Durov cited “an issue with emails going between our telegram.org corporate addresses and the Brazilian Supreme Court” and “apologize[d] to the Brazilian Supreme Court for our negligence.” Stating that they had “found and processed” the “takedown request,” Durov “ask[ed] the Court to consider delaying its ruling for a few days at its discretion to allow us to remedy the situation by appointing a representative in Brazil and setting up a framework to react to future pressing issues like this in an expedited manner.”Footnote 61 In response, the Court found that Telegram had not complied with all outstanding orders but gave the app additional time to do so, while it also conditioned lifting the nationwide ban on the appointment of a local representative. Telegram had also been ordered to report on “all the steps taken to fight disinformation and fake news.” After further communication from Telegram, including with the appointment of a representative, the Court was satisfied Telegram had complied and eventually rescinded the order.Footnote 62
The interpretation favored by Justice Moraes in the Telegram ban case contrasted with that endorsed by another member of the Supreme Court in an ongoing case connected to the 2016 ban of WhatsApp, which the Supreme Court itself had stayed.Footnote 63 In ADPF 403, which concerns the constitutionality of blocking messaging apps, Justice Rosa Weber concluded that sanctions such as those adopted against WhatsApp in 2016 (and later Telegram, in 2022) went beyond what Marco Civil provided, which in her view limited the suspension or ban of a provider’s operations to infringements of provisions related to data protection and the confidentiality of communications. That is, only for violations of users’ privacy rights would providers be subject to a suspension or ban. This represents a narrower view of the extraterritorial scope of Marco Civil.
The courts have not discussed what is meant by “offering services to the Brazilian public” under art. 11, § 2, Marco Civil. Although this could be read as covering any services accessible by users in Brazil, a similar provision in the European Union’s General Data Protection Regulation (art. 3(2)) has been interpreted by the European Data Protection Board as requiring targeting of residents of the EU, for example, accepting payment in the currency of an EU member-state, operating under the top-level domain of an EU member-state (e.g., “.de”), or mentioning dedicated addresses or phone numbers to be reached from an EU country, among other factors.Footnote 64
While the Supreme Court has not yet addressed the territorial scope of Brazilian law under art. 11 of Marco Civil, an answer was suggested in an April 2023 opinion on whether Brazilian officials can directly request data from foreign providers or must go through mutual legal assistance treaties or other judicial assistance procedures. The Court argued that when “the specific circumstances of art. 11 of Marco Civil da Internet and art. 18 of the Budapest Convention [on Cybercrime]” are present, the data can be requested directly. In obiter dicta, the rapporteur’s majority opinion takes the view that art. 11 of Marco Civil establishes that “providers … must submit to [Brazilian] national legislation,”Footnote 65 implying “all legislation.” The rapporteur did, however, add the caveat that art. 11 of Marco Civil would be addressed directly in ADPF 403 and its sister case, ADI 5527, both of which remain pending.Footnote 66
6.5 The “Fake News Bill”
In 2020, as the pandemic took hold in Brazil and many were concerned with COVID-related mis- and disinformation, notably including from the president’s social media,Footnote 67 Congress started considering the “fake news bill,” as Senate bill no. 2630/2020 came to be known. The bill has changed considerably since it was introduced and cleared the Senate in July 2020, undergoing two substantive revisions, in late 2021 and early 2023.
At one point, it would have made disinformation illegal and created a traceability mandate for messaging apps. In the April 2023 revision, the bill was modified to combat disinformation by curbing the undisclosed use of bots on social media, requiring disclosure of adverts and boosted content, and increasing transparency over content moderation. Instead of traceability, the bill would provide for a data preservation procedure applicable to messaging apps. It would create due process requirements for content moderation and demand information on content moderation processes and decisions.
While the 2021 draft was mostly focused on due process, transparency, and self-regulation, the revised version takes a different approach. Introduced just days before a vote was to be held on the bill, the April 2023 draft changes the intermediary liability regime established in Marco Civil, creating secondary liability for harms (a) resulting from ads or (b) where providers fail to abide by their duty of care obligations while crisis protocols are in effect.
The duty of care obligations relate to content “that may constitute” several categories of criminal offenses, including newly introduced crimes against the rule of law, terrorism, crimes against minors or their apology, and racial discrimination. The list also includes references to violence against women and violations of the public health law. It is often unclear whether the elements of the relevant criminal offenses are present when it comes to social media content.
Crisis protocols can be activated when systemic risks resulting from a provider’s services are “imminent,” the provider has been negligent toward such risks, or their response is deemed insufficient. The protocols can last for thirty days, with thirty-day extensions possible if there is a “persistence of imminent risks.” Providers will be given a list of “relevant aspects” that mitigation measures must address, which suggests they will have to decide for themselves which measures to adopt.
The bill does not specify an agency for enforcement, which allows the president to assign the powers and delegated regulatory authority established by the bill to any executive department. These powers include BRL50 million fines “per infringement” and a temporary suspension from operating in Brazil. The agency that would be responsible for overseeing the bill was a major point of dispute between the government, the opposition, and the platforms.
Platforms opposed the bill. Google published a blog post, which it featured just below the search box on the Google Search homepage in Brazil, arguing the bill would give “broad powers to a governmental body to rule what Brazilians can see on the internet.”Footnote 68 Telegram messaged Brazilian users a statement claiming the bill would give “censorship powers to the government.” Both provoked energetic responses from Supreme Court Justice Alexandre de Moraes, who issued sua sponte orders against them. Google was directed to take down the blog post; Meta and Spotify were also directed to take down all “ads, texts, and information” referring to the bill as “the censorship bill” (as its detractors call it) or pay hourly fines of BRL150,000.Footnote 69 Telegram was ordered to remove its statement and to distribute a text prepared by Justice Moraes that denounced “patent and illicit disinformation” perpetrated by Telegram; failure to comply would trigger a seventy-two-hour ban and an initial fine of BRL500,000.Footnote 70 The companies complied. At the request of the prosecutor-general, Justice Moraes also launched a criminal inquiry against Google and Telegram executives, citing crimes against the rule of law,Footnote 71 among others.Footnote 72
As of December 2023, the bill was still pending. The government’s congressional leadership attempted to overcome gridlock by working on another bill, Bill no. 2370/2019, a proposal for broad copyright reform, to which the “publishers’ rights” component of the fake news bill would be attached. Under that proposal, providers would be required to compensate news media for using “journalistic content.” If providers and media companies failed to agree on terms, the bill would create mandatory arbitration. That bill also did not advance, however, because of a dispute on an unrelated copyright question – the compensation streaming services would owe to artists.Footnote 73
6.6 Addendum
As this chapter went to press, X was unavailable in Brazil, as a result of an order issued by Justice Moraes.Footnote 74 To prevent circumvention, the order initially went as far as imposing a general ban on the use of VPNs in the country. Moraes subsequently revised this, but his order would still fine anyone who used the platform, in a departure from the principle that court decisions are binding only on the parties. The order against X was entered over its noncompliance with account takedown orders. The court escalated to a nationwide blocking of the service after X publicly announced that it would not comply with the takedown commands and closed its Brazilian offices. When the closure also impeded service of process on X, Justice Moraes additionally directed the company to appoint a representative in Brazil, which it did not do. The order blocking X therefore identifies as grounds both noncompliance with the account takedown orders and noncompliance with the order to appoint a representative. While I cannot at this stage offer a full-fledged discussion of the blocking of X, the latter point is important as it relates to the discussion of the territorial scope of Marco Civil (Section 6.4).
Justice Moraes grounded his order on an interpretation of Civil Code provisions on foreign companies “operating” in Brazil.Footnote 75 The order adopts categorical language when it speaks of the requirement to appoint a representative; it does not cabin its interpretation to the particular case. Given that, the implication would seem to be that any internet application provider whose services are available in Brazil must also appoint a representative. It is unclear whether Justice Moraes would be willing to go to such an extent. This is a novel interpretation of Brazilian legislation pertaining to internet application providers. Indeed, Justice Moraes himself did not rely on it even when he issued the ban on Telegram for essentially the same reasons.Footnote 76
From that perspective, the X blocking order reveals that Marco Civil might be eroded even if it is not repealed or held unconstitutional. The protracted questions about how to construe its territorial application clauses would be made irrelevant.
And it is not just the Civil Code provisions on foreign companies that might change the framework. Adding to the consumer protection inroads described above, in 2024 the Justice Ministry’s consumer protection department, the Secretaria Nacional do Consumidor, announced draft rules on data access for researchers and transparency reports by platforms.Footnote 77 Perhaps more importantly, the Superior Electoral Court issued wide-ranging regulations ahead of the 2024 municipal elections.Footnote 78 These regulations seem to recreate a notice-and-takedown regime for certain categories of content, although there is a push to interpret the language of the relevant provision (art. 9-E) as requiring a court order. More fundamentally, the rules introduce a duty of care regarding “notoriously false or decontextualizing facts” that might compromise “electoral integrity.” The duty of care provision (art. 9-D) requires application providers, at a minimum and among other measures, to adapt their content policies, publish transparency reports, and conduct risk assessment and mitigation.
6.7 Conclusion
Brazil’s Marco Civil was once seen as a model. It was something of a national icon in which Brazilians took pride. Its main provision on content created an immunity for providers, passed in response to findings of liability that were seen as encouraging overblocking. Like other jurisdictions, Brazil opted for a governance model that mostly neutralized courts. Now, like other jurisdictions, Brazil is considering a different approach. While questions about its central provisions remain unanswered ten years on, the whole edifice might soon be either supplanted by new legislation or significantly reconfigured by the Supreme Court. Even if it is never formally repealed, Marco Civil is already no longer the guiding star it once was for swathes of Brazilian law. Instead, areas of the law once thought subdued by Marco Civil seem on the verge of unseating it as a governance model.