9.1 Introduction
The European Court of Human Rights (ECtHR) has long established that freedom of expression is crucial to the existence of a democratic society. A free society is impossible without a free exchange of information and opinions. Freedom of expression encompasses both the imparting and the receiving of information. The internet has given new platforms unprecedented power to realise this right.
Information society services and especially intermediary services, including social media platforms, have become an important part of our daily lives. At the same time, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.Footnote 1
The ECtHR and other bodies in numerous cases have pointed to a positive obligation for the state to protect the right to freedom of expression. At the same time, such an obligation has so far not been expressly attributed to the online environment by the ECtHR, while some other human rights protection bodies and institutions have pointed towards it only indirectly.
Therefore, the aim of this chapter is to analyse the positive obligations of states to protect freedom of expression in the online environment, especially on social media platforms. To achieve this aim, the first part of the chapter will set the background by analysing the general positive obligations of the state in the context of freedom of expression as found by various international bodies, most notably the ECtHR. The second part of the chapter will look at how the current regulation of social media and internet intermediaries impacts freedom of expression. Finally, the third part will evaluate what is required of the state, in terms of its positive obligations and current regulation, in the context of freedom of expression.
It must be noted that this chapter is not intended as a general overview of the definition and characteristics of positive obligations under human rights law or as exhaustive research on the rules governing the liability of social media platforms and internet intermediaries. It is rather a merging of both of these topics, drawing attention to problematic situations and indicating their possible solutions.
9.2 Freedom of Expression and Positive Obligations of the State
Almost all human rights protection documents recognise the positive obligation of the state to ensure the protection of the rights enshrined in each particular document. For example, Article 1 of the European Convention on Human Rights (ECHR) provides that the Contracting States have a duty to secure for everyone within their jurisdiction the rights and freedoms defined therein. In addition, Article 13 of the ECHR guarantees the availability, at the national level, of a remedy to enforce the substance of ECHR rights and freedoms in whatever form they might happen to be secured in the domestic legal order. This requires the provision of a domestic remedy to deal with the substance of a complaint under the ECHR and to grant appropriate relief.Footnote 2 Therefore, states have a positive obligation to investigate allegations of a human rights infringement. The procedures followed must enable the competent body to decide on the merits of the complaint of the violation of the Convention and to sanction any violation found, but also to guarantee the execution of decisions taken.Footnote 3
Similarly, under Article 2(3) of the International Covenant on Civil and Political Rights (ICCPR), state parties must ensure that persons whose rights under the Covenant have been violated have an effective remedy.Footnote 4 The Charter of Fundamental Rights of the European Union (CFREU) contains a rule that is comparable to the previously mentioned provisions of the ICCPR and the ECHR. Namely, Article 51 of the CFREU requires EU institutions and the Member States to ‘respect the rights, observe the principles and promote the application thereof’. At the same time, it must be noted that the obligations contained in the CFREU have some limitations. The same article prescribes that this obligation applies only in so far as EU law is being implemented and does not extend the field of application of EU law.
It is the ECtHR that can be thought of as the most prominent advocate of imposing positive obligations on the state to protect particular human rights. It holds that although the essential object of many provisions of the ECHR is to protect the individual against arbitrary interference by public authorities, there may in addition be positive obligations inherent in an effective respect for the rights concerned. The Court has emphasised that the effective exercise of certain freedoms does not depend merely on the state’s duty not to interfere, but may require positive measures of protection.Footnote 5 Although the ECtHR has not provided a general definition of the concept of positive obligation, it can be deduced from its case law that the prime characteristic of positive obligations is that they require national authorities, in practice, to take the necessary measures to safeguard a right.Footnote 6
The positive obligations of the state can be divided into several groups. First, there are substantive positive obligations and procedural positive obligations. Second, there are positive obligations of a vertical kind or those that protect the individual from the state and positive obligations of a horizontal kind or those that protect individuals against other individuals. Third, there are positive obligations that relate to the legal and administrative frameworks and those that encompass more practical measures that states need to take.Footnote 7
These positive obligations can be found in relation to almost every human right laid down in the Convention, including freedom of expression. As the ECtHR has put it: ‘Genuine, effective exercise of this freedom does not depend merely on the State’s duty not to interfere, but may require positive measures of protection, even in the sphere of relations between individuals.’Footnote 8 This is because of the key importance of freedom of expression as one of the preconditions for a functioning democracy; therefore, states must ensure that private individuals can effectively exercise the right of communication between themselves.Footnote 9
In deciding whether a positive obligation relating to the freedom of expression exists, the ECtHR has emphasised that regard must be had to the kind of expression rights at stake, their capacity to contribute to public debate, the nature and scope of restrictions on expression rights, the availability of alternative venues for expression, and the weight of the countervailing rights of others or the public.Footnote 10 It should be noted, however, that the Court itself looks at these criteria cumulatively in each particular case, attributing more or less weight to any one or set of these criteria, depending on the circumstances.
The ECtHR has emphasised several positive obligations of the state in the context of freedom of expression. For example, the state has to protect the right to freedom of expression by ensuring a reasonable opportunity to exercise the right of reply and an opportunity to contest a newspaper’s refusal of a right to reply in the courts.Footnote 11 It has also recognised a rather broad obligation to create a favourable environment for participation in public debate for all persons concerned, enabling them to express their opinions and ideas without fear.Footnote 12 Moreover, the state has a positive obligation to protect speakers, especially journalists,Footnote 13 from physical attacks by other individuals in connection with the exercise of their freedom of expression.Footnote 14 Recognising that there can be no democracy without pluralism,Footnote 15 in the context of access to the broadcast market, the ECtHR has emphasised that states must put in place an appropriate legislative and administrative framework to guarantee effective pluralism.Footnote 16
Other human rights protection bodies have also recognised that states not only have a duty to refrain from limiting the right to freedom of expression, but also have a positive obligation to protect and guarantee it. Although not explicitly recognising the concept of positive obligations under Article 11 of the CFREU, the Court of Justice of the EU has argued that it must be possible for national courts to check that interference with the information rights of internet users is justified.Footnote 17
Furthermore, the United Nations (UN) Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression in his report has emphasised that governments must adopt and implement laws and policies that protect private development and the provision of technical measures, products, and services that advance freedom of expression.Footnote 18
Therefore, it can be argued that freedom of expression not only prohibits unjustified interferences from states, but may also require active measures on their part. At the same time, the positive obligations of the state should not be understood too broadly. They have been recognised as applicable to a limited number of situations touching upon the very essence of the freedom of expression. In addition, they are more often than not formulated in a manner that provides states with a rather wide margin of appreciation.
In the second part of this chapter, I will analyse whether the existing rules on regulating social media platforms and internet intermediaries in general ensure the positive obligations of the state in protecting and ensuring freedom of expression in the online environment.
9.3 Regulation of Social Media Platforms and Internet Intermediaries: A Threat to Freedom of Expression?
There are several types of actors participating in the flow of information in the online environment. First, there are internet users – natural persons – who access information stored online. Sometimes these also share information, thereby in essence becoming content producers, the second category of online actors. Possibly the most complex of the three is the third category – the internet intermediaries, including social media platforms. They differ from content producers, that is, the individuals or organisations responsible for producing information in the first place and posting it online.Footnote 19 Internet intermediaries give access to, host, transmit, and index content originated by third parties or provide internet-based services to third parties.Footnote 20 The means by which they do this is almost without exception technical, meaning that the intermediaries are generally not aware of the content of the information they process and provide accessibility to. Such a role is unprecedented in the offline environment. Exactly because of this complexity, the regulation of internet intermediaries, including social media platforms, and their impact on freedom of expression is one of the focal points of this chapter.
The general and so far traditional rules of liability for internet intermediaries in the EU are set out in the E-Commerce Directive and, more recently, in the Digital Services Act (DSA). The Directive exempts intermediaries from liability for the content they manage if they fulfil certain conditions. First, service providers hosting illegal content need to remove it or disable access to it as fast as possible once they become aware of the illegal nature of the content. Second, only services that play a neutral, merely technical and passive role towards the hosted content are covered by the liability exemption.Footnote 21 To take effect, the Directive had to be transposed by the EU Member States into their national laws. The E-Commerce Directive thus laid the groundwork for a notice-and-take-down procedure but did not provide any additional guidelines with regard to its implementation. Instead, the Directive left the subject matter to the discretion of the Member States.Footnote 22
Certain Member States have developed more detailed, formal notice-and-take-down procedures. Possibly the best known of these is the German Network Enforcement Act, which was adopted in 2017. Among other rules, it requires large social networks to remove or block access to content that is manifestly unlawful within twenty-four hours of receiving a complaint about it.Footnote 23 Administrative penalties of up to €5 million can be applied in cases where the social network fails to comply with the rules set out in the Network Enforcement Act.
The majority of the Member States opted for a verbatim transposition of the Directive, resulting in a lack of any firm safeguards for the content removal procedures in most EU countries.Footnote 24 For example, the Latvian Law on Information Society Services in Article 10 provides rules for the liability of an intermediary service provider. Nevertheless, like the E-Commerce Directive, it does not specify what amounts to actual knowledge or immediate action of the service provider.
Recognising these issues and the need to harmonise intermediary liability rules throughout the EU, the European Commission proposed new rules for digital platforms.Footnote 25 The Regulation of the European Parliament and of the Council on a Single Market For Digital Services (DSA), amending the E-Commerce Directive, as stated by the Commission, strives to maintain the core principles of the E-Commerce Directive while providing more protection for fundamental rights in the online environment, as well as online anonymity wherever technically possible.Footnote 26 Although the regulation clarifies some aspects that were previously unclear under the E-Commerce Directive, the general rules on intermediary liability have been maintained.Footnote 27 For instance, the DSA maintains the exemption of online platforms from liability provided the social media platform was not actively involved in the transmission or took action to delete the illegal information upon obtaining knowledge. However, one noticeable distinction in regard to liability rules is the fact that the DSA further develops the due diligence obligations applicable to social media platforms and includes new rules related to illegal content, content moderation, and algorithm oversight. While such due diligence obligations depend on the role, size, and impact of social media platforms, the fines for non-compliance are high and can reach a maximum of 6 per cent of a company’s annual worldwide turnover.Footnote 28
Other pieces of EU legislation also impose obligations on intermediaries to decide which content should and which should not be available online. For example, the Audiovisual Media Services Directive promotes co-regulation and self-regulation.Footnote 29 It also imposes duties on video sharing platforms to eliminate from their platforms such harmful content as incitement to violence and hatred.Footnote 30 Under the General Data Protection Regulation, internet search engines are required to balance freedom of expression and privacy rights in applying the right to be forgotten.Footnote 31 The Code of Conduct on Countering Illegal Hate Speech Online, announced by the Commission in May 2016, incentivises information technology companies to tackle hate speech online on their own initiative.Footnote 32 Therefore, for now, and it seems also for the foreseeable future, the balancing of the various interests at stake is delegated to internet intermediaries.
Therefore, essentially it is the internet intermediaries, not the judiciary of states, that must take decisions on balancing freedom of expression, privacy, and other rights online. Although such a mechanism comes with some benefits, such as speedy procedures and a reduced workload for the courts, several issues arise from it, most visibly threats to the freedom of expression of users of social media platforms and to the public’s access to information removed by intermediaries. Furthermore, after social media platforms imposed indefinite suspensions on former US President Trump, one can add prior censorship concerns. The issue most closely related to the positive obligations of the state to ensure protection mechanisms for freedom of expression is that, in general, this type of intermediary liability regime puts private intermediaries in the position of having to make decisions about the lawfulness or unlawfulness of content and creates incentives for private censorship.Footnote 33 Several international organisations,Footnote 34 and other stakeholders,Footnote 35 have pointed to the fact that intermediaries should not be expected to conduct a quasi-adjudicatory exercise that weighs the rights of their users.
They have argued that the fact that intermediaries have the technical means to prevent access to content does not mean that they are best placed to evaluate the legality of the content in question, and that measures affecting fundamental rights should be applied by an independent court rather than by private bodies.Footnote 36 Intermediaries are commercial entities whose fear of potential liability, or lack of resources to fully address requests for removal of information, may motivate an overzealous response to individual requests that information be delisted.Footnote 37 As private actors, intermediaries are not necessarily going to consider the value of freedom of expression when making decisions about content created by third parties for which they might be held liable.Footnote 38
Another criticism of entrusting the balancing of fundamental rights to internet intermediaries is that by enlisting internet intermediaries as watchdogs, governments delegate online enforcement to algorithmic tools with limited or no accountability. Due process and fundamental guarantees are mauled by technological enforcement, curbing fair uses of content online and silencing speech according to the mainstream ethical discourse.Footnote 39 In addition, until the obligations contained in the DSA are implemented, the algorithms in most cases are known to the intermediary alone or are even considered to be a commercial secret. Such a lack of transparency in the intermediaries’ decision-making processes can obscure discriminatory practices or political pressure affecting the companies’ decisions. Consequently, transferring regulation and adjudication of rights to freedom on the internet to private actors does jeopardise fundamental rights in general – such as freedom of information, freedom of expression, and freedom of business – by limiting access to information, causing chilling effects, or curbing due process.Footnote 40 More particularly and closer to the topic of this chapter, it also influences the capacity of the states to carry out their positive obligations.
In the first part of this chapter, it was concluded that states must create a favourable environment for participation in public debate, protect speakers, and put in place an appropriate legislative and administrative framework to guarantee effective pluralism. We must remember that the ECtHR has recognised the state as the addressee of these duties, and human rights law does not, as a general matter, directly govern the activities or responsibilities of private business.Footnote 41 Yet, as emphasised earlier, the EU rules on intermediary liability follow a different logic.
Since internet intermediaries benefit, through advertising or subscription fees, from disseminating third-party content online, it seems only fair that they should bear responsibility for preventing access to illegal or harmful material.Footnote 42 At the same time, while seeking the most effective enforcement mechanism, the regulation introduced by EU law and transposed into national law shifts the balancing duty from the state to private companies – social media platforms. That means that it is not the states, but the online platforms that create the environment for participation in public debate. The limited potential to review decisions, often made using algorithms, is inevitably linked with limited possibilities to ensure the protection of speakers. It also leaves very few options for influencing pluralism in the online environment, which is becoming the principal source of information and communication in many countries.
Therefore, although the intermediary liability regime itself does not interfere with the freedom of expression, it creates a situation of horizontal interference resulting from a failure of the legislature to effectively protect the right to freedom of expression – a form of ‘State interference by proxy’.Footnote 43 The measures introduced by the DSA, which focus on ensuring more transparency and better protection of citizens’ fundamental rights online, including the obligation to provide information to users and establish a complaint and redress mechanism, are a significant step in the right direction. However, these steps should not be considered a substitute for the positive obligations of the state arising from the human rights standards set by the ECtHR and other international bodies.
9.4 Safeguards for Freedom of Expression
As just concluded, states have an obligation to effectively protect human rights from interference by other private individuals. Although not specifically mentioning the positive obligations of the state, the ECtHR has already argued that the state should protect freedom of expression online, not only by avoiding any limitations to it, but also by creating an appropriate legal framework. In Ahmet Yıldırım v. Turkey, one of the most important cases dealing with the accessibility of information online,Footnote 44 it indicated that Article 10 of the ECHR requires a law to provide safeguards that are intended to protect against the over-removal of information from the internet.Footnote 45
In addition, according to the UN Guiding Principles on Business and Human Rights, which require businesses to avoid causing or contributing to adverse impacts on human rights, the duty to protect and to provide access to an effective remedy for violations of human rights remains essentially incumbent on states.Footnote 46 Therefore, even though the document emphasises the duties of private stakeholders, it still underlines the role of the state in the protection of human rights.
Arguments supporting the positive obligations of the state in the context of freedom of expression and intermediary liability can also be found in legal doctrine. These arguments are based on the idea that, as powerful social media platforms have become central to communication and information exchange, the legal framework in which they operate must be compatible with human rights standards.Footnote 47 Therefore, content removal mechanisms should have a sufficient basis in law. To meet this requirement, the legislature should introduce specific legal provisions to clarify removal procedures. Legislation providing for content removal procedures should meet the requirement of ‘quality’. This means that rules should be clear and sufficiently precise for those subject to them to foresee the consequences and adjust their behaviour accordingly.Footnote 48
The positive obligations of states to protect freedom of expression are perhaps of even more relevance where such interferences are accepted, or even encouraged, by the states,Footnote 49 as is the case in the rules dealing with intermediary liability. The current intermediary liability regime, the most notable documents of which are the E-Commerce Directive and the DSA, includes (especially in the latter) some safeguards that could ensure the protection of the right to freedom of expression.Footnote 50 But its application across Europe is yet to be seen, and it does not relieve the state of its role as the ultimate guarantor of fundamental rights online. As stated earlier, the doctrine of the positive obligations of states in the context of freedom of expression may provide for the further legal protection of the users of social media platforms, their right to express themselves, and their right to have access to information.
9.5 Effective Remedies and Procedural Safeguards
When freedom of expression is violated, appropriate remedies may include access to information about the violation and grievance mechanisms.Footnote 51 In order for these safeguards for freedom of expression to be effective, states should ensure that the principles of due process and access to independent and accountable redress mechanisms are respected in their application.Footnote 52
First of all, users should be provided with the right to learn about the removal of the information they have published online.Footnote 53 Otherwise, they practically cannot protect their right to freedom of expression.Footnote 54
Although there is a chance that such a practice could backfire, as removed information may resurface through different channels, internet intermediaries should issue a notification to webmasters and content providers whenever they restrict access to information created by these actors. Such an approach is supported, for example, by the UN Special Rapporteur on the promotion and protection of freedom of opinion and expression.Footnote 55 It is also prevalent in the doctrine applied by the courts in cases concerning intellectual property rights in the US and Brazil.Footnote 56 The notification should, as far as possible, include the reasoning behind the decision, which, if necessary, can later be challenged. Moreover, in accordance with the Manila Principles on Intermediary Liability, before such a removal becomes permanent, the intermediary should weigh the arguments of the author of the information in question.Footnote 57 The DSA appears to ensure this remedy by introducing the concept of ‘notice and action’ and obliging social media platforms to inform users of the removal of their information, thereby rectifying the existing lacuna in the EU legal framework. However, the DSA applies only to intermediaries that offer their services in the EU single market.
Second, users who are notified by the service provider that their content has been flagged as unlawful should have the option of challenging the blocking or filtering of their content and of seeking clarifications and remedies.Footnote 58 The possibility to appeal could take various forms – using procedures provided by the intermediary or by a competent judicial authority.Footnote 59
The DSA notes that, first and foremost, the internet intermediaries themselves should have a review procedure in place. Such an idea is nothing new, and some intermediaries have already worked on it. In 2018, Facebook announced that it would create an independent oversight body to adjudicate appeals on content moderation issues.Footnote 60 The Board reviews a select number of highly emblematic cases and determines whether decisions were made in accordance with Facebook’s stated values and policies.Footnote 61 Individual users can bring appeals to the Board, and Facebook as a company is able to refer cases for expedited review if they could have urgent real-world consequences.Footnote 62 Most notably, the Oversight Board accepted a case referral from Facebook to examine its decision to indefinitely suspend former US President Donald Trump’s access to post content on Facebook and Instagram. Facebook has also requested policy recommendations from the Board on suspensions when the user is a political leader.Footnote 63
At the same time, the opportunities to contest the decisions of online platforms should complement, yet leave unaffected in all respects, the possibility to seek judicial redress.Footnote 64 There should always exist the option of judicial redress to ensure effective legal protection of the right to freedom of expression.Footnote 65 Indeed, the safeguards would become redundant if there were no option to receive a judicial overview of the decisions made by internet intermediaries. The liability placed upon private companies to remove third-party content without judicial oversight would not be compatible with international human rights law and freedom of expression specifically.Footnote 66
Although internet intermediaries are technically best placed to evaluate applications for content removal and act upon them, such a mechanism might not be completely compatible with the requirements of legality and quality of law. While it remains to be seen how online platforms will implement the transparency rules provided by the DSA, the internal decision-making procedures of intermediaries have so far not always been transparent. Therefore, removal orders issued by independent and impartial bodies provide a much greater degree of legal certainty.Footnote 67
Third, the independence of internal complaint and redress mechanisms and the readiness of social media platforms to follow the decisions of oversight boards also depend on the goodwill of the platforms. The change in ownership in 2022 of one of the biggest social media platforms, Twitter, illustrates how rapidly the existing policies of a social media platform can change and how fragile reliance only on internal redress mechanisms created by the platform can be. A court or similar authority would operate with greater safeguards for independence, autonomy, and impartiality,Footnote 68 and would have greater capacity to evaluate the rights at stake and offer the necessary assurances to the user.Footnote 69 Such authorities, as public (not private) bodies, would also be better placed to determine whether particular content is illegal, which requires careful balancing of competing interests and consideration of defences.Footnote 70
The positive obligation of states to protect freedom of expression entails the creation of a system that would allow individuals and other content creators to protect their freedom of expression also where it is exercised through social media platforms and internet intermediaries. These intermediaries should abide by the rules of freedom of expression themselves, but at the end of the day, the state cannot rely on these private players to protect speech online. The DSA envisages a new enforcement mechanism, which will complement the internal complaint and redress tools established by social media platforms. This mechanism will consist of the Commission and independent national authorities, which will supervise how online intermediaries, including social media platforms, adapt their systems to the new requirements. To safeguard the freedom of expression, the independence of these bodies and the possibility of challenging their decisions in the courts are of crucial importance. Critics have rightly pointed out that the oversight of very large platforms is entrusted to the European Commission, which is not an independent regulator but the executive arm of the EU.Footnote 71
9.6 Rules on Balancing
At this point, it is clear that monitoring and restricting access to information online would be impossible without the involvement of internet intermediaries. They act as gatekeepers with direct access to the keys needed to close the gates to malicious or otherwise illegal information. The tools available to states are far more cumbersome and slower to deploy. This means that to ensure effective protection of human rights online, states have to cooperate with internet intermediaries. Therefore, besides creating a mechanism for protecting freedom of expression online, states have to set clear rules for internet intermediaries to follow in deciding which information remains accessible and which should be removed. Clear rules at the national level are especially relevant, since only some illegal content is defined at the EU level, while some content might be found illegal and subject to removal only in certain states.
Internet users should always be able to understand why certain information has been removed.Footnote 72 The changes introduced by the DSA are intended to ensure that social media platforms notify users of the reasons for removing their content and of the option to contest such decisions.
States should ensure that the removal of information or the blocking of access is undertaken in observance of the principle of freedom of expression.Footnote 73 In mediating between the public interest and individual rights online, a delicate regulatory balance is required.Footnote 74 Therefore, the state should provide guidance for intermediaries on how to achieve it.Footnote 75 The DSA could play a role in establishing standards and best practices in this area, as it seeks to balance the measures imposed on platforms to remove harmful information against the resulting restrictions on freedom of expression. Yet how the DSA will be applied at both the EU and national level remains to be seen.
9.7 Conclusions
Social media platforms and internet intermediaries in general are becoming, and in many societies already are, the dominant actors in providing an environment for freedom of expression and communication. Yet the legal regulation of platforms has not always kept pace with technological developments and changes in society so as to ensure respect for freedom of expression and other fundamental rights. The idea, prevalent during the emergence of the online environment, that it should be left free and unregulated so as to ensure a marketplace of ideas gradually gave way to increasing state regulation aimed at protecting other fundamental rights, for instance human dignity, and ensuring non-discrimination. The first signs of this shifting attitude among states and international bodies were evidenced by the judgment of the ECtHR in Delfi v Estonia,Footnote 76 which for the first time imposed liability on a news platform for failing to remove derogatory comments created by its users in a timely manner. The Court reached this decision even though the platform had acted quickly once notice was received. The judgment was followed by laws restricting freedom of expression and imposing further obligations on social media platforms in a number of European countries,Footnote 77 and even by debates in the US about amending section 230 of the Communications Act of 1934, which grants websites legal immunity for much of the content posted by their users.Footnote 78
While such moves by states are understandable as a means of protecting other fundamental rights and of countering the spread of misinformation and harmful content in the online environment, the safeguards for freedom of expression seem to have been forgotten in the process. Online intermediaries, which have implemented various technical tools to block and remove information in response to demands by states, are now effectively in charge of setting the boundaries of freedom of expression – a role that should be performed by states. Freedom of expression is essential to democracy, and states should not shirk their obligations to ensure it and to decide about its limits by outsourcing this function to private corporations. Social media platforms and internet intermediaries in general have no democratic legitimacy, nor do they in many cases share the goal of protecting this fundamental right, and the process of limiting expression is in most cases accompanied by a lack of clear rules, transparency, and access to remedies. It must be remembered that the right to freedom of expression entails not only negative obligations of the state, but also positive ones. The doctrine of these positive duties is still developing and awaits clarification, especially in the context of information published online. It can nevertheless already be applied to improve the safeguards of freedom of expression for users of social media platforms and to discourage platforms from over-removing content in order to avoid liability. The DSA is certainly a step forward; however, its scope is limited and its application is yet to be seen.
Last but not least, the majority of existing legal standards focus on developing content moderation rules that balance freedom of expression against the removal of illegal content by social media platforms. Yet another, equally important threat to freedom of expression in the online environment is the fact that the vast majority of public discourse takes place on a very small number of platforms, which hold excessive power over the flow of information.Footnote 79 As stated earlier, the ECtHR has emphasised that states must put in place an appropriate legislative and administrative framework to guarantee effective pluralism in the media ecosystem.Footnote 80 Therefore, states are also obliged to decentralise the channels of public discourse and to develop rules that favour the emergence of new social media platforms.