I. Introduction
In an effort to combat harmful content online and shift the power imbalance in speech governance, the European Union adopted Regulation 2022/2065, the Digital Services Act (“DSA”).Footnote 1 The DSA marks a watershed moment in platform governance, adding a significant accountability and transparency framework onto the existing liability framework of the e-Commerce Directive.Footnote 2 The DSA promises to ensure a safe, predictable and trusted online environment in which fundamental rights enshrined in the Charter of Fundamental Rights of the European Union (“CFR”) are better protected.Footnote 3 The DSA creates a layered approach to platform obligations, increasing duties as the size of the platform increases. Platforms with the most reach, and therefore the most potential for harm, have the most obligations under the DSA.Footnote 4 Very Large Online Platforms and Very Large Online Search Engines (“VLOPSEs,” platforms with more than 45 million monthly active users)Footnote 5 must abide by a range of risk-mitigating and transparency-building measures,Footnote 6 whereas micro- and small enterprises are excluded from the scope of a number of DSA provisions.Footnote 7
In addition to creating a multi-layered regime for platform obligations, the DSA introduces a new administrative authority at the national level, tasked with overseeing DSA compliance: the Digital Services Coordinator (“DSC”).Footnote 8 The DSC has a two-dimensional relationship with the right to freedom of expression of European internet users in enforcing the DSA. The first dimension is that DSCs can shape the freedom of expression of internet users through drafting norms for content moderation practices and empowering civil society actors to influence the content moderation process along the lines of national laws and speech traditions; although there is a common core of freedom of expression in Europe, there are regional differences. The second dimension is that DSCs protect the right to freedom of expression of internet users by acting as a complaint platform for internet users, and by sanctioning platforms for non-compliance. Their role is therefore also to rebalance the power in the online sphere between internet users, providers of intermediary services, and state actors.Footnote 9 If enforcement protects the freedom of expression of users, differences in enforcement can lead to different levels of protection of that freedom for internet users in the EU.
This article explores the role of the DSC in the DSA in light of its two-dimensional impact on the freedom of expression of internet users in the EU. It asserts that, while the freedom of expression of internet users is anchored to an extent in DSA enforcement, there is a seeming disconnect between the impact DSCs can have on freedom of expression and the safeguards around their position. To this end, it is structured as follows: the first section describes the position of the DSC in the DSA and how its powers interact with the freedom of expression of internet users. The second section argues that the decentralised enforcement structure of the DSA affects the protection of the freedom of expression of internet users. The third section makes suggestions as to how freedom of expression can be better safeguarded in the DSC’s role in enforcing the DSA.
II. The digital services coordinator
This section dissects the role of the DSC in the DSA and outlines how its powers interact with the freedom of expression of internet users. Subsection 1 describes the role, competence, and institutional embedding of DSCs, Subsection 2 provides a brief introduction to the legal framework for freedom of expression, Subsection 3 explores how the DSCs’ powers interact with freedom of expression, and finally Subsection 4 provides an interim conclusion.
1. DSCs as an institution and their competence
DSCs are national administrative authorities tasked with enforcing the DSA in a Member State.Footnote 10 Member States may choose to create a new enforcement authority to fulfil the role of DSC, but can also appoint an existing authority, or even divide the enforcement tasks over multiple existing authorities under one “coordinating” DSC.Footnote 11 DSCs – as decentralisedFootnote 12 enforcers of EU law – are embedded in the procedures of the DSA, but also in national administrative law, which lays down the procedural framework around their supervisory tasks and powers. This contribution touches upon that framework, but predominantly from the perspective of the DSA.Footnote 13 Currently, Member States have appointed existing consumer-, media-, and telecom-protection authorities as DSCs. DSCs must perform their DSA tasks in an impartial and timely manner.Footnote 14 Member States shall provide them with sufficient technical, financial, and human resources to fulfil their oversight and enforcement tasks.Footnote 15 DSCs must fulfil their tasks with complete independence, without prejudice to judicial review or budgetary control.Footnote 16 The principle of independence in this regard is contentious. Independence of DSCs could be necessary since they concern themselves with freedom of expression on the internetFootnote 17; independence limits the procedural autonomy of Member States, however, and in the case of the DSA is not derived from primary law.Footnote 18 Independence in the DSA is therefore “rather thin.”Footnote 19 The possibility of appointing different, existing actors, the oversight role of the Commission (which is inherently not an independent authorityFootnote 20), and the various mandatory cooperation mechanisms all limit the supposedly required independence of the DSC.
An independent EU agency for oversight was proposed, but eventually rejected because “it stood in the way of expediency of the legislative process;”Footnote 21 that is, harmonisation of aspects of enforcement was not desired at this stage.
DSCs are responsible for the oversight of providers of intermediary services with their main establishment in their Member State.Footnote 22 DSCs have competence when an intermediary service provider located in their jurisdiction violates the DSA, when an inhabitant of that Member State complains about non-compliance of providers of intermediary services with the DSA, and when they receive complaints from another jurisdiction about an intermediary service provider located in their jurisdiction. DSCs rely on cooperation with other DSCs when enforcing against providers of intermediary services outside of their jurisdiction.Footnote 23 Competence as regulated in the DSA is without prejudice to Union law and national rules on private international law, which can aid in determining which DSC is relevant in a particular case.Footnote 24 However, the cross-border nature of the internet and the issues addressed in the DSA require an EU-wide approach.Footnote 25 The European Board of Digital Services was established to streamline that cooperation.Footnote 26 The Board consists of high-level officials from each Member State’s DSC, and is chaired by the European Commission.Footnote 27 It contributes to a consistent application of the DSA across the European Union – a feature discussed more extensively in Section III.
Decentralised enforcement is a deliberate choice: it ensures that procedures take place in close proximity to citizens of Member States, and avoids unnecessary extension of the Commission’s competence. That said, the EU regulator codified an exception to this: VLOPSEs. VLOPSEs are challenging to oversee due to their size and the limited enforcement power available at the national level.Footnote 28 Further, the EU regulator may have feared regulatory capture if VLOPSE enforcement were located in a single Member State, potentially endangering the harmonisation and cooperation necessary to ensure a safe online environment.Footnote 29 After a VLOPSE is designated as such, DSCs share competence with the Commission.Footnote 30 The Commission mitigates potential large-scale harms within the EU arising from VLOPSEs by (partly) enforcing the DSA for intermediary service providers with more than 45 million monthly active users.Footnote 31 The Commission has exclusive competence over the obligations laid down in Chapter III Section 5, containing predominantly transparency and risk-mitigation obligations.Footnote 32 The Commission may also enforce other obligations under the DSA against VLOPs and VLOSEs when it deems this appropriateFootnote 33; this could be the case for a systemic infringement of the DSA that seriously affects recipients in a Member State.Footnote 34 Once the Commission has initiated proceedings under Article 65(1), Member States can no longer start proceedings on the same infringement, in observance of the ne bis in idem principle.Footnote 35
At first glance, this reduces the impact of DSCs on freedom of expression; after all, most content is disseminated through VLOPSEs, for which competence is partially vertically centralised with the Commission. However, the DSC Database maintained by European Digital Rights shows that nearly all DSCs have received complaints under Article 53 DSA for enforcement against online platforms, and the Irish DSC has requested information from twelve intermediary service providers.Footnote 36 In cases of complaints, Article 53 does not provide the possibility of submitting a complaint with the Commission,Footnote 37 only with the DSC of establishment; therefore, for all cases outside of Chapter III Section 5, DSCs are the enforcing actor toward VLOPSEs for individual user complaints.Footnote 38 For all obligations outside of Chapter III Section 5, the Commission’s enforcement role is limited both procedurally, due to the required appropriateness of its involvement, and substantively, since provider oversight in many cases requires significant knowledge of the impact of services on society and significant “political-cultural know-how.”Footnote 39 This supports the role of the DSC as a key actor in DSA enforcement.
Recent cases, such as the Belgian investigation into Telegram,Footnote 40 a messaging service notorious for the spread of illegal content that does not yet qualify as a VLOP, or the investigation into Twitch, a streaming service located in Luxembourg and associated with the livestreaming of public shootings, show that enforcement at the national level by DSCs is highly relevant for platforms just below the VLOPSE threshold.Footnote 41 Conversely, inquiries prompted by the Romanian DSC into the role of TikTok in the Romanian elections illustrate the relevance of DSCs as a signalling entity, even when it comes to enforcement against VLOPSEs: the Romanian elections were annulled because of alleged interference through TikTok.Footnote 42 This resulted in the Romanian DSC requesting the blocking of TikTok in Romania,Footnote 43 and the European Commission launching an official investigation.Footnote 44 Individual cases and the Commission’s significant but limited competence underscore the need for freedom of expression safeguards around the role of DSCs as national enforcers of the DSA.
2. Freedom of expression
EU citizens enjoy a similar right to freedom of expression under Article 11 CFRFootnote 45 on the internet as they do in the “offline world,” with the notable distinction that, because of the nature of the internet, harms can be exacerbated by increased reach and longer availability.Footnote 46 Correspondingly, the right to freedom of expression requires speakers to take into account the wide audience they reach by communicating online: freedom of expression on the internet is a complex balancing act between the rights of the speaker and the societal interests concerned with being exposed to that speech.Footnote 47 Aside from an active right to freedom of expression, Article 11 CFR also encompasses a “passive” right to receive information.Footnote 48 This includes a right to access the internet.Footnote 49 As such, interferences with access to (parts of) the internet must be balanced against the freedom to access information.Footnote 50
The scope of freedom of expression on the internet is determined predominantly through content moderation. Content moderation is the process by which providers of intermediary services determine what can be expressed and seen on the internet through removing content, amplification and de-amplification, and the design of affordances.Footnote 51 The process involves a network of stakeholders affecting content moderation, such as civil society, users and government officials.Footnote 52 This process is primarily governed by the intermediary service provider, creating a complicated landscape in which private actors determine the scope of public values such as freedom of expression.Footnote 53 This creates a tension: fundamental rights norms, such as Article 11 CFR, are primarily addressed at the state or supranational organisations such as the EU; providers of intermediary services, as private parties, are not directly bound by Article 11 CFR.Footnote 54 Providers of intermediary services may be subjected to a horizontal effect of freedom of expression, depending on the national jurisdiction,Footnote 55 and can be affected by fundamental rights norms in the interpretation of private law, but such approaches are not clear-cut and differ per Member State.Footnote 56 Similarly, the DSA, despite being inherently a market harmonisation instrument,Footnote 57 imposes some fundamental rights obligations on providers of intermediary services.Footnote 58 Some scholars argue that the nature of individual fundamental rights is less compatible with the scale of content moderation,Footnote 59 and that the application of fundamental rights to the digital realm in its current form requires rethinking.Footnote 60 Until then, the realisation of fundamental rights in the digital sphere depends on how platforms moderate content, and on how regulators require platforms to include fundamental rights considerations in the content moderation process.
DSCs affect freedom of expression against this backgroundFootnote 61: they enforce a regulation that directly affects the content moderation process, which invariably has effects on the freedom of expression of internet users. In their enforcement, they are required to take freedom of expression into account, both through application of the Charter and indirectly through the DSA.Footnote 62
3. DSCs interacting with freedom of expression
DSCs, in their role as enforcers of the DSA, have a number of tasks that affect the freedom of expression of internet users. As noted above, DSC tasks are predominantly aimed at content moderation on a systemic level, focusing on procedural safeguards and risk mitigation, not on individual pieces of content.Footnote 63 Their competences can be divided into two themes: powers that directly affect content moderation (II.3.a and II.3.b), and powers that affect content moderation indirectly through investigating or sanctioning intermediary service providers (II.3.c and II.3.d).
a. Powers that affect the content moderation process
DSCs may affect content moderation through Article 9 DSA, which allows “relevant national judicial or administrative authorities” to order a provider of intermediary services to act against illegal content.Footnote 64 DSCs explicitly fulfil a role as go-between for judicial or administrative authorities issuing orders to remove content or orders to provide information; they must both inform other DSCs of the content of such orders and be informed by them in turn.Footnote 65 The institutional set-up explained in Section II.1 creates a peculiarity in this framework. The DSC itself qualifies as a relevant national authority, if so appointed under national administrative law.Footnote 66 Aside from the DSC, it is possible that the authority hosting the DSC, e.g., the consumer authority of a Member State, qualifies as a relevant national administrative authority. In the first instance, the interface with freedom of expression is clear: if the DSC issues an order to act against illegal content and the provider of intermediary services follows up on that order by limiting access to the litigious content, the right to freedom of expression of the internet user is affected. In the second instance, if the hosting authority orders action against illegal content, the DSC may still be involved, and therefore interacts with the right to freedom of expression of internet users. DSCs are currently part of oversight authorities that have tasks under national or EU law, which can qualify them as relevant national administrative authorities, allowing them to submit orders to act against illegal content or to request information. This can blur the oversight roles within a single organisation.
Ideally, oversight authorities separate their tasks: the requirement of independence for DSCs as laid down in Article 50(2) DSA requires that “competent authorities […] act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions […].”Footnote 67 However, there is no guidance on what safeguards around the dual functioning of DSCs should look like in practice, meaning that, as noted above, independence is “rather thin.”Footnote 68 A consumer authority might issue an order to act against illegal content, but it would be unclear whether the order originated from the DSC hosted at the consumer authority or from the consumer authority itself. In any case, if the provider of intermediary services follows up on that order, the freedom of expression of the internet user is interfered with on behalf of the DSC or its hosting authority.
A second possible interference with freedom of expression is found in the explicit role for DSCs in formulating codes of conduct. Under Article 45, DSCs should encourage and facilitate the creation of codes of conduct, together with the Commission.Footnote 69 In encouraging or facilitating, DSCs can take varying roles which may affect the content of codes of conduct, for example by encouraging moderation of a specific type of content.Footnote 70 Although codes of conduct are a voluntary effort, significant informal pressure emanates from such encouragement or facilitation for providers of intermediary services to partake, which may lead to cynicism about the presentation of state-led regulatory efforts as “voluntary.”Footnote 71 That “voluntary” nature is further undermined by the fact that non-participation in codes of conduct can be interpreted as a contra-indicator for DSA compliance.Footnote 72 If platforms remove content on the basis of codes of conduct that are not codified in law, this can be interpreted as an interference with the freedom of expression of internet users that lacks a legal basis.Footnote 73
This reasoning also applies to the enforcement of Article 14 on terms and conditions and Article 16 on notice and action mechanisms. Article 14 imposes a number of requirements on providers of intermediary services regarding their terms and conditions, such as providing information on the policies, procedures and tools used for content moderation, and stipulates that providers of intermediary services must act diligently and objectively and enforce their terms of service in a proportionate manner, with due regard for the freedom of expression and the pluralism of the media.Footnote 74 Terms and conditions are commonly used by platforms to form guidelines that shape speech on their platforms.Footnote 75 This is one of the first regulatory interventions in the drafting and enforcement of terms and conditions, and it provides significant power for DSCs to intervene in private norm-setting, for example on the moderation of certain types of content under terms and conditions.Footnote 76 Quintais et al. underline this operational concern and raise questions on the enforcement of Article 14(4) in light of how well fundamental rights align with large-scale content moderation, as touched upon in II.2.Footnote 77 The enforcement of the article can therefore have far-reaching consequences, since moderation on the basis of terms and conditions forms the overwhelming majority of all content moderation decisions.Footnote 78
How DSCs enforce or sanction compliance with Article 16 forms a similar intervention in the content moderation process. Article 16 requires intermediary service providers to facilitate the submission of notices on illegal content. The mechanisms by which notices are shaped can affect the degree to which users report illegal content.Footnote 79 More pertinent for DSCs’ effect on the freedom of expression of users is the enforcement of Article 16(6).Footnote 80 Article 16(6) requires that intermediary service providers process notices in a timely, diligent, non-arbitrary, and objective manner.Footnote 81 The standards by which intermediary service providers moderate on the basis of notices determine the scope of freedom of expression of the users whose content is the subject of those notices. Enforcement aimed at the processing of such notices therefore shapes the right to freedom of expression online.
In a further intervention in the content moderation process, DSCs are tasked with certifying stakeholders in the content moderation process such as out-of-court dispute settlement bodies and trusted flaggers. If a user disagrees with the decision of an internal complaint-handling system on a complaint regarding a content moderation decision,Footnote 82 they have the right under the DSA to select any certified out-of-court dispute settlement body to appeal that decision.Footnote 83 Out-of-court dispute settlement bodies are independent entities certified by the DSC that decide on such matters in a fair, swift and cost-effective manner.Footnote 84 A well-known current example of such an entity is the Meta Oversight Board, which, although it has been criticised as a form of “governance-washing,” has a significant norm-setting power in issuing decisions, affecting Meta’s content moderation practices.Footnote 85
Out-of-court dispute settlement bodies under Article 21 may have a less profound impact due to the fact that their decisions are non-binding.Footnote 86 Their impact is also limited by the fact that internet users are free to choose the out-of-court dispute settlement body of their preference; users are free to avoid bodies that do not align with their values.Footnote 87 However, the normative power that non-binding decisions exert over content moderation practices can shift existing policy in favour of avoiding future adverse decisions, in line with the reasoning of the “chilling effect” platform liability can have on freedom of expression.Footnote 88 Scholars so far have differing views on the impact out-of-court dispute settlement bodies will have on content moderation and freedom of expression in general: Wimmers argues that certifying more entities to decide on content moderation leads to further fragmentation, which goes against the spirit of the DSA.Footnote 89 He further argues that “free speech disputes do not lend themselves to settlement by private bodies.”Footnote 90 Ortolani proposes a more nuanced “wait-and-see” approach with regard to out-of-court dispute settlement bodies.Footnote 91 Assuming that out-of-court dispute settlement bodies significantly affect content moderation because the normative weight of their decisions recalibrates content moderation practices, a question arises about the DSC certifying such bodies: DSCs have the power to accredit entities that affect content moderation, an impact that is felt Union-wide, making the selection and accreditation of these entities an important interface with users’ freedom of expression.
The role of individual DSCs in this is crucial: if an out-of-court dispute settlement body fails to meet the requirements laid down in Article 21(3), only the certifying DSC can revoke that body’s status, limiting the safeguards for EU citizens outside that Member State who may be affected by the normative power of an out-of-court dispute settlement body’s decisions.
The selection and certification of trusted flaggers presents a similar interface with freedom of expression. Trusted flaggers are state and non-state entities that have priority access to the content moderation process, meaning that notices, “flags,”Footnote 92 they submit with the provider of intermediary services are treated expediently and with priority.Footnote 93 DSCs appoint trusted flaggers based on expertise, independence, diligence, accuracy and objectivity; priority of those flags is granted based on the experience such entities hold on a specific topic, usually leading to higher quality notices that aid online platforms in their content moderation process.Footnote 94 The status of trusted flagger can, especially when state entities act as trusted flaggers, exert pressure on providers of intermediary services to comply with removal requests.Footnote 95 This has a direct impact on the visibility of user-generated content. Trusted flaggers can report content on the basis of national or EU law following Article 3(h), creating a risk of extraterritorial application of national law.Footnote 96 Unlike with out-of-court dispute settlement bodies, users do not have the freedom to select the trusted flaggers involved in their content moderation process, which exacerbates these concerns. Selecting which entities take part in the content moderation process requires DSCs to take into account the freedom of expression of internet users, or at least to consider the capacity of the trusted flagger to do so. However, similar to Article 21(3), the DSA does not provide clear guidance on Article 22(2) in this regard, and the Commission has yet to exercise its power to issue guidelines on the certification of trusted flaggers under Article 22(8).Footnote 97
b. The European Board of Digital Services
DSCs cooperate across the EU in the European Board of Digital Services. The Board is tasked with harmonising DSA enforcement across the EU. On a general level, DSCs contribute in the Board to drafting guidelines on matters covered by the DSA.Footnote 98 Although the Commission has explicit competences to issue guidelines, its role in the Board is limited because it has no voting powers. DSCs, in their norm-setting role in the Board, influence how the DSA is enforced throughout the European Union. In this role, they need to cooperate with twenty-six other DSCs, but larger DSCs, hosting more online platforms, may have more influence in the setting of guidelines, since they oversee more and larger providers of intermediary services and have more resources to engage in norm-development.Footnote 99 They have, as such, a more prominent position in shaping freedom of expression online than other DSCs: a “Dublin-effect.”Footnote 100
Aside from general guidelines on DSA enforcement that may affect content moderation, there is an explicit role for the Board in crisis situations. The Board makes recommendations to the Commission on triggering the crisis response mechanism laid down in Article 36(1). In such cases, VLOPSEs must enact crisis response mechanisms that include assessing whether the functioning of their services contributes to a serious threat to public security or public health.Footnote 101 VLOPSEs must then apply specific, effective, and proportionate measures to mitigate those threats and report back on these measures to the Commission.Footnote 102 These measures can have a significant impact on content moderation in the European Union: case law of the European Court of Human Rights such as Cengiz and Others v Turkey and Ahmet Yıldırım v Turkey underlines that crises can interfere with freedom of expression on the internet.Footnote 103 Similarly, the COVID-19 crisis showed that internet services are key sources of information on public health, sparking a rich debate on content moderation, freedom of expression, and disinformation.Footnote 104 The power to issue recommendations on enacting crisis protocols therefore has a significant impact on the freedom of expression of internet users. The influence of a single DSC in the grander scheme of Article 36 may be limited: a recommendation is issued by the Board, not by a single DSC, and the Commission has the final say in declaring a state of crisis.Footnote 105 However, a DSC that is particularly influential due to its size and the number of providers of intermediary services it hosts (the “Dublin-effect”) may inspire the Commission to declare a state of crisis. It may be difficult for smaller, less influential DSCs to acquire a similar status.
In the recent case of the alleged interference in the Romanian elections via TikTok, it proved difficult for the Romanian DSC to trigger such a crisis response with regard to TikTok.Footnote 106
c. Powers related to investigating service providers
To enforce the DSA, DSCs are attributed a range of investigatory and sanctioning powers relating to non-compliance with the DSA by providers of intermediary services. Part of these powers is aimed at creating knowledge about platform practices. Regulators suffer from a significant knowledge deficit, and the DSA provides a toolbox for overcoming it.Footnote 107 DSCs can request updated transparency reports from intermediary service providers,Footnote 108 and, together with the Commission, request access to the risk assessments of VLOPSEs established in their jurisdiction.Footnote 109 DSCs can also request access to data from VLOPSEs under Article 40(1) for the purpose of monitoring and assessing compliance with the DSA, by way of a reasoned request.Footnote 110 DSCs also vet researchers, meaning they issue data access requests under Article 40(8) on behalf of researchers seeking to study systemic risks (Article 34(1)).Footnote 111 In granting applications, DSCs have a role in shaping the interpretation of systemic risk, even though that falls under the Commission’s competence.Footnote 112 The information gathered through these research and data access requests shapes the understanding, and therefore the enforcement, of the systemic risk assessment, mitigation and auditing framework.Footnote 113
Aside from requesting data access, the DSC can request information from providers of intermediary services, as well as from auditing organisations, including all members of staff.Footnote 114 It can further inspect the premises of providers of intermediary services and confiscate information relating to a suspected infringement.Footnote 115 These are so-called “dawn raids,” which are already prevalent in other fields of law, such as consumer protection and competition law. In those fields they are considered impactful measures from a procedural and human rights perspective, underlining the DSC’s significant investigatory powers under the DSA.Footnote 116
d. Powers related to sanctioning intermediary service providers
If investigatory powers indicate a violation of the DSA, the DSC can use its sanctioning powers to combat the infringement. It can agree on binding commitments with providers aimed at ceasing the infringement,Footnote 117 or order cessation of the infringement under a periodic penalty.Footnote 118 It can also fine providers of intermediary servicesFootnote 119 or adopt interim measures to avoid the risk of serious harm arising from the infringement.Footnote 120 Penalties imposed on providers of intermediary services must be effective, proportionate and dissuasive; if sanctions do not lead to cessation of the infringement and the infringement causes serious harm that cannot be avoided, the DSC has the power to require the management body of the provider of intermediary services to examine the situation and adopt an action plan that must be carried out and reported upon.Footnote 121 If the infringement persists, entails a criminal offence, and continues to involve a threat to the life or safety of persons, the DSC can request a competent judicial authority in its Member State to temporarily restrict access to the service infringing the DSA or, if that is technically infeasible, to the online interface of the provider of intermediary services.Footnote 122 The power to block access to platforms has been linked to freedom of expression infringements by the European Court of Human Rights, and underlines the enforcement power that DSCs hold.Footnote 123
e. Safeguards around DSA enforcement
The DSA must be enforced in light of the principle of proportionality: any action taken must be proportionate to the nature, gravity, duration, and recurrence of the infringement, without unduly restricting access to the service; measures must also take into account the technical and operational capacity of the provider of the intermediary service concerned.Footnote 124 Furthermore, Member States need to lay down rules and procedures to ensure that the exercise of sanctioning powers complies with applicable national law, the Charter, and general principles of EU law.Footnote 125 Parties affected by DSA enforcement enjoy the right to an effective judicial remedy through a range of private and public enforcement mechanisms.Footnote 126 Public enforcement mechanisms involve filing complaints with Digital Services Coordinators under Article 53 or access to national courts.Footnote 127
In terms of substantive fundamental rights, Article 51(6) emphasises the right to respect for private life and the rights of defence, including the right to be heard and access to the file.Footnote 128 Recital 153 further emphasises that the right to freedom of expression must also be taken into account in DSA enforcement.Footnote 129 Exactly how the right to freedom of expression is to be taken into account in the enforcement of the DSA is not clarified. DSCs likely have fundamental rights anchored in their national systems as well, but the scope of those fundamental rights may differ, as illustrated in Section III.2. This raises the question of whether freedom of expression is sufficiently anchored in DSA enforcement.
4. Interim conclusion
Section II has outlined the various interfaces between the DSC’s role under the DSA and the freedom of expression of internet users. DSCs hold significant powers that intervene in the content moderation process, and as such the DSC has the potential to influence freedom of expression on the internet: a responsibility that requires considerable safeguards. These safeguards, outlined in Section II.3.e, are relatively limited. Under the DSA, the DSC also has the responsibility to protect the freedom of expression of internet users. In that protecting role, the decentralised enforcement network of the DSA presents a challenge. This is explored in the next section.
III. Decentralised enforcement leading to different applications of the DSA
This section builds on the previous one by explaining how the decentralised enforcement framework of the DSA can fragment its application, which can diminish the protection of freedom of expression. Decentralised enforcement has benefits: enforcement takes place in close proximity to where the effects are felt, builds on “political and cultural know-how,” and Member States maintain administrative and procedural autonomy.Footnote 130 However, this article proposes that the decentralised structure of the DSA can lead to different levels of protection of freedom of expression, which is not sufficiently mitigated in the DSA enforcement network. This is explained following an introduction to the DSA’s decentralised enforcement structure.
1. The DSA’s decentralised enforcement structure
The DSA’s enforcement is dictated by national administrative authorities and the European Commission, relying on the competence structure described in Section II.1.Footnote 131 Rademacher and Marsch describe the DSA enforcement network as both horizontally and vertically centralised.Footnote 132 DSA enforcement is centralised horizontally by a country-of-origin principle,Footnote 133 somewhat resembling the one-stop-shop found in the General Data Protection Regulation (“GDPR”).Footnote 134 That framework can serve as a frame of reference for decentralised enforcement; other fields of EU law, such as consumer or competition law, have a different competence structure.Footnote 135 DSCs have competence over the providers of intermediary services located in their jurisdiction; if a provider falls outside their competence, they must rely on other DSCs to investigate and enforce.
The DSA creates a legally binding cooperation mechanism that is distinct from the framework in the GDPR. Under Article 58(1), a DSC of destination (i.e., of the Member State in which the intermediary service is delivered) can request the DSC of establishment to take investigatory and enforcement measures to ensure compliance with the DSA.Footnote 136 Such a request requires the DSC of establishment to provide an assessment of the suspected infringement and an explanation of any investigatory or enforcement measures taken within two months of the request.Footnote 137 If these are deemed insufficient, the Commission acts as a dispute resolution mechanism, issuing non-binding conclusions to the DSC of establishment indicating whether the investigatory or enforcement measures are sufficient.Footnote 138 Similarly, the DSC of establishment can request, or be recommended to launch, a joint investigation, enabling more enforcement capacity in complicated matters.Footnote 139 These avenues help overcome potentially inactive administrative authorities; without such cooperation mechanisms, activation would have to be realised through infringement procedures under Article 258 TFEU.
Decentralised enforcement structures are no novelty in EU law. However, DSA enforcement is also vertically centralised, which is markedly different from other fields of law. For example, in data protection law the Commission is a relatively subordinate actor due to the strict independence of national data protection authorities,Footnote 140 and in consumer protection law the Commission does not have significant enforcement powers but is tasked with streamlining enforcement by independent national authorities.Footnote 141 As discussed in Section II.1, the Commission has exclusive competence over VLOPSE-specific obligations under Section 5 of Chapter 3, and can initiate proceedings against VLOPSEs for other obligations if appropriate. The Commission’s position raises concerns about the independence requirement for enforcement of the DSA, since the Commission is an inherently political institution.Footnote 142 Adding to this, the Commission assumes significant powers regarding the fundamental rights of EU citizens, despite the DSA being based entirely on a legal basis of market harmonisation under Article 114 TFEU, raising concerns of competence creep.Footnote 143 From a practical perspective, however, the Commission’s involvement alleviates the enforcement burden on national coordinators in dealing with the largest intermediary service providers, and prevents regulatory capture of a single national authority as occurred in data protection.Footnote 144 The vertical centralisation of enforcement in the DSA also mitigates concerns arising from a one-stop-shop model. However, three factors troubling freedom of expression in the DSA’s enforcement remain: (i) different scopes of freedom of expression facilitated by the DSA; (ii) differences in the procedural frameworks of Member States, leading to different levels of protection; and (iii) differences at an institutional level: the DSA is enforced by different types of institutions with varying levels of enforcement experience and resources, which can affect the nature of enforcement and its effectiveness.
2. Different scope of freedom of expression facilitated by the DSA
Fragmentation in the application of the DSA may be caused by substantively different interpretations of its contents by Digital Services Coordinators.Footnote 145 The primary concern is the wide definition of “illegal content” in Article 3(h), a core concept throughout the DSA. In its current phrasing, Article 3(h) is a content-agnostic enigmaFootnote 146: “any information that […] is not in compliance with Union law, or the law of any Member State which is in compliance with Union law.”Footnote 147 The definition is central to the DSA and bears upon aspects that the DSC needs to enforce. In theory, its scope of application is near-limitless; Mauritz explains that Article 3(h) obfuscates the question of which law is applicable, because it lacks a clear connecting factor linking the circumstances of a case to applicable national law.Footnote 148 Holznagel underlines this with an exampleFootnote 149: Holocaust denial is explicitly criminalised in some EU countries, such as Germany and France, but not in others, where it may be covered by more general hate speech provisions.Footnote 150 Another example: Hungary’s anti-LGBT laws currently outlaw companies advertising in solidarity with the LGBT community.Footnote 151 This raises the question of whether, for example, the Irish DSC must sanction intermediary service providers that fail to suspend accounts spreading content related to these examples under Article 23(1).Footnote 152
The only backstop delimiting “illegal content” in the DSA is found in the phrasing “in compliance with Union law,” meaning that the Charter can render any law incompatible with EU law inapplicable. This operationalises the principle of mutual trustFootnote 153 and confirms a basic understanding of what defines illegal content. This appears to be a sufficient safeguard, but in practice it creates a number of tensions. Firstly, it forces entities seeking a definition of “illegal content,” such as providers of intermediary services, trusted flaggers, out-of-court dispute settlement bodies, but also Digital Services Coordinators, to evaluate whether the law they seek to apply complies with the Charter. This is a complicated endeavour that, aside from raising a range of legitimacy and rule-of-law concerns related to this quasi-constitutional review of the laws at issue, could exceed the expertise of such entities, making it an ineffective backstop. Secondly, it is unlikely that the interpretation of this backstop will be uniform, potentially endangering the coherence of Union law. Article 61(2) attributes a harmonising role in this matter to the Board of DSCs.Footnote 154 However, it provides no clear avenue for DSCs to seek a guiding interpretation of the DSA from the Board, and even less so for out-of-court dispute settlement bodies and trusted flaggers, who ultimately interpret Article 3(h) by themselves. This may result in content being removed on the basis of Charter-incompatible or inapplicable law, contravening the legality requirement for interferences with the right to freedom of expression laid down in Article 52(1) CFR.Footnote 155
3. Different procedures on a national level
Differences in the implementing laws of the DSA can also affect its application. Under the discretion that Member States have to appoint supervisory authorities for each of the DSA’s oversight tasks, different types of entities across Member States can hold the same supervisory tasks under the DSA.Footnote 156 The degree to which these enforcement structures can affect enforcement can be illustrated using the Netherlands and Austria as an example. An internet user in Austria complainsFootnote 157 to the Austrian DSC (KommAustria) about being presented with advertisements based on profiling as defined in Article 4(4) GDPR,Footnote 158 by the provider of an online platform with its main establishment in the Netherlands, which infringes Article 26(3) DSA.Footnote 159 The Austrian DSC must refer this matter to the Dutch DSC (Autoriteit Consument en Markt).Footnote 160 Under the Dutch implementation law, the Dutch DSC will forward the case to the Dutch Data Protection Authority (“DPA”), which is attributed the task of overseeing Article 26(3).Footnote 161 The Dutch DPA can then fine the intermediary service provider or agree on binding commitments, likewise under Dutch administrative law.Footnote 162 In the reverse scenario, if the user were a Dutch citizen complaining about an online platform established in Austria, the Dutch DSC would forward the case to the Dutch DPA.Footnote 163 The Dutch DPA would forward the case to the Austrian DPA, which would then forward the case to KommAustria, because KommAustria is the authority competent to enforce the DSA, including Article 26(3), under the Austrian implementation.Footnote 164 The final result is that, depending on the national implementation of the DSA, the same complaint is addressed by two different institutions.
The enforcement structure illustrated above is complex, raising concerns that are only partly mitigated in the DSA. This structure can be complicated further where composite procedures are added because a Member State has a federal structure, meaning that competences may be attributed to regional rather than national authorities. In other fields of law, complex enforcement structures have been shown to ultimately reduce accountability and the protection of individual rights.Footnote 165 In DSA enforcement, each coordinating DSC is responsible for the uninterrupted functioning of its national composite procedures.Footnote 166 However, composite procedures may be interrupted by information deficits due to insufficient sharing of information,Footnote 167 by difficulties in the supervision of administrative action, and, in the case of one-stop-shop mechanisms, by administrative paralysis due to the overload that can arise from many platforms being located in a single Member State.Footnote 168 Similarly, complaints under Article 53 suffer reduced potential for judicial review when they involve a cross-border aspect, since in principle only courts of the same legal order can review decisions of administrative authorities. This implies that complainants may be unable to challenge a decision taken by a foreign DSC unless it is adopted by their own DSC in their own jurisdiction, or that such a challenge requires the national court of one jurisdiction to review the assessment of a court in another jurisdiction. Seeing as the national law applied in these cases is crucial (see II.2), this may leave complainants with limited potential for judicial review.Footnote 169
National administrative law can also lead to diverging levels of protection under the DSA. DSCs are ultimately rooted in their respective national legal orders, creating an interaction between national and EU law. Rules for such interactions are missing, according to van Cleynenbreugel.Footnote 170 National authorities could afford more protection under national administrative law than required under principles of EU administrative law,Footnote 171 resulting in different levels of protection, which may subsequently affect the right to freedom of expression. National administrative law, then, determines the level of protection that citizens enjoy, potentially creating tensions in cross-border settings.Footnote 172 For example, appealability may differ under national administrative law. That affects decisions such as the appointment of trusted flaggers and out-of-court dispute settlement bodies, or bilateral commitments with providers of intermediary services: all decisions that shape freedom of expression on the internet. If one Member State has strict time limits for administrative procedures, while an internet user awaits an enforcement decision against a provider of intermediary services in the jurisdiction in which that provider is located, delays across jurisdictions may render the enforcement decision no longer appealable. This diminishes the protection and the right to judicial review of EU internet users.
4. Different enforcement strategies leading to different applications of the DSA
Member States can select existing authorities, found new authorities, or select multiple authorities as DSC; currently, they overwhelmingly opt for appointing existing authorities.Footnote 173 The nature of the institution hosting the DSC affects the application of the DSA in three ways: (i) differences based on available resources; (ii) differences based on enforcement strategies and priorities; and (iii) differences based on the enforcement traditions of the national authority selected as DSC. Different enforcement strategies indirectly shape the freedom of expression of internet users: they determine the degree to which intermediary service providers can self-regulate, how strictly the DSA is enforced, but also how internet users are protected through effective remedies for potential infringements.Footnote 174
The first factor causing different levels of enforcement relates to resources and available expertise. Article 50(1) requires that DSCs have the necessary technical, financial, and human resources to supervise all providers of intermediary services within their competence,Footnote 175 solidifying sincere cooperation in DSA enforcement between Member States.Footnote 176 Some DSCs will therefore have to be larger than others, owing to the number of providers of intermediary services established in their territory. However, differing levels of resources, particularly under-resourcing, can impact DSA enforcement. Orlando-Salling proposes that the financial resources of DSCs in “peripheral”Footnote 177 Member States, against a background of fiscal and debt crises, could strain the enforcement of the DSA in those Member States.Footnote 178 Orlando-Salling rightly notes that this creates an imbalance in the degree to which the DSA is shaped by “core” Member States, leaving citizens of “peripheral” Member States potentially less protected.Footnote 179 As a result, DSCs with limited resources may not have the full expertise required to deal with complicated matters at scale. In other sectors with a decentralised enforcement structure, such as competition law and data protection, differences in resources have been found to lead to fragmented enforcement.Footnote 180 Being under-resourced can cause (a) case overload and (b) failure to investigate potentially new infringements, ultimately endangering the protection of fundamental rights, such as the right to freedom of expression in the case of the DSA.
For example, in a data protection setting, the EU Agency for Fundamental Rights notes that insufficient resourcesFootnote 181 have caused DPAs to prioritise certain enforcement tasks, leaving them unable to take action against potential harms to data subjects.Footnote 182 Strowel and Somaini theorise that the Irish DSC will be strained for resources due to the large number of providers of intermediary services established in that Member State.Footnote 183 Similarly, underfunding and understaffing have been identified as barriers in consumer law enforcement.Footnote 184 Differences in resources could ultimately affect DSA enforcement as well, and thus how the freedom of expression of EU internet users is protected by DSCs.
The second factor that can affect DSA enforcement is varying enforcement strategies. Decentralised authorities can serve national interests in enforcing EU law.Footnote 185 Gentile and Lynskey describe how different enforcement strategies led to a fragmented application of the GDPR.Footnote 186 Enforcement tasks are interpreted differently across competent authorities: some authorities are more aggressive in enforcement, others more passive.Footnote 187 National interests can play a role in this regard: being a lenient authority can be a way to attract companies, as A-G Bobek observes in Facebook Belgium.Footnote 188 This may also apply in a DSA context: DSCs may choose to be more lenient in investigating and sanctioning intermediary service providers in order to maintain their establishment in that jurisdiction. Lenient enforcement can affect internet users’ rights, de facto sacrificing the protection of freedom of expression online for economic gain. These practices can inspire forum-shopping, meaning that intermediary service providers choose their place of establishment based on the stance that the DSC in that jurisdiction takes toward enforcement.Footnote 189
A final factor contributing to different enforcement strategies is the enforcement traditions on which appointed DSCs build. While some DSCs, building on an institution concerned with media law, can rely on expertise in matters concerning freedom of expression, other DSCs, building on consumer protection or traffic authorities, have less experience in this regard. At the very least, these authorities may have different priorities based on the field of law in which they are rooted.Footnote 190 Conversely, entities rooted in a media law background may lack the enforcement experience to fully exhaust the investigative and sanctioning powers outlined in Sections II.3.c and II.3.d, since their field traditionally does not involve dawn raids and hefty fines as competition law or consumer law do.Footnote 191 This, again, requires expertise-building at the level of the DSC, which may lead to different enforcement of the DSA.
5. Interim conclusion
Sections II and III have described the role of the DSC in relation to internet users’ freedom of expression. On the one hand, the DSC shapes freedom of expression through powers that affect the content moderation process; on the other hand, it protects the freedom of expression of internet users by sanctioning intermediary service providers and enforcing the DSA. In that regard, the DSA is likely to be enforced differently across Member States, because of different national landscapes, interests, and administrative procedures. While this is not problematic in itself, the protection of freedom of expression under the DSA becomes a concern when enforcement leads to the extraterritorial enforcement of inapplicable laws, or when enforcement strategies and practices harm the protection of internet users’ freedom of expression. This supports the assertion in the introduction: there is a seeming disconnect between the profound impact that DSCs can have on the freedom of expression of internet users and the safeguards surrounding their role. This is addressed in the next section.
IV. Suggestions for better safeguarding freedom of expression in DSA enforcement
This section describes some avenues for better safeguarding freedom of expression in the application of the DSA. These are by no means exhaustive, and some of the risks identified above are inherent to the enforcement structure chosen in the DSA and cannot be altered without significant amendments. However, two avenues fitting the existing framework of DSA enforcement may mitigate the risks to freedom of expression outlined above: centralisation in cases of common concern, beyond what is currently foreseen in Article 56 DSA, to ensure better protection of freedom of expression in such cases; and guidance by the Board to standardise DSA enforcement, clarify the interpretation of illegal content, and better streamline exchanges on national law, for more coherent DSA enforcement with regard to freedom of expression.
1. Further centralise DSA enforcement
Firstly, in complex cases with significant fundamental rights impact beyond VLOPSEs, the Commission and/or the Board should be involved in enforcement. DSA enforcement is vertically centralised for VLOPSEs, but some platforms with fewer than 45 million monthly active users raise similar concerns despite not meeting the “simple” user-number threshold.Footnote 192 For example, Section II.1 mentioned the relevance of non-VLOPSE platforms such as Telegram or Twitch. Instances in which non-VLOPSEs can affect European elections, involve widespread censorship allegations, or cause significant harm to European citizens across borders could warrant enforcement beyond the competence of the DSC of establishment. In the context of data protection, which lacks the DSA’s vertical centralisation, Bastos and Palka suggest that enforcement must be centralised with the Commission or the EDPB in “cases of common European concern”Footnote 193 to overcome enforcement deficits and protect the constitutional right to data protection.Footnote 194 Although the VLOPSE regime already accounts, to a degree, for centralisation in cases of common European concern,Footnote 195 further centralisation in sensitive cases avoids the complications arising from enforcement at the national level described in Section III. Centralising enforcement in sensitive non-VLOPSE cases can ensure that freedom of expression is not harmed by inactive or politically motivated DSCs, and that the application of national law in such cases does not create interferences with freedom of expression across the EU. This can be achieved in two ways: vertical centralisation with the Commission, and centralisation with the Board.Footnote 196
Attributing enforcement competence beyond VLOPSEs to the Commission in cases of common European concern requires amending the DSA. Centralisation is enabled to some extent by the procedure laid down in Article 59 on referral to the Commission, but that article only foresees the Commission reviewing enforcement actions taken by national authorities, not deciding on enforcement itself. This means that the risk of regulatory capture or administrative paralysis in sensitive cases remains located at the national level. The Board could recommend whether an online platform has a particular sensitivity toward fundamental rights in the EU that invokes the competence of the Commission, much akin to the framework of Article 58(2). Incorporating complaints by service recipients across the EU into the assumption of this competence maintains the complainant’s proximity to the procedure that will ultimately affect their freedom of expression, while simultaneously aligning with the Parliament’s wish for more involvement of the Commission in the DSA’s enforcement.Footnote 197 This ensures that the DSA’s application is not nationalised and that “difficult” platforms do not necessarily need to be supervised by a single DSC; essentially, it sacrifices a degree of Member State competence for more efficient enforcement. Of course, there are downsides to this approach: the Commission is not currently attributed competence in these matters, and its competence would require Member States to cede additional enforcement powers.
This approach may also contravene the principle of mutual trust, because it implies the ineffectiveness of national enforcement in particularly sensitive cases.Footnote 198 Additionally, the critique in Section II on the independence of the Commission as an enforcement actor in the DSA framework still stands; this suggestion would therefore require a DSA amendment or Member States explicitly ceding additional enforcement competence to the Commission. Enforcement in such instances could also be advised by the Board under Article 63(1)(d), further solidifying Europe-wide support for the Commission’s enforcement action.
Centralising enforcement with the Board could be achieved by allowing the Board to issue binding recommendations on initiating joint investigations. Currently, the Board may recommend joint investigations to the Member State of establishmentFootnote 199; if the Member State of establishment disagrees with the requested enforcement action, it communicates this in its preliminary position.Footnote 200 If that position is deemed insufficient, the Board may refer the matter to the Commission for dispute resolution under Article 59, which would ultimately require the DSC of establishment to report back to the Commission, having taken “the utmost account” of the Commission’s view.Footnote 201 Allowing the Board to make binding requests for joint investigations, by amending Article 60, would ensure enforcement in cases of common European concern even where the DSC of establishment is unwilling to enforce the DSA, without the complexity of an infringement procedure ex Article 258 TFEU. This would, of course, affect the procedural autonomy of Member States; to justify that, the DSA would need to be amended, and such requests should be supported by a majority or qualified majority of DSCs.
2. Further guide DSA enforcement
A second avenue that should be pursued is better use of guidelines by the Commission and the Board to align DSA enforcement with freedom of expression. This is already explicitly established throughout the DSA, for example in the context of the selection of trusted flaggers,Footnote 202 and for the Board in the development of European guidelines.Footnote 203 Some aspects of the DSA require further guidance by the Commission and the Board. Practical guidelines on ensuring the independence of DSCs from their hosting authority are currently missing; as illustrated in Section II.3.a, the “thin” independence requirement can blur competences and raise questions about the nature of DSA enforcement. Similarly, the Board must coordinate the harmonious application of the DSA; the Commission may, through its seat on the Board, nudge the Board in that direction. Guidelines should be issued on interpreting illegal content under Article 3(h); these guidelines should address (a) the connecting factor of a set of laws to a particular caseFootnote 204 and (b) how the safeguard that national law must comply with EU law can best be interpreted by actors relying on the definition in Article 3(h).
Further, the interpretation of national laws may, in cross-border cases, depend on foreign DSCs. Where the substance of those laws is unclear, there are options for sharing information and mutual assistance between DSCs. The lex ferenda proposed by Rademacher and Marsch, “the introduction of an explicit horizontal preliminary ruling procedure under EU secondary law,” would streamline this even further.Footnote 205 If a national court of the competent DSC, or the competent DSC itself, has difficulty interpreting aspects of a foreign law, it could refer those issues back to a court or the DSC in the requesting Member State. This may prevent the issues of extraterritorially or wrongly applied national laws identified inter alia under II.2, and it has the added benefit of allowing the complainant to be in closer proximity to, and perhaps even appeal, decisions taken by the court or DSC in their Member State in that preliminary reference procedure. The downside of this proposed solution is that the risks of composite procedures, such as delay and administrative paralysis, compound when the composite procedure requires authorities to refer issues back to Member States that originally forwarded the complaint; as such, the expediency of the procedure is an important factor to take into account before relying on Rademacher and Marsch’s proposed horizontal procedure.
V. Conclusion
This article has analysed the role of the Digital Services Coordinator under the DSA in light of the freedom of expression of internet users. It highlights how DSCs, on the one hand, shape content moderation practices and, on the other hand, how the DSC’s role in the decentralised enforcement structure of the DSA affects the protection of users’ freedom of expression from platforms’ content moderation practices. Both of these dimensions ultimately affect the right to freedom of expression of internet users. Safeguards around the protection of that right are sparse in the DSA: as an instrument, it mostly emphasises the responsibility that intermediary service providers have with regard to respecting freedom of expression, but it does not acknowledge the risks that enforcement of the DSA at the national level can pose to the freedom of expression of internet users, whether through varying speech traditions or through its decentralised enforcement framework. DSCs bear responsibility toward freedom of expression only in generally phrased safeguards, without a sufficient risk mitigation mechanism in place to address manifesting harms. This contribution proposes that these risks can be mitigated by a more uniform understanding of “illegal content” facilitated by horizontal preliminary proceedings, a possible enforcement competence for the Commission or the Board beyond VLOPSEs, and exhaustive use of guidance by the Board, especially on the requirement of independence for DSCs.
DSCs have a set of instruments to affect content moderation and to regulate the platforms that ultimately determine users’ freedom of expression online. How these instruments are used is therefore key to creating a digital space in which freedom of expression is protected. The DSA talks the talk when it comes to the protection of freedom of expression. If the DSA is to truly protect freedom of expression online, its enforcement should walk the walk.
Acknowledgments
The author is grateful to Catalina Goanta, Peter Blok, Janneke Gerards, Viktorija Morozovaite, Lisette Mustert, and Nadya Purtova, and participants of the RESHUFFLE Young Scholars Workshop for comments on earlier drafts and insightful conversations on this topic. All mistakes are my own.