A. Introduction
This Article considers strategic litigation within the rights and enforcement architecture of the General Data Protection Regulation [hereinafter GDPR], Digital Services Act [hereinafter DSA], and Digital Markets Act [hereinafter DMA],Footnote 1 arguing for the value of private enforcement strategic litigation to further the rights of internet users while highlighting some of the barriers to bringing such cases. The GDPR, DSA, and DMA constitute distinct but overlapping legal structures for the human rights of internet users—one area of digital rights. The EU’s 2016 GDPR provides human rights regulation applicable to tech giants’ processing of data about internet users, but there is a gap between the GDPR’s substance and its enforcement in practice.Footnote 2 Strategic litigation by civil society has stepped into this gap. This includes public enforcement strategic litigation against regulators to enforce the GDPR against internet giants, and private enforcement strategic litigation upholding individuals’ rights directly against companies.Footnote 3 The new DSA and DMA regulate internet giants, but their public enforcement architecture further fractures regulation across multiple regulators that could make inconsistent decisions, causing incoherence in EU law. These new laws also centralize public enforcement power in the European Commission, marginalizing and disempowering civil society, which will not have standing to directly challenge the Commission in strategic litigation.
Private enforcement strategic litigation in the interests of internet users could counteract legal incoherence from regulatory fragmentation, and counterbalance disempowerment of civil society in DSA and DMA public enforcement, but there are procedural barriers to such litigation. Strategic litigation based on a private right of action against a company can incorporate relevant principles across multiple areas of law that apply to the same set of facts, promoting legal coherence by enabling judicial decisions that integrate distinct but overlapping laws. Civil society will need to overcome significant difficulties to bring this kind of strategic litigation, such as uncertainty over cross-border jurisdiction and access to legal expertise. The new EU regime for mass claims through representative actions potentially expands legal opportunity structures for private enforcement litigation under GDPR, DSA, and DMA. However, mass claims entail additional procedural requirements for standing or admissibility, which may render some of these mechanisms unusable for strategic litigation.
Efforts to advance digital rights for internet users have relied on strategic litigation as a catalyst to enliven legal rights and opportunities. Internet tech giants’ infringements of users’ human rights are diffuse, opaque, and occur at scale across millions of people. Strategic litigation can promote human rights in this context by: developing legal rights and protections through judicial decisions; mobilizing regulators to correctly interpret and enforce the law through judicial review of a regulator; pressuring big tech companies to change their practices using litigation directly against a company; or raising awareness and seeking remedies for many users in a mass claim. Legal opportunity structures for litigation are necessary for strategic human rights litigation—civil society cannot bring such litigation if the rules of standing, for example, do not recognize individuals or NGOs. Yet, the mere existence in law of digital rights and legal opportunity structures for litigation is not sufficient to empower people against internet tech giants without the catalytic effect of strategic litigation.
I discuss different kinds of strategic litigation, which enable a range of approaches to advance digital rights with impact beyond the specific circumstances and actors in a case. Strategic litigation can arise in relation to public enforcement of the law where there is a legal challenge of a regulator’s decisions or actions aimed at establishing a particular interpretation of the law or changing the behavior of regulators. Strategic litigation also arises in private enforcement where non-state actors litigate directly against an actor that has infringed their rights.Footnote 4 Private enforcement is not limited to pursuing commercial interests—it encompasses strategic litigation brought against companies in the public interest. The Court of Justice of the EU [hereinafter CJEU] has recognized private enforcement litigation as an integral part of enforcing EU law and that damages claims help deter conduct that infringes EU law protections.Footnote 5 Mass claims against companies, which are one type of private enforcement, will tend to be strategic litigation given that they concern the rights of many individuals, although some may argue that those mass claims driven primarily by a profit motive without a broader strategy to change the behavior of companies fall short of being strategic.
My Article adds to existing literature on digital rights by providing analysis of strategic litigation opportunities across both private and public enforcement architecture for the GDPR, DSA, and DMA, analyzing these legal opportunity structures in the round and not siloed from each other. Academic literature has tended to focus on public enforcement of the GDPR by regulators, and the role of regulators in other areas of law such as consumer law.Footnote 6 The value of private enforcement strategic litigation to digital rights has been under-explored, and only a few commentators have focused on private enforcement under the DSA or DMA.Footnote 7 Analysis of the EU’s new representative action regime has largely come from the consumer law space,Footnote 8 and the interplay of this new regime with digital rights laws needs further study. Only a small number of civil society actors, mainly None of Your Business [hereinafter NOYB] and BEUC—the European consumer organization—have worked on the interplay of the new representative action regime with digital rights laws such as the GDPR, DSA, and DMA.Footnote 9 My Article makes a contribution to the literature by analyzing the enforcement architecture of the GDPR, DSA, and DMA as a system, incorporating the novel area of representative actions, and focusing on power dynamics for strategic litigation by civil society to reveal shortcomings in public enforcement legal opportunity structures that could be addressed via private enforcement strategic litigation.
The next section sets out digital rights issues in the audience economy of the internet, and different kinds of civil society actors that have different priorities for strategic litigation on these issues. Section C looks at the substantive content of the GDPR, DSA, and DMA, highlighting their human rights dimensions. The DMA is also relevant for businesses—e.g. business users of big tech platformsFootnote 10 —who may also seek to bring strategic litigation; however, I limit my consideration to the rights of individual internet users. Section D outlines opportunities for civil society in these laws’ public regulatory enforcement frameworks, which are more accessible in the GDPR than in the DSA and DMA. Section E turns to the avenues for civil society to mount strategic litigation directly against big tech under the GDPR, DSA, and DMA, including through mass claims under the new Representative Action Directive [hereinafter RAD].Footnote 11 I outline the RAD’s implementation in Germany, Portugal, and Ireland, which illustrates the highly uneven approaches across jurisdictions that may undermine its effectiveness.
B. Digital Rights, the Internet Audience Economy, and Civil Society
This section provides background on digital rights in the context of the internet audience economy, and different kinds of actors in civil society in digital rights litigation. Understanding the internet audience economy and the different kinds of civil society actors bringing strategic digital rights litigation serves as context for discussion in later sections of the legal structures for internet regulation that civil society navigates.
I. Digital Rights and the Internet Audience Economy
This Article looks at strategic litigation concerning internet tech giants to uphold “digital rights,” which are human rights in the modern digital age. Digital rights are not an entirely new set of rights; rather, the term digital rights acknowledges that the use of digital technology can negatively affect existing human rights.Footnote 12 Digital rights are not limited to privacy, but include all human rights depending on the context. For example, the right to non-discrimination can be infringed when ads are targeted at users in particular demographics, such as showing job ads for doctors only to men. Data protection under the GDPR reflects one part of the larger picture of human rights in the digital age. This Article looks at the horizontal human rights effects for users from big tech companies such as Meta, which owns Facebook and Instagram, in the current internet model of an audience economy. These companies also have human rights impacts on other stakeholders, such as the workers reviewing content in content moderation systems, but the present analysis is limited to individual internet users.Footnote 13
In the internet audience economy, profit is driven by advertising targeted at users based on profiles that are constructed from data about users, which are collected from a myriad of sources, then passed to vast networks of intermediaries that process the data for profiling and targeting. Amnesty International highlighted human rights risks posed by internet giants in this audience economy in the 2019 Surveillance Giants report.Footnote 14 The business model of internet giants can negatively affect many human rights, including privacy, freedom of expression and thought, and non-discrimination.Footnote 15
However, data processing and companies involved in the audience economy form a complex web that is difficult to map and understand. Van der Vlist and Helmond have used partner directories to map the audience economy, which is “a complex global and interconnected marketplace of business intermediaries involved in the creation, commodification, analysis, and circulation of data audiences for purposes including but not limited to digital advertising and marketing.”Footnote 16 Their research looked at the partnerships between social media platforms, 67 audience intermediaries “that create software tools, products, and services for shaping the creation, buying, modelling, measurement, and targeting of data audiences,”Footnote 17 and other commercial actors in the audience economy. They found 11,490 partnerships and integrations in the audience economy—partnerships are both technical and commercial arrangements, creating a vast network in which data about internet users are processed for profit.
Most of us are unaware of how data about us is extracted and exploited for profit in this vast network of commercial actors that is hidden from users as we seek information and communicate online. The audience economy is opaque to us. In this complex and opaque context, individuals struggle to enforce their rights against internet giants,Footnote 18 and civil society organizations play an important role as intermediaries to uphold digital rights and access to justice through strategic litigation.
II. Civil Society working on Digital Rights in the Audience Economy
Although I refer to “civil society” throughout this Article, there are a range of different actors in civil society working on digital rights and big tech with differing values, goals, and strategies—these actors span the ideal categories of actors identified in the framing paper.Footnote 19 Digital rights and privacy NGOs, such as the UK-based Foxglove and the Austria-based NOYB, play a key role in relation to big tech companies. These NGOs both have a wealth of legal expertise and from that perspective might best be understood within “the corporation” category. Individuals have played an important role in digital rights strategic litigation, in part because the GDPR is structured around the rights of individuals, for example litigation by Max Schrems—who founded NOYB—related to Facebook, and claims by Johnny Ryan related to targeted advertising.Footnote 20 Such individuals tend to be embedded in networks of human rights and digital rights NGOs, and thus do not fit neatly within “the loner” category in the framing paper.
Consumer protection associations, NGOs that advocate for consumer rights, are another kind of civil society actor relevant to big tech.Footnote 21 Even though individuals often do not pay money to use digital services provided by big tech, individual users are nevertheless consumers in relation to the companies.Footnote 22 Consumer protection associations have taken enforcement action against internet giants under the GDPR as well as relying on consumer protection and competition laws.Footnote 23 These consumer protection associations vary in size, priorities, level of EU expertise, and appetite for litigation, and so some are closer to “the organization” ideal category, while others are better characterized as “the corporation” in the framing paper.
Whilst acknowledging this complexity among the civil society actors, I treat the goal of upholding human rights from infringement by internet giants as a broadly shared concern for my discussion of strategic litigation. At the same time, the varied nature of these actors means that some will more easily overcome structural barriers to strategic litigation posed by standing. “Civil society” in this Article encompasses both individuals and organizations acting in the public interest.
C. EU Law’s Regulation of Big Tech: GDPR, DSA, and DMA
Having introduced civil society actors advancing digital rights in the audience economy of the internet, I now turn to the key laws that form the legal structures these actors navigate to set out their overlapping content and show where their legal opportunity structures do and do not provide strategic litigation opportunities. This section discusses the rights and protections in the GDPR, DSA, and DMA as key digital rights instruments. However, this is not a comprehensive analysis of applicable laws because, for example, consumer protection and competition laws also apply to the conduct of big tech towards users, as later sections touch on. The GDPR, DSA, and DMA are all regulations that are directly applicable across member states, harmonizing EU digital rights law on internet giants.
I. GDPR
Adopted in 2016, the GDPR requires that personal data—data about individuals—be processed in line with principles and protections that reflect the right to data protection contained in Article 8(1) of the EU Charter of Fundamental Rights. Data protection is not only a matter of privacy. The data protection principles set out in GDPR Article 5 include requirements that personal data be processed lawfully, fairly, and transparently; that personal data be collected for specific legitimate purposes and not processed for other purposes, commonly referred to as “purpose limitation;” and that personal data be accurate.
Individuals, referred to as “data subjects,” have a set of rights under the GDPR, which should have the overall effect of enabling individuals to control data about them. These include rights to have inaccurate personal data rectified; to “data portability,” meaning that an individual can move their data; and to object to some kinds of data processing.Footnote 24 The GDPR also imposes obligations on data “controllers,” meaning any person or entity that determines the purposes and means of processing personal data, and on “processors,” who process personal data on behalf of a controller. The GDPR is not limited to big tech, applying generally to data processing by any actor in most contexts. “Processing” is defined extremely broadly, including collection, structuring, storage, alteration, and erasure.Footnote 25
II. DSA
The DSA came into full effect on February 17, 2024, regulating internet intermediary services for “a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected.”Footnote 26 “Intermediaries” are digital services that shape our use of the internet, including social media platforms such as Facebook, search engines such as Google, and online marketplaces.
Unlike the GDPR, the DSA provides very few substantive protections for internet users, instead taking a procedural approach. The DSA introduces a package of transparency and procedural measures for online platforms, meaning services that host content, to address illegal content through content moderation.Footnote 27 DSA procedural provisions require platforms to establish a “notice and action” mechanism, which allows civil society to notify platforms of illegal content and obliges platforms to take content moderation action in response.Footnote 28 The DSA provides additional procedural obligations for very large online platforms (VLOPs) and very large online search engines (VLOSEs). VLOPs and VLOSEs are those with 45 million or more average monthly users in the EU.Footnote 29 The additional obligations primarily concern assessment and mitigation of “systemic risk,” which includes negative effects on human rights.Footnote 30
Alongside these procedural provisions, the DSA has a few substantive protections for users. The DSA prohibits dark patterns, profiling users for targeted ads based on special category data such as religion or sexual orientation, and targeting ads at minors based on profiling.Footnote 31 There is also an obligation for VLOPs and VLOSEs to provide a version of their recommender systems without profiling.Footnote 32 As Farinho has highlighted, dark patterns and profiling were already subject to a degree of regulation under the GDPR.Footnote 33 The DSA extends and increases data protection by prohibiting dark patterns and restricting particular profiling and targeting practices.Footnote 34 As discussed below, these substantive prohibitions and obligations are likely to have direct effect, allowing users to bring strategic litigation against tech platforms.
III. DMA
The DMA regulates major online platforms that act as “gatekeepers” to ensure contestable and fair digital markets to the benefit of business and end users. Gatekeepers are designated based on their size; their control over an important gateway connecting business users and consumers; and their entrenched position in the market.Footnote 35 The Commission designated six companies as gatekeepers in September 2023: Alphabet, including Google and YouTube; Amazon; Apple; ByteDance, which owns TikTok; Meta, including Facebook and Instagram; and Microsoft.Footnote 36
Gatekeepers’ substantive DMA obligations—as set out in Articles 5–7—are quite detailed and technical and overlap with some existing protections such as purpose limitation, consent, and data portability under GDPR, as well as consumer law prohibitions concerning unfair practices.Footnote 37 Several DMA obligations benefit individual users by prohibiting gatekeepers from exploiting their market power in relation to anticompetitive or unfair agreements or practices, data protection, interoperability, and transparency.Footnote 38 For example, gatekeepers must obtain users’ consent to track users for targeted advertising purposes outside of a gatekeeper’s core platform service, or use personal data from a core platform service in another of the gatekeeper’s services.Footnote 39 These provisions benefitting users are likely to have an implied right of action for users to litigate against gatekeepers based on direct effect, discussed below.
D. Civil Society’s Role in Regulatory Enforcement
Before embarking on analysis of the GDPR, DSA, and DMA’s enforcement mechanisms and potential for strategic litigation, the power dynamics of different kinds of mechanisms are worth noting. Access to justice is a well-established concept in international human rights.Footnote 40 The UN’s Guiding Principles on Business and Human Rights set out three types of access to remedy mechanisms, which I use to categorize the remedy mechanisms under the GDPR, DSA, and DMA: State-based judicial mechanisms, state-based non-judicial grievance mechanisms, and non-state based grievance mechanisms.Footnote 41 The public/private enforcement distinction cuts across the category of state-based judicial mechanisms, which is the mechanism for strategic litigation. Where the decision of a regulator is challenged in court the judicial mechanism relates to public enforcement, and where civil society litigates against a non-state actor the judicial mechanism provides private enforcement.
There are different power dynamics inherent in each type of mechanism. The role of civil society and business is structurally empowered in private enforcement through state-based judicial mechanisms and non-state-based grievance mechanisms.Footnote 42 In the courtroom of a state-based judicial mechanism, civil society acting in the public interest of rights holders will be a party to litigation with the same kind of power over the conduct of proceedings as a defendant company. By contrast, state-based non-judicial grievance mechanisms of public regulatory enforcement have an asymmetric structure in which business has greater power than people.Footnote 43 When civil society makes a complaint, the regulator has powers to decide whether and how to respond, and procedural fairness rules primarily concern the company that may be subject to the regulatory decision, not the complainant. These asymmetrical dynamics are illustrated starkly in litigation over a regulator’s decision or action: a business will tend to have a right to appeal a regulator’s decision concerning it, but the complainant may not have standing, as discussed further below.Footnote 44
Awareness of these different kinds of enforcement mechanisms and their different power dynamics enables clearer analysis of the enforcement architecture of the GDPR, DSA, and DMA as a system, including the existence or absence of strategic litigation opportunities in these laws’ legal opportunity structures.
I. Data Subjects and Data Protection Authorities Empowered by GDPR
The GDPR requires member states to provide independent public authorities to enforce the GDPR, commonly referred to as Data Protection Authorities [hereinafter DPAs]. Where there is cross-border processing of personal data, meaning data about someone in one member state is processed by a processor or controller that is based in another member state, the DPA for the member state where the processor or controller is based, or has its “main establishment,” is the lead DPA.Footnote 45 This is commonly referred to as the “one-stop-shop mechanism” and is the reason that some DPAs, like Ireland’s Data Protection Commission, have an outsized role in GDPR enforcement, as major companies have their European headquarters in those member states.
GDPR provides robust public enforcement rights to civil society, providing a legal opportunity structure for strategic litigation to advance digital rights and influence regulators. People as “data subjects” have the right to make complaints regarding GDPR infringements to DPAs under Article 77. Article 78 provides a right to judicial remedy against DPAs for their decisions or failure to act, which enables strategic litigation by civil society. Under Article 80(1), people have a right to opt-in to being represented by a not-for-profit public interest entity. The representative can be empowered to exercise the rights of complaint or litigation, discussed below.Footnote 46 These public enforcement rights effectively recruit civil society as important actors in the GDPR regime, playing a bottom-up role of raising complaints, and enhancing enforcement.Footnote 47
The GDPR’s public enforcement problems are well documented,Footnote 48 notably blockages resulting from the “one-stop-shop” mechanism and the Irish Data Protection Commission’s inaction. For example, Schrems had to sue the Irish DPA to enforce the GDPR regarding cross-border data transfers, because that DPA failed to correctly deal with his complaint.Footnote 49 More broadly, many DPAs are seen as slow and ineffectual, partly because of under-resourcing, and there is the further frustration that a complaint is largely out of the complainant’s control once it is filed with a DPA, particularly in cross-border matters.Footnote 50 Differences in national procedural laws and DPAs’ practices currently make cross-border matters difficult for civil society to navigate. The European Parliament and Council of Ministers are in a legislative process for new rules on GDPR enforcement in cross-border cases that would address many of these concerns, aiming to harmonize cross-border cooperation through common procedural rules, expedite procedures by setting deadlines for DPAs, and improve access to information.Footnote 51
II. Civil Society’s (Marginal) DSA Enforcement Role
There was much discussion among civil society about the need for the DSA to learn lessons from the problems with GDPR enforcement, focusing almost exclusively on public regulatory enforcement.Footnote 52 National regulators, called “Digital Services Coordinators” [hereinafter DSCs], are responsible for enforcement at a national level.Footnote 53 The DSA reproduces the GDPR’s one-stop-shop by giving exclusive competence to the DSC of the member state where a company has its main establishment, an approach that has been criticized,Footnote 54 although there is a two-month limit for a DSC to respond to other DSCs.Footnote 55 The European Commission has exclusive enforcement powers for Chapter III Section 5 concerning systemic risks of the largest companies.Footnote 56 The time limit and the European Commission’s competence respond to one-stop-shop problems in cross-border GDPR matters.
DSA public enforcement rights for internet users are more limited than those under the GDPR. Users have a right to lodge a complaint with a DSC under Article 53, including rights to be heard and to receive information on the complaint’s status, but there is no explicit right to complain to the Commission regarding systemic risk. Article 86 allows users to mandate an entity to exercise the users’ rights under the DSA.Footnote 57 Unlike the GDPR, the DSA does not include a right to judicial review of DSC decisions, although many member states have existing rights of judicial review for regulators’ decisions, and aspects of the DSA could be judicially enforced based on the principle of direct effect.Footnote 58
III. Civil Society at the Outer Edge of DMA Regulation
The DMA provides even more limited public enforcement avenues for civil society than the DSA or GDPR. The Commission has exclusive competence for public enforcement of the DMA, with obligations to cooperate with national authorities, particularly those enforcing competition rules.Footnote 59 Third parties, which include civil society, may inform the Commission about infringements of the DMA under Article 27. However, the Commission has full discretion over whether to follow up, and the third party has no entitlement in relation to any proceedings that arise from the information they provide, in contrast with antitrust law, which entitles complainants to be closely associated with proceedings.Footnote 60 Strategic litigation by civil society to directly challenge the Commission is not possible due to the rules of standing, as the following section discusses.
IV. Centralized Regulation Precludes Strategic Litigation by Civil Society
The highly constrained rules on standing for judicial review of EU actions, described in the framing Article,Footnote 61 mean that civil society actors seeking to bring strategic litigation are unlikely to satisfy admissibility requirements concerning DSA or DMA acts by the Commission.Footnote 62 Judicial review of public regulators by civil society is an important mechanism for strategic litigation, as the (in)famous Schrems cases have illustrated. The Schrems cases reached the CJEU as indirect actions referred under Article 267 of the Treaty on the Functioning of the European Union [hereinafter TFEU], but national courts do not have jurisdiction to review DSA and DMA acts by the Commission. DSA and DMA Commission acts will be addressed to companies, which will have standing under Article 263 TFEU to challenge such acts. But civil society would only have standing if it could demonstrate direct and individual concern. Such acts will not confer rights or impose obligations on civil society, so there will be no “direct concern,”Footnote 63 and civil society will not have standing to bring strategic litigation against the Commission.Footnote 64
Civil society plays the role of a supplicant in DSA and DMA enforcement against big tech, submitting evidence of infringements to the Commission, but unable to bring strategic litigation. For example, the technology investigation civil society organization AI Forensics recently celebrated the Commission’s launch of investigation proceedings against Meta following AI Forensics’ report on pro-Russian propaganda ads on Meta’s platforms.Footnote 65 Yet, if the Commission falls short of robust DSA enforcement, AI Forensics has no legal recourse. By contrast, there have already been many legal challenges by big tech companies to the Commission’s early regulatory actions under the DSA.Footnote 66 Legal interpretation of these new digital rights instruments looks set to become a site of contestation between commercial interests and the Commission, but civil society will be voiceless in court, particularly because of the restricted approach to third party interventions, which tend not to be allowed before the CJEU.Footnote 67
The structural power dynamics of public enforcement through state-based non-judicial mechanisms, noted above, asymmetrically disempower people from strategic litigation while giving business a central role, particularly in the DSA and DMA compared to the GDPR. These Acts aim to regulate internet giants and uphold the rights of users, but the lack of litigation rights for users in relation to the Commission means that the GDPR may remain the main site for strategic litigation by civil society concerning regulatory enforcement of digital rights. Meanwhile, internet giants will shape regulatory interpretation of the DSA and DMA through litigation.
V. Fractured and Fragmented Regulatory Enforcement of Digital Rights
The increasingly complex regulatory landscape places a burden on civil society to navigate the multiplicity of regulators that have competence to uphold digital rights in the internet audience economy.Footnote 68 This multiplicity of regulators at national and EU levels results from multiple areas of relevant legislation with parallel enforcement regimes relevant to big tech, such as equality laws, which are not discussed in detail in this Article but could overlap with GDPR and DSA protections.Footnote 69 EU equality legislation requires member states to establish equalities bodies.Footnote 70 So, for example, if racist content was amplified by recommender systems based on personal data about users in France, then civil society would need to decide whether to file complaints to the DSC, the Autorité de Régulation de la Communication Audiovisuelle et Numérique; the DPA, the Commission Nationale de l’Informatique et des Libertés; or the equalities body, the Défenseur des Droits. Regulators themselves bring strategic litigation, including against big tech companies as illustrated below in Meta v. Bundeskartellamt, although I have not considered this in detail because my focus is on civil society actors in strategic litigation. However, this role of regulators in litigation is worth bearing in mind, since the allocation of finite resources among multiple different regulators risks leaving regulators under-resourced to litigate against big tech.
DSCs and the Commission, as DSA and DMA regulators, will be added to the already fragmented landscape of regulators enforcing digital rights against big tech, which could lead to inconsistent decisions and incoherence in EU law.Footnote 71 For example, the CJEU Meta v. Bundeskartellamt decision in 2023 considered regulatory overlap between competition law and the GDPR.Footnote 72 The German competition regulator found that Meta abused its dominant market position by collecting personal data on and off Facebook and linking those data to users’ profiles to target advertising. The CJEU concluded that a competition authority has competence to determine whether the GDPR has been infringed where such determination is necessary to establish whether there has been an abuse of dominant market position contrary to competition law.Footnote 73 However, it remains to be seen what might happen if regulators take divergent views of the law, for example if the Irish Data Protection Commission found that Meta’s practices did not breach the GDPR, while competition regulators in other member states found an abuse of dominant market position arising from GDPR breaches. The CJEU anticipated this risk and emphasized the importance of the duty of sincere cooperation between supervisory authorities, requiring cooperation to ensure consistent application of the law.Footnote 74
Private enforcement through strategic litigation, discussed in the following section, can promote legal coherence through judicial decisions that integrate the application of different areas of law to the same facts.
E. Civil Society’s Opportunities to Use Private Enforcement for the Public Interest
Access to justice requires that users themselves have access to remedy, even if public regulators were to perfectly apply users’ digital rights against big tech companies. Article 47 of the EU Charter of Fundamental Rights provides for access to justice and the right to an effective remedy, including that “[e]veryone whose rights and freedoms guaranteed by the law of the Union are violated has the right to an effective remedy before a tribunal”.Footnote 75 In some cases regulatory action will have a similar effect to private enforcement, for example where declaratory or injunctive relief is sought, but not where users seek redress from a company. For example, where a regulator imposes a fine on a company, that fine will go to the state, whereas if people sue the company for the same breach, the claimants will receive any damages awarded.
Private enforcement can also benefit the public interest from political and economics perspectives by mitigating the risk of a lack of political will for public enforcement of the law and reducing the financial burden on regulators.Footnote 76 Amplification of (dis)information on the internet based on processing of personal data to profile users is a deeply political subject, particularly during elections. Enforceable human rights can provide a counterweight to majoritarian viewsFootnote 77 or political influences that could arise that oppose robust public regulation of big tech’s impact on human rights. Private enforcement strategic litigation also produces public goods of court decisions and precedents advancing legal interpretation.Footnote 78
I. Turning Towards Private GDPR Enforcement—Problems of Standing
The GDPR expressly provides robust private enforcement rights in parallel with the public enforcement rights enabling strategic litigation, and private enforcement strategic litigation has been an important part of advancing GDPR digital rights.Footnote 79 For example, although not related to internet users as such, litigation against Uber has used the GDPR to advance workers’ rights, setting new precedents on transparency of data processing and automated decision making by companies.Footnote 80 People have the right to bring litigation directly against a data controller or processor for GDPR infringements and receive compensation.Footnote 81 Civil society organizations are increasingly focused on private enforcement of the GDPR through strategic litigation against big tech, responding in part to the problems of public enforcement discussed above.Footnote 82 However, prior to RAD, discussed below, civil society organizations had limited private enforcement avenues due to lack of standing.Footnote 83
Some consumer protection organizations have found a workaround, using standing under consumer protection law to bring strategic litigation. In 2022, the CJEU found that consumer protection associations had standing to bring claims for unfair commercial practices and consumer protection law infringements that related to GDPR infringements in Meta Platforms Ireland Limited v. Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband e.V. Footnote 84 The German Federal Association of Consumer Organizations [hereinafter vzbv] argued that the information presented to users by games in Facebook’s App Centre failed to validly obtain consent for data processing. Vzbv relied on its standing under the Law against unfair competition and the Law on Injunctions, which both implement EU Directives, not on standing under the GDPR.Footnote 85 This workaround may enable more public interest private enforcement action by consumer protection organizations, framing GDPR breaches as infringements of consumer protection, itself a fundamental right recognized in Article 38 of the Charter. However, there may be a problem in access to justice terms if digital rights infringements can only be remedied when they coincide with consumer protection law.
II. Enforcing Digital Rights as a Whole
Private enforcement strategic litigation enables civil society to argue multiple areas of law in a single case, as the vzbv case against Facebook illustrates, which can promote coherent interpretation of the law. The DMA and DSA do not expressly provide for private enforcement by users with the same clarity as the GDPR,Footnote 86 but many of their substantive provisions could be used as the basis for strategic litigation against internet giants. Breaches of the DSA or DMA also provide a basis for a representative action through procedural mechanisms under RAD,Footnote 87 discussed below. Civil society could bring private enforcement strategic litigation based on relevant legal protections from the DSA or DMA as well as the GDPR, competition or consumer law, relying on these laws as applicable to the facts of a case. Such an approach would foster legal coherence through judicial decisions that address multiple overlapping areas of law, but it requires a high level of expertise across those areas, which in turn demands significant financial resources.
An EU law provision can be enforced in national courts through private enforcement litigation if the provision meets the criteria for direct effect and an implied right of action, which is relevant for the DMA and DSA. The criteria for direct effect have been established by CJEU case law: A provision must be clear and sufficiently precise; unconditional, and not subject to further implementation; and confer a right or provide an obligation that protects the interests of a category of people to which the claimant belongs.Footnote 88 Where a provision of EU law meets these criteria, the claimant has a right of action for litigation in national courts, based on the overarching goal of ensuring the effectiveness of EU law.Footnote 89
Some provisions in the DMA and DSA will meet the criteria for direct effect, but which ones do will remain uncertain unless and until strategic litigation mobilizes judicial decisions. Commentators broadly agree that Articles 5-7 of the DMA have direct effect and imply a right of action, which users can use to enforce obligations that benefit them.Footnote 90 There is disagreement on the scope for private enforcement of DSA provisions, but the substantive prohibitions and protections concerning profiling and dark patterns, identified above, probably have direct effect.Footnote 91 DSA Article 54 expressly provides a right for users to seek compensation for damages resulting from a DSA breach, although the DSA is silent on rights to other judicial remedies.Footnote 92
Jurisdiction will be an additional potential hurdle for strategic litigation by civil society against internet giants, even in cases where a right of action is expressly provided or recognized by the courts. The question of jurisdiction in cross-border cases is complicated, as discussed further below, and depends on the nature of the claim, for example consumer law or tort, and the facts of the case. Article 79(2) of the GDPR expressly provides that data subjects can litigate against companies in the claimant’s home jurisdiction, but there is no such provision in the DMA or DSA.Footnote 93
Claims for damages act as a deterrent against breaching the law, which is in the public interest, but GDPR litigation suggests that courts may be reluctant to award damages for breaches of digital rights. In the context of competition law, the CJEU has pointed to private claims for damages as “an integral part of the system for enforcement” of law, ensuring full effectiveness of legal prohibitions, and discouraging practices that breach EU law.Footnote 94 Financial compensation is technically available for loss or damage suffered by a user due to a breach of obligations in the DSA, under Article 54, or in the DMA, based on direct effect. The GDPR recognizes a right to compensation for non-material damage, which neither the DSA nor the DMA provides for, yet even with this express right claimants have struggled to obtain damages where their GDPR rights have been infringed.Footnote 95 Similarly, the difficulty of obtaining damages in competition law makes DMA enforcement for compensation based on existing principles uncertain at best.Footnote 96 Thus, even though private claims for damages are an important part of EU law enforcement, there is considerable uncertainty whether claims seeking financial remedies against internet giants will be successful, and such claims may be even more difficult under the DSA and DMA.
Private enforcement strategic litigation could advance digital rights against internet giants and facilitate legal coherence across the many applicable areas of law that provide rights and protections for internet users. However, the legal opportunity structures for strategic litigation include considerable barriers, such as uncertainty over private rights of action based on direct effect, the resources and legal expertise needed to incorporate arguments from multiple areas of law, and questions of jurisdiction and remedies. In addition, litigation in different national courts could fracture EU law, which points to the importance of preliminary references to the CJEU and EU Commission contributions in such litigation.Footnote 97
III. A Collective New Hope: RAD
The 2020 Directive on Representative Actions for the Protection of the Collective Interests of Consumers, or RAD, aims to strengthen EU consumer protection law enforcement by requiring that member states have civil procedure mechanisms for representative actions.Footnote 98 A representative action is a kind of mass claim, which is where one case addresses similar legal claims of multiple individuals, for example, the individuals all bought the same faulty model of car. In a representative action, an organization represents the interests of individuals, litigating on their behalf. Such organizations act as intermediaries and, potentially, gatekeepers for access to justice—RAD calls these organizations “qualified entities” [hereinafter QEs].Footnote 99
RAD applies to digital rights legislation and could enhance strategic litigation opportunities, but only for harm to consumers’ interests. RAD requires member states to have a representative action mechanism for both injunctive and redress measures.Footnote 100 Redress measures include compensation, price reduction, or reimbursement of the purchase price.Footnote 101 RAD applies to GDPR, DSA, and DMA, which means that if they are infringed then a representative action can be brought. Yet, as with the use of consumer protection laws to provide standing discussed above, access to justice under RAD may be limited to overlap with consumer rights and not effectively protect all human rights in the scope of the GDPR, DSA, and DMA, particularly if digital rights organizations are unable to become QEs, discussed below.Footnote 102
Although RAD is new, the pre-existing mass claims mechanism in the Netherlands illustrates the potential benefits of such mechanisms for digital rights claims against internet giants, as well as the associated procedural barriers. As explained above, harms by internet giants to users’ digital rights are diffuse, complex, and opaque, and occur at scale, which means that mass claims could provide a particularly important access to justice mechanism because the claims of many users can be combined in a single case. RAD’s recitals highlight that consumers navigate a digitalized marketplace and receive digital services, increasing the need for enforcement of data protection law.Footnote 103 The Netherlands introduced a new regime for representative actions in 2020, known by its Dutch acronym WAMCA, and has since become a hub jurisdiction for representative actions against big tech companies, including litigation against Google; X, or Twitter; Facebook; Apple; and TikTok.Footnote 104 This relatively high number of claims against internet giants in the Netherlands demonstrates that many actors in litigation—lawyers, litigation funders, and claimant organizations—see representative actions as well suited for claims against these companies, but there are notable procedural barriers in representative claims.
The potential procedural complexity of a new representative action regime is also illustrated by examples in the Netherlands. Significant time and resources have been spent establishing admissibility in cases, with difficulties relating to whether the claimant organization is representative and whether the interests in the claim are sufficiently similar.Footnote 105 For example, The Privacy Collective filed a representative action against Oracle and Salesforce concerning their data practices profiling internet users on August 14, 2020, which was found inadmissible by the District Court of Amsterdam on December 29, 2021 on the basis that The Privacy Collective was not sufficiently representative. However, the claim was found admissible by the Amsterdam Court of Appeal on June 18, 2024.Footnote 106 There has not yet been an outcome on the merits of the case. In time the requirements for admissibility may be clarified by judicial decisions, streamlining the process for representative actions, but until then litigation under a new representative action mechanism takes considerable time and resources.
RAD takes a pluralistic approach and does not stipulate procedural requirements: Member states can decide whether to establish an opt-in or opt-out mechanism, the process for individuals opting in or out including deadlines, and thresholds for admissibility of claims.Footnote 107 Opt-out mechanisms mean that people benefit from the litigation if it is successful unless they opt out of participating, whereas an opt-in mechanism requires that people actively opt in to benefit.Footnote 108
1. Standing and Cross-border Jurisdiction under RAD
QEs play a central role under RAD given that individuals can only bring claims through a QE, so which organizations can be a QE and whether they prioritize human rights will shape digital rights strategic litigation. RAD allows member states to decide the criteria for QEs to be qualified to bring domestic representative actions but sets the qualification criteria for QEs to bring cross-border representative actions.Footnote 109 A representative entity needs to have standing as a QE for a case to be admissible. The criteria for QEs to bring cross-border representative actions include twelve months of activity, a statutory purpose that “demonstrates that it has a legitimate interest in protecting consumer interests,” “a non-profit-making character,” and independence. Consumer protection associations are likely to meet these criteria and are specifically referred to in Recital 24, but digital rights organizations may struggle because of the requirement to focus on consumer interests, and therefore be unable to bring strategic litigation under RAD.
There are no jurisdiction provisions in RAD and there is considerable uncertainty over jurisdiction for cross-border actions, which is relevant to strategic litigation against internet giants. One approach would be that of the 2002 decision in Henkel, where an Austrian consumer protection association was able to bring a representative injunctive action in Austria against a business based in Germany.Footnote 110 The CJEU found that action to prevent a trader from using unfair terms in contracts was a matter relating to tort, delict, or quasi-delict and therefore jurisdiction lay where the harm would occur, which was where the affected consumers live. However, in 2018 in Schrems II, the CJEU found that where consumer claims had been assigned by others, the plaintiff could not rely on the jurisdiction afforded to that plaintiff as an individual consumer to bring proceedings in the member state where the plaintiff lives under Regulation No 44/2001.Footnote 111 That Regulation has been recast by Brussels I bis,Footnote 112 but the relevant provisions continue to raise the question of whether cross-border actions under RAD would be treated like Henkel and able to be brought in a QE’s home jurisdiction, or as assigned consumer claims that must be brought in the jurisdiction where the defendant business is based.Footnote 113 There is similar uncertainty as to how the GDPR’s jurisdiction provisions in Article 79 would be interpreted. Even if courts ultimately find that Henkel should be adopted for most claims under RAD, defendant companies will probably raise the questions of cross-border jurisdiction and QEs’ standing as delay tactics regardless of merit.Footnote 114
IV. Uneven Implementation of RAD
The next sections consider RAD mechanisms established in three jurisdictions that illustrate differences in legal systems and approaches—Germany, Portugal, and Ireland.
1. Germany
Germany has implemented RAD via the Law on the Implementation of the Directive on Associations’ Complaints, Verbandsklagenrichtlinienumsetzungsgesetz [hereinafter VRUG], passed on September 29, 2023. Part of VRUG, the Consumer Rights Enforcement Act (Verbraucherrechtedurchsetzungsgesetz), provides for mass claims for damages, and Germany did not previously have a consumer rights mass claim mechanism for damages. For a claim to be admissible, a QE must “plausibly present” or “reasonably demonstrate” (nachvollziehbar darlegen) that at least 50 consumers may be affected.Footnote 115 Consumers must opt-in by registering before the deadline of three weeks after the end of the oral hearing at first instance.Footnote 116 Thus, although it is an opt-in mechanism, individual claimants need not be identified prior to commencing proceedings.
2. Portugal
Portugal already had an opt-out mass claims mechanism under Article 52(3) of the Constitution, which provides for “Popular Action” (ações populares) for citizens to uphold diffuse interests in areas that include public health, consumer rights, and environmental matters.Footnote 117 Portugal has added Decree-Law 114-A/2023, passed on December 5, 2023, implementing RAD. Portugal already regulated popular actions under Law no. 83/95 of 31 August 1995, which continues to govern cases outside the scope of Decree-Law 114-A/2023. Consistent with the existing approach to popular actions, Decree-Law 114-A/2023 provides for an opt-out procedure under which a QE represents all individuals with claims who do not opt out of the case before the deadline of the end of the evidence phase.
3. Ireland
On July 11, 2023, Ireland’s president signed the Representative Actions for the Protection of the Collective Interests of Consumers Act 2023, implementing RAD.Footnote 118 Like Germany, Ireland has established an opt-in process, but with an earlier registration deadline of when a court decides a case is admissible. Section 19(11) requires that QEs provide sufficient information about the class of consumers affected by an alleged infringement for the Court to determine admissibility.
4. Will RAD Open the Floodgates for Access to Remedy?
Under RAD there will be representative action mechanisms in all EU member states, providing a state-based judicial mechanism for strategic litigation to benefit many people. Such mechanisms are particularly appropriate for strategic litigation against internet giants because of their diffuse human rights impact, which happens at scale. The existence of representative action mechanisms can remove the access to justice barrier presented by the absence of mechanisms to bring mass claims as was the case in most member states prior to RAD. However, other access to justice barriers remain, particularly procedural requirements for representative actions and questions of jurisdiction in cross-border claims.
Furthermore, RAD has been unevenly implemented by member states,Footnote 119 potentially reproducing some of the enforcement shortcomings of the GDPR, and different procedures will vary in accessibility. For example, opt-out mechanisms are more appealing to commercial litigation funders and easier to navigate in terms of admissibility.Footnote 120 Some mass claim mechanisms were effectively unusable prior to RAD, and that may be the case for some RAD mechanisms.Footnote 121 Different procedural rules and uncertainty over jurisdiction under RAD are likely to pose barriers to cross-border strategic litigation, as has been the case for GDPR enforcement. In addition, RAD takes a consumer law framing to private enforcement through mass claims, which may undermine its efficacy for strategic litigation on digital rights where infringements do not produce an easily recognizable consumer harm.
F. Conclusion
The EU aimed to fill a legal lacuna in regulation of the internet audience economy with the DSA and DMA, supplementing existing laws such as GDPR and consumer protection, yet their enforcement architecture may undermine realization of their goal of strengthening internet users’ digital rights. The new laws’ public enforcement architecture risks legal incoherence because of the increasing number of regulators with overlapping public enforcement mandates that could produce inconsistent or contradictory decisions on internet giants. European Commission competence under the DMA and the DSA could counteract fragmentation, but leaves civil society disempowered and companies empowered to bring strategic litigation that will influence the public enforcement of these new laws.
Private enforcement strategic litigation could enable legal coherence through judicial decisions that incorporate different areas of law, while empowering civil society to influence development of digital rights. Civil society can incorporate rights and protections from multiple areas of law in private enforcement strategic litigation, advancing digital rights while promoting legal coherence across different areas of law. Strategic litigation against internet giants based on multiple areas of law could be stronger and have more impact—some legal points may succeed even though others fail, and subsequent enforcement of multiple legal regimes following a precedent set by strategic litigation would increase the case’s impact. Yet, such cases require greater legal expertise to cover different areas of law, and any private strategic litigation against internet giants involves procedural difficulties such as cross-border jurisdiction or standing.
The possibility of private enforcement strategic litigation against internet giants in different member states itself could result in fragmentation between jurisdictions, and uneven implementation of RAD may hamper cross-border claims. Preliminary references to the CJEU and intervention by the European Commission could avoid fragmentation between member states on digital rights. The CJEU has adjudicated on the relationship between different mechanisms for remedy,Footnote 122 and the fundamental EU law principles of effectiveness and equivalence might lead to some consistency among enforcement mechanisms,Footnote 123 but only if cases reach the CJEU. Civil society will need considerable resources to gather evidence on internet giants, access wide ranging legal expertise, and navigate procedural complexity for strategic litigation before national courts and the CJEU to advance digital rights in the internet audience economy.
Acknowledgements
This Article benefitted significantly from the workshop organized by Pola Cebulak, Marta Morvillo, and Stefan Salomon, and their comments on earlier drafts, as well as from the discussant for my draft, Francesca Palmiotto. I also wish to thank colleagues who have provided thoughts and comments on this Article and the ideas therein, including Emmanuelle Debouverie, Max Mackay, Jonny McQuitty, Ursula Pachl, Thomas Streinz, Aditi Tripathi, and Peter Wells. My analysis is informed by ten years of working in non-profits and digital rights, most recently as Director of Litigation and Strategy at the global philanthropic organization Luminate. This Article draws on my experience in the field and many conversations with partners and collaborators across NGOs, academic experts, activists, grant-makers, regulators, lawyers, and commercial litigation funders.
Competing Interests
The author declares none.
Funding Statement
There is no specific funding associated with this article.