
Introduction to Part IV

What Additional Challenges Do Vulnerable Groups Face in the Digital Realm?

from Part IV - Challenges Faced by Vulnerable Groups

Published online by Cambridge University Press:  24 October 2025

Tiina Pajuste
Affiliation: Tallinn University

Type: Chapter
Human Rights in the Digital Domain: Core Questions, pp. 355–360
Publisher: Cambridge University Press
Print publication year: 2025
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial licence (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/

Introduction to Part IV: What Additional Challenges Do Vulnerable Groups Face in the Digital Realm?

The fourth and final part of this book looks at vulnerable groups in the digital context. The digital revolution has reshaped societies, economies, and human interactions. However, its benefits have not been equitably distributed. Vulnerable groups – those marginalised owing to socio-economic status, age, gender, ethnicity, language, geographic location, or other circumstances – often find themselves on the periphery of these advancements.Footnote 1 In many cases, the rapid adoption of digital technologies has exacerbated pre-existing inequalities and created new barriers to the enjoyment of fundamental rights. The United Nations Secretary-General has drawn attention to the fact that ‘technologies can be, and increasingly are, used to violate and erode human rights, deepen inequalities and exacerbate existing discrimination, especially of people who are already vulnerable or left behind’.Footnote 2 This part explores some of these human rights challenges for vulnerable groups, aiming to provide different perspectives in relation to the question: What additional challenges do vulnerable groups face in the digital realm? The focus areas – the digital divide, children’s rights, linguistic inclusivity in education, and the realities of vulnerable workers in the context of digital transformation – were selected because they address very different but interconnected dimensions of vulnerability in the digital environment, and to illustrate the broad range of issues that can arise in relation to vulnerable groups. Together, the chapters examine how digitalisation interacts with broader social, cultural, and economic systems, often reinforcing existing disparities, but they also draw attention to the transformative potential of inclusive and rights-based digital policies and practices.

Chapter 17. The Digital Divide: Reinforcing Vulnerabilities

Chapter 17 explores the systemic inequalities exacerbated by unequal access to digital technologies, known as the digital divide. This divide disproportionately affects vulnerable populations such as women, older people, disabled people, rural communities, and low-income individuals, limiting their access to services, education, and other opportunities. The chapter looks at the root causes of the digital divide, including socio-economic disparities, geographic isolation, cultural and language barriers, technological gaps, and insufficient policy interventions. It emphasises the human rights implications of this divide, such as restricted access to education, healthcare, and democratic participation, and examines international and regional policy responses, noting their shortcomings in addressing the issue comprehensively.

In response to the core question of this part – What additional challenges do vulnerable groups face in the digital realm? – the chapter identifies several challenges unique to vulnerable groups in the digital realm: (a) greater difficulty in gaining access to digital devices, services, and the internet; (b) limited digital skills preventing effective use of digital technologies (fewer educational and training opportunities tailored to these groups’ needs); (c) amplification of the digital divide by existing inequalities, such as gender discrimination, generational isolation, and poverty, perpetuating cycles of disadvantage (e.g., women face online harassment and cultural restrictions, while older adults may be excluded from digital healthcare and public services); and (d) the frequent neglect of vulnerable groups’ needs by market-driven digital initiatives and inadequate regulatory frameworks.

Chapter 18. How the EU Safeguards Children’s Rights in the Digital Environment: An Exploratory Analysis of the EU Digital Services Act and the Artificial Intelligence Act

Following the broad examination of the digital divide, Chapter 18 narrows the focus to a specific vulnerable group – children – and assesses whether and how legislative efforts seek to protect them in digital environments. Eva Lievens and Valerie Verdoodt scrutinise the EU’s Digital Services Act (DSA) and Artificial Intelligence Act (AIA) from the perspective of children’s rights in the digital environment. They explore the potential of these legislative frameworks to safeguard children from harm while enabling their safe engagement with digital platforms and artificial intelligence (AI) systems. The authors emphasise the unique vulnerabilities of children in the digital realm and assess how effectively the acts consider these vulnerabilities. They answer this part’s core question by noting the following additional challenges faced by children in the digital realm: (a) exploitation of their vulnerability (AI and platform design utilise manipulative practices such as addictive design, autoplay features, and the promotion of harmful content); (b) difficulty in navigating platforms and AI systems safely (as usually there are no child-specific interfaces or adequate transparency about their operation); and (c) being less likely to benefit meaningfully from digital engagement (as there is an over-emphasis on risk and a neglect of the positive potential).

Chapter 19. Right to Education in Regional or Minority Languages: Invasions, COVID-19 Pandemic, and Other Developments

Chapter 19 explores the intersection of linguistic rights, digital access, and education during the COVID-19 pandemic. It focuses on how the rapid digitalisation of education disproportionately affected students from regional or minority language communities. Vesna Crnić-Grotić demonstrates how national and regional policies frequently failed to account for linguistic diversity when implementing emergency online education measures. This lack of linguistic inclusivity in digital education threatened these communities’ right to education, as their academic progress was hindered. In relation to this part’s overarching question, the chapter notes the following additional challenges that students from minority language communities face in the digital realm: (a) less access to the internet and the digital tools necessary for participating in online education; (b) linguistic marginalisation (the dominance of major languages in digital content and platforms excludes regional and minority language speakers, denying them equitable access to education and information online); and (c) systemic neglect of linguistic diversity (and thus of the unique needs of minority language groups) in educational policy and planning.

Chapter 20. Technological Acceleration and the Precarisation of Work: Reflections on Social Justice, the Right to Life, and Environmental Education

Chapter 20 examines how digital transformations have reshaped labour dynamics in Brazil, exacerbating inequalities for vulnerable workers. Raizza da Costa Lopes, Samuel Lopes Pinheiro, and Florent Pasquier show how the ‘gig economy’ and the ‘uberisation’ of work are driving a shift towards precarious working conditions, including financial instability, lack of legal protections, and opaque algorithmic control. Additionally, the chapter connects labour issues with environmental education, emphasising its role in addressing socio-economic and ecological inequalities. It advocates for integrating human rights principles, particularly the right to life and social justice, into labour and environmental policies. The chapter answers the core question of this part of the volume by drawing attention to the following additional challenges for vulnerable workers in the digital realm: (a) opaque algorithmic management, which controls their tasks and incomes while offering little transparency or recourse (leading to insecurity and potential exploitation); (b) unpredictable earnings owing to the gig economy model (often resulting in problems covering basic living expenses); (c) lack of labour protection (platform work often comes without health insurance, pensions, or even adequate safety standards); and (d) stress owing to the demand for constant availability and productivity (which undermines their well-being and work–life balance).

Shared Themes and Interconnections

The four chapters collectively highlight the complex challenges vulnerable groups face in the digital realm, with each chapter addressing a specific issue regarding inequality and vulnerability. Despite the different focus areas, there are commonalities in the problems faced by vulnerable groups in those specific contexts. The main parallels follow, framed around the overarching themes that connect the chapters.

A. Structural Inequalities in Digital Access

Each chapter emphasises how structural inequalities in access and design deepen exclusion, reinforcing systemic barriers for vulnerable populations in different contexts.

  • Chapter 17 on the digital divide explores how foundational barriers, such as a lack of internet connectivity, devices, and digital literacy, disproportionately exclude vulnerable groups.

  • Chapter 18 on children’s rights shows how children are uniquely affected by systemic inequalities, such as inadequate privacy safeguards, harmful content, and a lack of user-friendly design on digital platforms.

  • Chapter 19 on minority language education demonstrates how digital education platforms (mostly designed for dominant languages) marginalise minority language speakers, especially during crises such as the COVID-19 pandemic.

  • Chapter 20 on the precarisation of work shows how gig workers, particularly in the Global South, are excluded from equitable digital participation owing to platform-centric labour models that prioritise profits over worker well-being.

B. Insufficient Policy and Governance Frameworks

The absence or inadequacy of regulatory frameworks is noted across all chapters. Gaps in regulation and policymaking can have the effect of perpetuating inequalities or leaving vulnerable groups at the mercy of exploitative systems (which is why the authors advocate for improved lawmaking and policymaking in the area).

  • Chapter 17 critiques the lack of comprehensive policies regarding digital inclusion, leaving the needs of vulnerable populations insufficiently addressed.

  • Chapter 18 notes how regulatory frameworks, such as the DSA and AIA, while promising, are often limited in scope and lack robust enforcement mechanisms for safeguarding children’s rights.

  • Chapter 19 draws attention to how policy has failed to ensure linguistic inclusivity during the COVID-19 pandemic, pointing out the lack of foresight and planning for marginalised communities.

  • Chapter 20 discusses resistance from digital platforms to regulatory efforts, illustrating how weak governance leaves gig workers without basic labour protections.

C. Amplification of Vulnerabilities during Crises

Most chapters in this part also look at the impact of the COVID-19 pandemic on vulnerable groups. The authors show how crises can act as accelerants, amplifying pre-existing vulnerabilities and exposing gaps in infrastructure, policy, and protection mechanisms across sectors.

  • Chapter 17 notes how the pandemic deepened existing digital disparities, citing the example of telemedicine during COVID-19, which excluded older people lacking digital skills.

  • Chapter 19 focuses on the pandemic’s impact on educational access, with minority language speakers disproportionately left behind owing to digital unpreparedness.

  • Chapter 20 shows how global crises, such as COVID-19, exacerbate the precarisation of platform workers, particularly in the Global South.

As a fourth common theme, the authors emphasise the transformative potential of inclusive, rights-based interventions to counter digital exclusion and exploitation.

By progressing from structural issues to specific case studies and concluding with a global perspective, this part of the book seeks to respond to the question ‘What additional challenges do vulnerable groups face in the digital realm?’ and to provide a nuanced understanding of how the digital domain interacts with the rights of vulnerable groups. Digitalisation, while offering opportunities, can disproportionately harm vulnerable groups by deepening systemic inequalities (or even exposing them to exploitation – see Chapter 21) and by failing to adequately safeguard their rights. The authors draw attention to different aspects and contexts of this problem and emphasise the need for inclusive policies and practices that prioritise equity and human dignity in the digital age.

17 The Digital Divide: Reinforcing Vulnerabilities

17.1 Introduction

Digital technologies have permeated almost every aspect of modern life. The potential for such technologies to enhance the enjoyment of human rights is coupled with risks of exclusion, surveillance, and growing inequality, particularly for vulnerable populations. We need to ensure that everyone benefits from digitalisation, even those who currently lack the skills or the means necessary for it. As the European Parliament has highlighted, digital technologies ‘can either help create a more inclusive society and reduce inequities, or they can amplify existing inequalities and create new forms of discrimination’.Footnote 1 It is often the most vulnerable sectors of society that are not benefiting from digitalisation (as they tend to have fewer resources and more obstacles to access). Accordingly, their needs and human rights require special attention in this process.

The gap between demographics and regions that have access to digital technology and those that do not is called the ‘digital divide’. There is nothing new about the digital divide; it began receiving attention in the mid-1990s.Footnote 2 Despite long-standing awareness of the problem, it persists. As the United Nations (UN) General Assembly noted in 2016: ‘Despite the previous decade’s achievements in information and communications technology connectivity, […] many forms of digital divides remain, both between and within countries and between women and men. […] [D]ivides are often closely linked to education levels and existing inequalities, and we recognize that further divides can emerge in the future, slowing sustainable development’.Footnote 3

Such divides are problematic because they leave a significant number of individuals without access to the plethora of benefits that digitalisation has brought (such as faster bureaucracy, round-the-clock access to information, and new ways to express oneself). The European Parliament has recognised that digital divides ‘may accentuate social differences by reducing some workers’ opportunities to obtain quality employment’, and has acknowledged the especially problematic position of vulnerable groups in relation to the digital divide by noting the potential ‘negative impact of the digitalisation of public and private services on workers and people such as older people and persons with disabilities, low-income, socially disadvantaged or unemployed citizens, migrants and refugees or people in rural and remote areas’.Footnote 4 As M. N. Cooper has highlighted, those on the right side of the digital divide ‘find themselves better trained, better informed, and better able to participate in democracy’, whereas the ‘disconnected become disadvantaged and disenfranchised’, with exclusion manifesting in all aspects of society.Footnote 5

Vulnerable groups are disproportionately impacted by the digital divide, making it both a symptom and a driver of systemic inequities.Footnote 6 In the words of the UN Secretary-General, ‘[d]igital divides reflect and amplify existing social, cultural and economic inequalities’.Footnote 7 The digital divide can thus perpetuate a cycle of disadvantage for vulnerable groups: it not only restricts access to critical services such as education, healthcare, and employment, but also undermines fundamental human rights, including the rights to equality, dignity, and participation in societal decision-making. Bridging the divide and ensuring equal access to digital technology is therefore crucial for promoting equity and social inclusion in our increasingly digital world. Some of the human rights implications of the digital divide are studied in this chapter to illustrate that it is not just a practical problem but also a legal one.

This chapter focuses on the digital divide in relation to women and older people as sample groups because they are uniquely positioned at the intersection of systemic exclusion and under-representation, making them illustrative of how the digital divide magnifies inequalities and contributes to human rights violations. By examining these groups, the chapter seeks to illustrate the barriers that vulnerable groups are up against, draw attention to the human rights issues that they face, and demonstrate the need for tailored solutions within broader efforts to address digital inequities. The chapter also examines international action regarding the digital divide and asks whether additional steps need to be taken to respond adequately to the multitude of challenges that the digital divide presents.

17.2 The Digital Divide and Its Contributing Factors

There are numerous definitions of the digital divide. The European Union (EU) has referenced the Organisation for Economic Co-operation and Development (OECD) definition,Footnote 8 which refers to ‘the gap between individuals, households, businesses and geographic areas at different socio-economic levels with regard both to their opportunities to access information and communication technologies (ICTs) and to their use of the internet for a wide variety of activities’.Footnote 9 When the term ‘digital divide’ first emerged in the late twentieth century, it was used to describe the gap between people who had access to mobile phones and those who did not. Over time, its meaning has broadened to encompass the technical and financial ability to use technology and access the internet. As technology evolves, the concept of the digital divide continues to change.Footnote 10

The digital divide is influenced by a range of interconnected factors that determine access to and use of technology for individuals and communities. Such factors include socio-economic disparities, geographic isolation, cultural and language differences, technological barriers, and gaps in law and policy. Each of these elements plays a role in determining who can benefit from the opportunities provided by digital technologies and who remains excluded. Understanding these underlying causes helps design effective strategies to bridge the divide and promote equitable digital inclusion.

First, socio-economic inequalities in education, employment status, and income levels directly influence access to digital technologies and the internet. Low-income households often cannot afford devices or reliable internet connections. And individuals with a limited educational background may lack the skills to use digital tools effectively.Footnote 11 This disparity is evident across various demographics and regions. For instance, in India, the digital divide is heavily influenced by income and educational attainment, particularly among disadvantaged caste groups.Footnote 12 Second, remote or rural regions often suffer from a lack of investment in broadband and mobile networks (owing to higher costs and logistical challenges). Geographic isolation hinders digital accessibility, creating a stark gap compared with urbanised areas that have robust technological infrastructure. A similar tendency persists at the international level, with developing countries lagging behind developed nations in technology uptake. Third, cultural norms and language differences often limit the inclusivity of online spaces. Many websites and digital tools are predominantly available in a small number of global languages, creating obstacles for non-native speakers or members of linguistic minorities.Footnote 13 Cultural attitudes towards technology, such as mistrust or unfamiliarity, can further deepen this impact. Fourth, technological and infrastructure barriers are among the more obvious causes of the digital divide. The lack of broadband networks and high device costs clearly restrict access to digital technologies. The quality and speed of available internet also vary, affecting users’ ability to engage fully with digital services. And fifth, regulatory frameworks and government policies can play a critical role in determining the availability and affordability of digital infrastructure. Moreover, inadequate support for public digital initiatives or over-reliance on market-driven models can exclude marginalised populations. It has also been argued that groups or entities can sometimes use ‘political institutions to enact policies that block the spread of the Internet’.Footnote 14

Summing up, these challenges perpetuate unequal access to technology and its benefits. Recognising the interconnected nature of these factors is essential for fostering digital equity and ensuring that the benefits of technology are accessible to all.

17.3 Gender Gap

The digital divide manifests differently across various groups, highlighting distinct patterns of exclusion. Among these, the gender gap and the age gap are particularly significant, as they reflect systemic barriers rooted in social, cultural, and economic inequalities. These gaps not only reveal the unique challenges faced by specific populations but also illustrate the broader structural issues that perpetuate digital inequities worldwide.

One of the most widely recognised digital divides is the gender digital divide (gender gap). According to the International Telecommunication Union (ITU), 70 per cent of men worldwide use the internet, compared with 65 per cent of women, meaning that, globally, there were 244 million more men than women using the internet in 2023.Footnote 15 In low-income countries, only 20 per cent of women have access to the internet (compared with 35 per cent of men).Footnote 16 Yet, as UN Women has emphasised, ‘digital inclusion and literacy are critical to the well-being and success of women and girls in society, including their ability to take an informed part in electoral processes and exercise their right to vote and to stand for election’.Footnote 17

17.3.1 Specific Issues Faced by Women and Their Human Rights Implications

The gender digital divide highlights the problems women and girls can encounter in accessing and using digital technologies, particularly in developing countries.Footnote 18 While digital tools offer opportunities for education, economic empowerment, and social engagement, systemic barriers rooted in cultural norms, economic inequalities, and safety concerns disproportionately hinder the digital inclusion of women. This section explores the distinct obstacles women face in their digital journey and looks at the impact of the gender gap on women in relation to various aspects of their lives, including employment opportunities, education, and social inclusion. The effects are placed in the context of human rights to pinpoint the potential human rights infringements arising from the gender gap.

Women in many regions face significant barriers to accessing digital technologies owing to affordability issues and limited infrastructure, particularly in low-income and rural areas. The lack of affordable devices and reliable internet disproportionately affects women, as they are more likely to have lower incomes and fewer economic opportunities.Footnote 19

A particularly difficult aspect to grapple with is the existence of entrenched gender norms and societal expectations, which may discourage women from using digital technologies or pursuing education in digital skills. Cultural biases can restrict women’s access to public spaces such as internet cafés or limit their ownership of devices.Footnote 20 In Jordan, for example, societal attitudes, reinforced by cultural mores and educational institutions, leave even university-educated men uneasy about allowing women equal access to the internet and computers.Footnote 21

Moreover, in many (especially developing) countries, women tend to have fewer opportunities for formal education and training, which leaves them lacking digital literacy and skills. This limits their ability to use technology effectively and benefit from its advantages. For example, in India, women’s digital competencies are significantly lower than men’s, influenced by household dynamics, caste, and limited digital exposure.Footnote 22 And when women do access the internet, they often have a more negative experience than men owing to online harassment and abuse, which deters them from engaging with digital platforms. Online abuse infringes upon their right to privacy and security, as laid down in human rights instruments such as the Universal Declaration of Human Rights (UDHR): ‘[n]o one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation’.Footnote 23 Women are often targeted with gender-based violence online, such as cyberstalking, threats, and harassment, which creates a hostile environment that limits their digital participation. Many studies have concluded that women are significantly more likely than men to experience cyberstalking and gender-based abuse.Footnote 24

The gender gap brings with it many detrimental effects on the everyday lives and opportunities of women and girls. The digital divide negatively affects women’s educational prospects, impacting women’s and girls’ right to education and exacerbating existing gender disparities in learning opportunities. Digital tools provide critical access to educational resources, online courses, and skills-building programmes, yet many girls, particularly in low-income and rural areas, are excluded owing to economic, infrastructural, and cultural barriers.Footnote 25 This exclusion restricts their ability to gain the competencies necessary for academic and professional success. This educational gap further aggravates the employment divide, as women are less prepared for the digital economy.Footnote 26 Women are less likely to work in technology-related fields,Footnote 27 and are often excluded from higher-paying jobs that require technological proficiency, perpetuating economic disparities between genders and impacting the human right to work. As proclaimed in the UDHR, ‘[e]veryone has the right to work, to free choice of employment, to just and favourable conditions of work and to protection against unemployment’.Footnote 28 This free choice of employment is restricted if women are not given the opportunity to develop the skills needed to work in ICT or in higher-paying jobs that require technological proficiency. Moreover, the lack of digital literacy can hinder women’s ability to participate in lifelong learning opportunities, which are crucial for adapting to the rapidly changing job market.

The gender digital divide extends beyond individual impacts to affect women’s roles in their communities. Women with limited access to ICTs are less able to engage in social, community, and civic activities that are increasingly mediated through digital platforms. This exclusion can lead to a diminished voice in community decision-making processes and reduced social capital. This, in turn, impacts their human right to participate in public affairs.Footnote 29 In contrast, women who do have access to ICTs can leverage these tools for community building and advocacy, underscoring the stark contrast in opportunities based on digital access. Women with limited or no digital access often also lack confidence in their ability to learn ICT skills and have a perception that technology is not meant for them, which further limits their ability to engage with digital tools, thereby reinforcing the gender gap.Footnote 30

Consequently, the gender gap also significantly restricts women’s freedom of expression, limiting their ability to participate in public discourse, advocate for their rights, and engage in conversations at different levels (local, regional, and global). Digital platforms offer spaces for women to voice opinions, share experiences, and connect with wider communities. When women cannot access such platforms, their marginalisation is perpetuated and traditional power dynamics are reinforced. Restricted access to technology means that their perspectives remain under-represented in both local and global dialogues. Overall, the gender digital divide significantly undermines the right to non-discrimination and equality,Footnote 31 as it perpetuates and exacerbates systemic gender disparities in access to opportunities and resources.Footnote 32

To avoid such overarching negative impacts on women and ensure the protection of their core human rights, it is essential to find ways to bridge the gender digital divide in order to foster equality, empower women, and ensure their full participation in the digital society and economy.

17.3.2 International Action in Relation to the Gender Gap

As the gender digital divide remains a critical barrier to achieving gender equality in the digital age, several international organisations have adopted declarations, policies, or programmes to address the issue. These organisations have aimed to bridge gaps in digital access, skills, and representation, but the approach has been haphazard and inconsistent, and there has been no systematic engagement with the gender gap in high-level policy documents. This section outlines the most significant efforts by international organisations that have addressed the problem, at least to some extent.

The UN has been one of the organisations drawing attention to the gender digital divide. Already in 1995, the Fourth World Conference on Women in Beijing recognised the transformative potential of ICTs for women’s empowerment. The declaration identified ‘Women and the Media’ as a critical area, calling for equitable access ‘to expression and decision-making in and through the media and new technologies of communication’ and the promotion of ‘balanced and non-stereotyped portrayal of women in media’.Footnote 33 A prominent step, twenty years later, was the inclusion of target 5.b, ‘[e]nhanc[ing] the use of enabling technology, in particular information and communications technology, to promote the empowerment of women’, in the Sustainable Development Goals.Footnote 34 Unfortunately, the only indicator chosen for assessing the achievement of this target was the ‘[p]roportion of individuals who own a mobile telephone, by sex’, which has limited follow-up activity and analysis to aspects connected to this narrow indicator.

A year later, the General Assembly called for ‘immediate measures to achieve gender equality in Internet users by 2020, especially by significantly enhancing women’s and girls’ education and participation in information and communications technologies, as users, content creators, employees, entrepreneurs, innovators and leaders’, and reaffirmed its ‘commitment to ensure women’s full participation in decision-making processes related to information and communications technologies’.Footnote 35 Clearly, the goal was not reached as no concrete large-scale action followed that document.

An important development in the digital sphere was the 2020 UN Secretary-General’s Roadmap for Digital Cooperation.Footnote 36 Its thematic areas include digital human rights, achieving universal connectivity, and digital inclusion. The implementation of the roadmap is managed and coordinated by the Office of the Secretary-General’s Envoy on Technology, established at the beginning of 2021. For women’s rights, the most important aspects of the roadmap are digital inclusion (as it emphasises the need to address the gender digital divide) and cyber-violence.

The Commission on the Status of Women (CSW; a functional commission of the UN Economic and Social Council) is the main global inter-governmental body exclusively dedicated to the ‘promotion of gender equality, the rights and the empowerment of women’.Footnote 37 The CSW’s annual sessions regularly include discussions on ICTs and digital equity. The 2023 priority theme was ‘innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls’. The agreed conclusions of that session urge governments at all levels to ‘[p]rioritiz[e] digital equity to close the gender digital divide’ and to ‘[l]everag[e] financing for inclusive digital transformation and innovation towards achieving gender equality and the empowerment of all women and girls’.Footnote 38 The document includes many well-founded recommendations and declarations of the importance of the issues, but fails to include specific measurable targets that would help ensure implementation. As the document contains no binding commitments, it is unlikely to lead to tangible action in the short term, but it could serve as guidance for states genuinely invested in tackling the issue.

The UN also supports gender equality in ICT through its specialised agency, the ITU. Since 1998, the ITU has adopted several resolutions to promote gender equality and its mainstreaming. The first resolution concerned gender and telecommunications policy in developing countries.Footnote 39 In 2018, the ITU adopted a resolution on gender mainstreaming in the ITU and the promotion of gender equality and the empowerment of women through telecommunications/ICT.Footnote 40 A year earlier, in 2017, the ITU Working Group on the Digital Gender Divide had adopted ‘Recommendations for action: bridging the gender gap in Internet and broadband access and use’, but follow-up activities have been very limited (just two progress reports, from 2017 and 2018).Footnote 41

The EU’s efforts in relation to the gender gap are mostly limited to the last ten years. The main relevant strategy is the EU’s Women in Digital policy, which has the aim of ensuring that ‘everyone, regardless of gender, gets a fair chance to benefit from and contribute to the digital age’.Footnote 42 In 2019, twenty-six EU countries, along with Norway and the UK, signed the Women in Digital Declaration to achieve equality in tech.Footnote 43 The signatories of the declaration agreed to take action to create a national strategy to encourage women’s participation in digitalisation, stimulate companies to combat gender discrimination at work, and advance a gender-balanced composition of boards, committees, and bodies dealing with digital matters.Footnote 44

The 2022 European Declaration on Digital Rights and Principles addresses the gender digital divide by emphasising inclusivity and gender balance as necessary elements of the digital transformation. The Declaration has the ambitious aim of ‘promot[ing] a European way for the digital transformation, putting people at the centre, built on European values and EU fundamental rights, reaffirming universal human rights, and benefiting all individuals, businesses, and society as a whole’.Footnote 45 Chapter 2, on solidarity and inclusion, proclaims that ‘technology should be used to unite, and not divide, people’ and that the ‘digital transformation should contribute to a fair and inclusive society and economy in the EU’. The EU committed to ‘a digital transformation that leaves nobody behind’, one that ‘should benefit everyone, achieve gender balance […]’. And in Chapter 4, the EU committed to ‘promoting high-quality digital education and training, including with a view to bridging the digital gender divide’. This broad language (e.g., ‘achieve gender balance’ and ‘promoting […] digital education’) lacks measurable targets and enforcement mechanisms to ensure accountability.

The EU 2022 Digital Compass & Digital Decade Policy Programme 2030 (DDPP) is unique in that it sets concrete targets for 2030 in areas such as digital skills, digital infrastructure, and the digitalisation of public services.Footnote 46 It also emphasises the importance of equal opportunities for women in the ICT sector and sets an ambitious target to increase the number of female ICT professionals, which involves increasing the number of girls and women studying ICT, both at school and at university. Importantly, EU Member States have to submit national strategic roadmaps, published online, setting out their actions to achieve all DDPP targets, and must report progress to the Commission, which should add pressure on states to act to meet the targets. This type of approach should also be adopted in relation to other aspects of the gender gap.

Other regional organisations are also addressing some facets of the gender gap in their policies. The Digital Transformation Strategy for Africa (2020–30) recommends promoting ‘gender-inclusive education frameworks and policies and boosting relevant education opportunities and digital skills development for women and girls in STEAM-subjects to narrow the gender digital divide’.Footnote 47 And at the fifteenth session of the Regional Conference on Women in Latin America and the Caribbean, the member states of the Economic Commission for Latin America and the Caribbean (ECLAC) signed the Buenos Aires Commitment, underscoring the need to support women’s participation in Science, Technology, Engineering, and Mathematics and to eliminate occupational segregation.Footnote 48 While these regional initiatives recognise the importance of addressing the gender digital divide through education and occupational inclusion, they fall short of creating systemic change. The policies lack implementation plans, mechanisms, and funds, and do not tackle deeply rooted socio-economic and cultural barriers.

In addition to policy documents, there have been several global initiatives targeted at closing the gender digital divide, including, among others, International Girls in ICT Day (ITU), the Global Partnership for Gender Equality in the Digital Age (the EQUALS initiative), the EQUALS in Tech Awards (ITU, the UN Entity for Gender Equality and the Empowerment of Women, and the UN Conference on Trade and Development), Gender-Sensitive Indicators for Media (UNESCO), Women on the Homepage (UNESCO), the Global Survey on Gender and Media (UNESCO), the Broadband Commission Working Group on Broadband and Gender, and the Best Practice Forum on Gender and Access of the Internet Governance Forum.Footnote 49

Despite various regional and international policy commitments and global initiatives, the gender digital divide persists (albeit slowly decreasing). While efforts by organisations such as the UN highlight the importance of integrating gender equality into the digital agenda, the lack of binding commitments and systematic implementation frameworks limits progress. There is a need for cohesive, measurable, and actionable strategies to ensure that the digital transformation benefits everyone, regardless of gender, and that the human rights of women and girls are not negatively impacted. The gender gap undermines their ability to fully exercise their rights to education, work, freedom of expression, and access to information. This not only limits individual potential but also hampers progress towards gender equality more broadly. The gender digital divide exacerbates existing vulnerabilities by reinforcing systemic inequalities that disproportionately affect women, particularly those in marginalised communities. Limited access to digital tools and skills excludes women from opportunities in education, employment, and civic participation, deepening poverty and social exclusion. The lack of representation and participation in the digital economy and technology design also preserves biases, further entrenching gender inequality. To avoid perpetuating such issues, promises on paper need to be translated into concrete action.

17.4 Age Gap

The digital divide disproportionately affects older populations. According to the ITU, younger generations are significantly more likely to use the internet than older ones. Globally, internet usage rates are highest among individuals aged fifteen to twenty-four, reaching over 75 per cent, while fewer than 55 per cent of people aged sixty-five and older are online.Footnote 50 And only around one-third of those aged fifty-five to seventy-four, as well as of the retired and the economically inactive, have at least basic digital skills.Footnote 51 This age-based digital divide (the grey digital divide, or age gap) limits older adults’ access to vital services, social connections, and opportunities for lifelong learning. As societies digitise, the inability to engage with technology not only marginalises older individuals but also raises human rights concerns. As the EU Agency for Fundamental Rights (FRA) has noted, ‘[o]lder persons, a heterogeneous group with diverse socio-economic backgrounds, are among those whose enjoyment of fundamental rights might be at risk from digitalisation’.Footnote 52 Their rights to participate in civic and public life, to work, to health, and to education can all be impacted by digital exclusion. The age gap exacerbates existing inequalities, as those excluded from digital connectivity face challenges in accessing services, healthcare, and opportunities for social inclusion.

17.4.1 Specific Issues Faced by Older People and Their Human Rights Implications

The age-based digital divide highlights the significant barriers older generations face in accessing and effectively using digital technologies.Footnote 53 The rapid digitalisation of services and social interaction is leaving many older people behind, owing to obstacles such as lack of digital literacy, limited access to devices or internet connectivity, and design biases in technology that cater predominantly to younger users.Footnote 54 Moreover, technophobia and cyberphobia can pose significant self-imposed barriers to engaging with ICT.Footnote 55 This section examines the specific issues stemming from the age gap in digital inclusion and looks at their human rights implications.

The FRA emphasises that only one in four people aged sixty-five to seventy-four in the EU-27 have at least basic digital skills, which, along with up-to-date technological tools, are essential for participating in public life.Footnote 56 The right of access to public services is part of the right to good administration, protected, for example, under Article 41 of the EU Charter of Fundamental Rights.Footnote 57 This includes equal access to public services that are in the process of being digitalised.Footnote 58 As governments and businesses shift services online, older individuals without digital access often struggle to apply for benefits, schedule government appointments, or use banking services. This creates a dependency on others or exclusion from essential services.

The lack of digital skills can also prevent older individuals from accessing the information necessary for informed decision-making. Many voting resources and election updates are primarily available online, so older adults without digital skills or internet access struggle to find essential information about candidates, polling locations, or registration deadlines. This limits their ability to make informed decisions or participate fully in democratic processes. Voter registration, government consultations, and even voting are increasingly moving online, which reduces the ability of older people without digital skills or access to participate in such processes and may end up infringing their right to participate in civic and public life. Much of today’s political mobilisation and discussion occurs in digital spaces, and because older individuals with limited digital access are often excluded from these forums, their perspectives end up under-represented.Footnote 59 This exclusion not only diminishes their influence but also perpetuates generational divides in political representation and policymaking.

Older persons may also struggle with accessing digital healthcare services and information. People without internet access miss out on crucial health information, such as vaccination updates and preventive care guidance, exacerbating health inequities, especially in underserved areas.Footnote 60 Telemedicine, vital for remote care and during emergencies such as the COVID-19 pandemic, often excludes older individuals lacking digital skills, leading to delayed diagnoses and untreated conditions. Being unable to use digital healthcare systems, including electronic health records and online appointment platforms, creates further barriers and impacts the ability to manage one’s healthcare effectively. Such problems may end up impacting older persons’ right to health.

The age gap can fuel social isolation by limiting the ability of older adults to connect in an increasingly digital world. Without internet access or digital skills, many miss out on video calls, social media, and online communities that sustain relationships and combat loneliness. Human rights instruments, such as the EU Charter of Fundamental Rights, recognise the ‘rights of the elderly to lead a life of dignity and independence and to participate in social and cultural life’.Footnote 61 But if older people lack digital literacy or access to social media and messaging platforms, they are at a higher risk of social exclusion and loneliness, as family and friends increasingly rely on digital communication to stay connected. This disconnect is especially impactful for those with mobility challenges or in rural areas, where digital tools often replace in-person interactions.Footnote 62 This exclusion from (digital) social life negatively affects mental health, increasing the risk of depression and cognitive decline.Footnote 63

The age-related digital divide can also impact older adults’ rights to education and work. As (adult) education shifts online, those without sufficient digital skills face barriers to lifelong learning and skill development, limiting their ability to adapt in a changing job market. Many older individuals looking to stay in or re-enter the workforce struggle with the technological skills required in many jobs, widening economic inequality and reducing their employability. Similarly, as job applications and interviews are increasingly digital, older adults struggle to access employment opportunities.Footnote 64 Being excluded from online platforms for networking, remote work, and training can deepen economic and social inequalities.

Summing up, the age gap exacerbates age-based discrimination, undermining older people’s human right to participate in civic and social life, the rights to education and work, the right to vote, and the right to health, among others. Without intervention, this digital exclusion deepens systemic inequalities, further marginalising older individuals.

17.4.2 International Action in Relation to the Age Gap

To achieve the equitable inclusion of older adults in the digital realm, some international organisations have introduced policies to address this disparity. Yet efforts remain limited and fragmented, and high-level policy documents have yet to engage systematically with the unique challenges that the digital divide creates for older individuals. This section highlights some of the (sporadic) policy initiatives by international organisations that tackle the age gap specifically; it does not cover broader instruments addressing the human rights of older persons beyond the context of digitalisation.

One of the main UN instruments in this area is the Madrid International Plan of Action on Ageing.Footnote 65 The plan emphasises the need to enhance the quality of life of older persons by ensuring their full participation in society, which includes access to ICTs. It encourages the development of programmes to reduce the digital divide and promote digital literacy among older persons.Footnote 66 In 2010, the UN’s Open-Ended Working Group on Ageing was established.Footnote 67 It has advanced the promotion of a rights-based approach towards ageing, but has not paid much attention to addressing the age gap.

In 2022, ministers from the member states of the UN Economic Commission for Europe committed to ‘promoting user-friendly digitalisation, enhancing digital skills and literacy to enable older persons to participate in an increasingly digital world, while also ensuring the right to access to information, participation, and services through access to digital devices and the Internet, and to suitable offline or other secure alternatives in user-friendly and accessible formats’.Footnote 68 However, this is a regional commission of fifty-six member states, so the declaration does not reflect a global consensus. In 2013, the UN Human Rights Council established the mandate of the independent expert on the enjoyment of all human rights by older persons.Footnote 69 One of the independent expert’s annual thematic reports addressed the impact of automation on the human rights of older persons,Footnote 70 but the mandate holder has not engaged with the age gap in detail.

The 2020 UN Roadmap for Digital Cooperation addresses the age-related digital divide through its broader focus on inclusivity and equitable digital access.Footnote 71 The roadmap highlights the importance of leaving no one behind, emphasising the need to close gaps in digital access and skills for vulnerable groups (including older people). It calls for partnerships across governments, the private sector, and civil society to address barriers, such as those faced by older populations in adopting digital technologies. The ITU, as a UN specialised agency, also has relevant policy goals. Its Connect 2030 Agenda sets the ambitious target of bridging all digital gaps, including the age gap.Footnote 72 Other relevant targets include making broadband services affordable to all, bringing broadband access to every household, universal access to the internet, a majority of individuals having digital skills, and a majority of individuals accessing government services online. If achieved, these would be a significant step towards eliminating the age gap. Yet the targets are very broad and have no specific actions or binding commitments attached to them.

The EU’s approach is focused on inclusion in general and does not have many instruments specifically targeting older people (e.g., Europe’s Digital Decade policy programme sets targets such as achieving basic digital skills for 80 per cent of adults by 2030, but does not specify how high this percentage should be among older people).Footnote 73 The main document with a distinct focus on older people and the digital divide is the 2020 Council of the EU conclusions on the human rights, participation, and well-being of older persons in the era of digitalisation.Footnote 74 The conclusions advocate for tailored strategies to enhance digital literacy among older people, improve their access to digital infrastructure, and foster their active engagement in the digital society. But the document is phrased in a very soft manner, with the Council inviting member states and the European Commission to consider, promote, and enable different steps that would improve the situation of older persons.

There are no EU directives or regulations dedicated specifically to protecting the fundamental rights of older persons or addressing the age gap. Two directives that do have a somewhat positive impact on accessibility are the Web Accessibility Directive and the European Accessibility Act.Footnote 75 The former obliges states to ensure that public sector websites and mobile apps meet specific technical accessibility standards, making them accessible to everybody, including persons with disabilities. The latter aims to improve cross-border trade in accessible products and services between EU Member States. The FRA has drawn attention to the fundamental rights implications of digital exclusion among older adults, particularly in the context of access to public services. Its 2023 report underscores the risk of marginalisation in accessing essential services, including healthcare and social benefits, and advocates for inclusive digital policy frameworks.Footnote 76 Despite the acknowledgement of the issue in the EU, action is lagging.

In general, regional and specialised organisations have not focused on the age-related digital divide. Although many organisations have general policies in relation to the digital divide, the specific issues that older people face have received little attention. Yet the age-related digital divide continues to exacerbate existing vulnerabilities among older adults by amplifying their risk of exclusion across multiple domains. It can further marginalise those already disadvantaged by factors such as low income, poor health, or geographic isolation, particularly in rural areas where digital infrastructure is often less developed. Older individuals who lack digital skills or access to technology may struggle to book medical appointments, access telehealth services, or manage financial transactions, leaving them more vulnerable to unmet needs and financial instability. Social vulnerabilities are also intensified as digital technologies have become central to communication and community engagement. Older people without digital literacy are at greater risk of loneliness and social isolation, as family, friends, and community networks have become reliant on digital platforms for connection. This isolation can contribute to mental health issues, such as depression and anxiety, which are already prevalent among older populations.

An aspect to bear in mind when addressing the age gap (and the digital divide in general) is intersectionality. Intersectionality highlights how overlapping vulnerabilities, such as age, gender, socio-economic status, and geographic isolation, compound the impacts of the digital divide.Footnote 77 Older women in rural areas exemplify this, facing barriers from age-based exclusion, entrenched gender norms, and limited infrastructure. These intersecting disadvantages amplify the risk of marginalisation. Addressing the age-related digital divide thus requires policies that account for the complex, intersecting needs of marginalised groups to ensure that digital inclusion efforts are equitable and effective.

In sum, the age-related digital divide magnifies the disparities older adults face, reinforcing cycles of exclusion that intersect with economic, social, and health vulnerabilities. Bridging this divide is not merely a matter of technological advancement but a fundamental requirement for ensuring dignity, autonomy, and inclusion for older individuals in contemporary society. Addressing this issue holistically is essential to mitigating its broader societal impacts and safeguarding the human rights of an ageing population.

17.5 Conclusions

The digital divide represents a critical fault line in the global move toward digitalisation. Despite decades of attention, the problem persists owing to the interplay of systemic factors, including socio-economic disparities, geographic isolation, cultural norms, insufficient policy interventions, and inadequate resources. This chapter has examined the gender and age dimensions of the digital divide, illustrating how these gaps perpetuate and exacerbate exclusion and vulnerability among women and older populations. Bridging this divide requires a more cohesive, enforceable, and inclusive approach that prioritises the voices and needs of marginalised groups. This has been acknowledged by the UN Secretary-General, who has noted that ‘[r]isk factors that affect the ability of vulnerable and marginalized groups to have access to connectivity should be specifically identified and addressed’.Footnote 78 The same has been recognised by the European Parliament, which ‘call[ed] for careful examination of people’s needs when it comes to digital developments and innovation, especially the needs of vulnerable groups, in order to assess how they can benefit from these new technologies’ as ‘the digital transition must take place in a way that benefits everyone’.Footnote 79

Both dimensions of the digital divide covered in this chapter reflect broader systemic failures to address structural inequalities. While international and regional bodies have adopted policies to address these gaps, their efforts are often inconsistent and fragmented, and lack enforceable commitments. For instance, international instruments such as the UN’s Sustainable Development Goals include digital inclusion targets, but fail to address the problem in a comprehensive manner. Similarly, the EU’s Digital Decade policy programme and other regional initiatives advocate for inclusivity but provide limited mechanisms to enforce digital equity for women and older individuals. The organisations themselves are also calling for more action. Some of the concrete aspects that have been noted as key to bridging the digital divide are ‘better metrics, data collection, and coordination of initiatives’ (UN Secretary-General),Footnote 80 and ‘strengthened enabling policy environments and international cooperation to improve affordability, access, education, capacity-building, multilingualism, cultural preservation, investment and appropriate financing’ (UN Economic and Social Council).Footnote 81 The European Parliament has emphasised the need to design ‘online services in a comprehensible way so that they can be accessed and used by people of all ages and levels of educational attainment’,Footnote 82 and the importance of promoting ‘basic and specialised skills with a specific focus on the most vulnerable groups of people, and the development of education and training systems including lifelong learning, re-skilling and up-skilling’.Footnote 83 Such calls for action have yet to lead to significant results.

Addressing the digital divide is not merely a matter of technological advancement but a profound human rights imperative. Civil society groups such as AGE Platform Europe have emphasised that human rights need to be used as a compass for digitalisation more broadly.Footnote 84 International and regional frameworks must go beyond aspirational targets and implement binding commitments and concrete initiatives that address the specific barriers faced by vulnerable groups. Achieving digital equity is essential not only for fostering individual empowerment but also for advancing broader societal goals of inclusivity, fairness, and human rights in an increasingly digital world.

18 How the EU Safeguards Children’s Rights in the Digital Environment: An Exploratory Analysis of the EU Digital Services Act and the Artificial Intelligence Act

18.1 Introduction: Children’s Rights in the Digital Environment

Digital technologies have a significant impact on the lives of children and the rights that are specifically attributed to them by the United Nations Convention on the Rights of the Child (UNCRC), Article 24 of the European Union (EU) Charter of Fundamental Rights (CFEU) and many national constitutions. The Council of Europe Committee of Ministers’ 2018 Recommendation on Guidelines to respect, protect, and fulfil the rights of the child in the digital environment recognises that the digital environment is ‘reshaping children’s lives in many ways, resulting in opportunities for and risks to their well-being and enjoyment of human rights’.Footnote 1 This has also been acknowledged by the United Nations Committee on the Rights of the Child (CRC Committee), which adopted a General Comment on the rights of the child in relation to the digital environment in 2021. In General Comment no. 25, the Committee affirms that ‘[i]nnovations in digital technologies affect children’s lives and their rights in ways that are wide-ranging and interdependent […]’.Footnote 2 Over the years, the EU has relied on the guidance of the UNCRC when adopting and interpreting fundamental human rights instruments.Footnote 3 This is demonstrated, for instance, by Article 24 of the CFEU,Footnote 4 which contains language that is very similar to that of the UNCRC. The EU’s commitment to the UNCRC was again confirmed in the 2021 EU Strategy on the Rights of the Child, built on six key pillars of which ‘the digital and information society’ is one.Footnote 5 In a recent case, the Court of Justice of the EU confirmed that CFEU Article 24 represents the integration into EU law of the principal rights of the child referred to in the UNCRC.Footnote 6 Hence, the UNCRC functions as a comprehensive framework that must duly be taken into account when legislative proposals that directly or indirectly affect children are proposed and adopted.

In the past decade, regulatory action by the European Commission (EC) has increasingly been targeted at the digital environment, leading to the adoption of influential legislative instruments such as the General Data Protection Regulation (GDPR),Footnote 7 and the Network and Information Security Directive. Other instruments, such as the Audiovisual Media Services Directive (AVMSD), were amended to extend their scope to also cover video-sharing platforms. Two recent legislative initiatives at the level of the EU, the Digital Services Act (DSA) and the Artificial Intelligence Act (AIA), touch upon platforms and technologies that have a significant impact on children and their rights.Footnote 8 Digital services, often provided through platforms, offer children immense opportunities to communicate, learn, and play, and artificial intelligence (AI)-based applications may offer children personalised learning or medical treatments.Footnote 9 At the same time, the use of tech platforms and AI may also pose risks to children’s rights. Rights that might be negatively affected are, for example, the right to privacy and data protection, freedom of thought, the right to freedom of expression, and the right to protection from violence and exploitation. The AIA acknowledges, for instance, that this technology ‘can also be misused and provide novel and powerful tools for manipulative, exploitative and social control practices’.Footnote 10 The question arises to what extent the protection and fulfilment of children’s rights is addressed in these most recent legislative acts, the DSA and the AIA. In order to answer this question, the proposals are analysed, the legislative process is scrutinised, and an assessment is made of how each instrument contributes to the effective realisation of children’s rights in the digital realm. The chapter also suggests some strategies for moving forward, drawing on recent recommendations from the UN children’s rights framework. We propose that EU policymakers adopt a children’s rights approach in their attempts to regulate platform services and AI, so that children’s rights and interests can be a strong regulatory focus rather than a regulatory afterthought.

18.2 Opportunities for and Risks to Children’s Rights from Platform Services and AI Systems

Before analysing the legislative acts, this section zooms in on existing evidence about children’s experiences with platform services and AI systems. The aim is to provide a better understanding of both the potential opportunities for and the risks to children’s rights presented by these services and applications. Platform services and other AI-based applications have become an integral part of children’s lives. While AI-enabled toys and voice assistants have infiltrated children’s homes and schools, AI-powered tutors, learning assistants, and personalised educational programmes are becoming more commonplace.Footnote 11 Children are also avid users of (commercial) AI-enabled platform services, such as social media, video-sharing, and interactive gaming platforms. For instance, platforms such as Instagram, Snapchat, and TikTok use advanced machine learning to deliver content and to personalise (or, to use their own word, ‘improve’Footnote 12) the user experience, provide filters that rely on augmented reality technology, and employ natural language processing tools to monitor and remove hate speech and other harmful content. The specific features of these platforms and systems make them particularly appealing to children, but also carry risks.Footnote 13 The lack of transparency and insight into how exactly AI systems generate certain output makes it very difficult for end users to anticipate the potential risks, harms, or violations of their rights.Footnote 14

Research capturing the opinions of children and youth themselves about the opportunities and risks of platform services and AI shows that they have a balanced perspective.Footnote 15 On the one hand, they realise that these services offer many opportunities for entertainment, socialising, and learning, but are never completely safe. On the other hand, children express a great deal of concern, citing as the main risks confrontation with harmful and illegal content, cyberbullying and hate speech, and violations of their privacy and data protection rights.Footnote 16 In relation to this, questions arise about the long-term impact of platform services and AI on children’s well-being and development. For instance, it has been reported that researchers within Instagram, owned by Meta, who studied user experiences, found that ‘thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse’.Footnote 17 The introduction of AI-based applications into children’s lives could also have side effects at the societal level.Footnote 18 More specifically, it could lead to the normalisation of surveillance, datafication, and commercialisation. Many of these applications are driven by commercial interests and deliberately designed and used to ensure maximum engagement of children, and even to establish behavioural patterns and habits for the future.Footnote 19 Furthermore, children from disadvantaged backgrounds might be offered lower-quality AI-based technologies, with a greater focus on entertainment and pacification rather than education and learning, compared with children from privileged backgrounds.Footnote 20

Because of the impact of platform services and AI on society at large, policymakers and legislators around the world are debating and developing instruments to counteract these risks. However, scholars have identified a disconnect between the potential negative impact of AI on children and the regulatory means to address it, as well as a lack of adequate redress.Footnote 21 UNICEF also underlines that most of the recent initiatives targeting AI refer only superficially to children and their rights and interests.Footnote 22 Considering the EU’s commitment to safeguarding children’s rights in the digital environment,Footnote 23 the following section will analyse two of these recent initiatives through the lens of children’s rights and principles. It is important to note that both the DSA and the AIA are likely to have a standard-setting impact around the world, given what scholars and policymakers call the Brussels Effect.Footnote 24 In this sense, these initiatives also present an important opportunity to shape global norms and standards for the design and deployment of digital technologies that are used by and impact children.

18.3 Children’s Rights in the DSA
18.3.1 The Commission Proposal for a DSA

The proposal for the DSA aimed to regulate intermediary services and to ‘set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected’.Footnote 25 This proposal reflects a shift at the EU level from relying on self- or co-regulatory efforts from tech companies to imposing strong legislative obligations on those companies that offer services used by a vast number of EU citizens and affect individuals and society at the same time.Footnote 26 Throughout the proposal by the European Commission, children (or minors) and their rights are referred to only a few times. The preamble to the proposal, for instance, states that it will ‘contribute to the protection of the rights of the child and the right to human dignity online’. Recital 34 clarifies that the proposal intends to impose a clear and balanced set of harmonised due diligence obligations on providers of intermediary services, aiming in particular ‘to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users’.Footnote 27

The most important provision for children (Article 26, Recital 57) in the proposal relates to the supervised risk assessment approach towards ‘very large online platforms’ (VLOPs). VLOPs are platforms ‘where the most serious risks often occur’ and which ‘have the capacity to absorb the additional burden’. A platform is considered a VLOP when the ‘number of recipients exceeds an operational threshold set at 45 million; that is, a number equivalent to 10% of the [EU] Union’.Footnote 28 This includes many large platforms that are popular with children, such as YouTube, TikTok, or Instagram. According to the proposal, VLOPs should identify, analyse, and assess any significant systemic risks stemming from the functioning and use made of their services in the Union. All three categories of systemic risks that are listed are especially relevant to children. The first category refers to ‘the dissemination of illegal content through their services’, with a mention in Recital 57 of child sexual abuse material as a type of illegal content. The second category relates to ‘any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child’.Footnote 29 The third category refers to the ‘intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security’.Footnote 30 In order to mitigate the risks, the VLOPs must put in place reasonable, proportionate, and effective measures, tailored to the specific systemic risks (Article 27). Another mechanism that can be used to tackle different types of illegal content and systemic risks is the adoption of codes of conduct (Article 35). According to the proposal, the creation of an EU-wide code of conduct will be encouraged by the Commission and the new Board for Digital Services to contribute to the proper application of the DSA. Recital 68 refers to the appropriateness of drafting codes of conduct regarding disinformation or other manipulative and abusive activities that might be particularly harmful for vulnerable recipients of the service, such as children.

The explicit references to children in the proposal for the DSA were welcomed by children’s rights organisations, although some considered them to be too weak.Footnote 31

18.3.2 Amendments Proposed by the LIBE Committee

During the legislative process, and specifically in the context of the activities of the Committee on Civil Liberties, Justice and Home Affairs (LIBE), several child-centric amendments were proposed by LIBE committee members.Footnote 32 Amendment no. 129 introduced a specific reference to Article 24 of the Charter, the UNCRC, and General Comment no. 25 to Recital 3. Amendment no. 412 suggested adding a new Article 12a, requiring the carrying out of a detailed child impact assessment. Amendment no. 414 put forward a specific article on the mitigation of risks to children that aims to address many of the existing concerns regarding children’s rights in the context of VLOPs. The amendment includes, for instance, a reference to taking into account children’s best interests when implementing mitigation measures in general and adapting content moderation or recommender systems in particular; adapting or removing ‘system design features that expose children to content, contact, conduct, and contract risks, as identified in the process of conducting child impact assessments’; ‘proportionate and privacy preserving age assurance’; ensuring ‘the highest levels of privacy, safety, and security by design and default for users under the age of 18’; the prevention of profiling, including for commercial purposes such as targeted advertising; age appropriate terms that uphold children’s rights; and ‘child-friendly mechanisms for remedy and redress, including easy access to expert advice and support’. Amendment no. 427 concerned the publication of child impact assessments and reports about the mitigation measures taken. Finally, amendment no. 772 included a requirement for the Commission to ‘support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child’.

In its Opinion, the LIBE committee included only the amendments regarding Recital 3,Footnote 33 leaving the more substantial amendments out.

18.3.3 Amendments by the Council and European Parliament

Both the general approach of the Council of November 2021,Footnote 34 and the amendments adopted by the European Parliament (EP) on 20 January 2022 contain remarkably more references to children and minors than the Commission proposal.Footnote 35

The general approach by the Council adds that when assessing risks to the rights of the child, ‘providers should consider how easily understandable to minors the design and functioning of the service is, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development’. Risks may arise, for example, ‘in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behaviour’ (Recital 57). Recital 58 builds on this by requiring that the design and online interface of services primarily aimed at minors or predominantly used by them should consider their best interests and ensure that their services are organised in a way that minors are easily able to access mechanisms within the DSA, including notice and action and complaint mechanisms. Moreover, VLOPs that provide access to content that may impair the physical, mental, or moral development of minors should take appropriate measures and provide tools that enable conditional access to the content. Article 12 refers to explaining the conditions and restrictions for the use of the service in a way that minors can understand, where an intermediary service is primarily aimed at minors or is predominantly used by them. Finally, in Article 27, a reference was added to taking targeted measures to protect the rights of the child, including age verification and parental control tools, or tools aimed at helping minors signal abuse or obtain support.

The EP suggested adding an explicit reference to the UNCRC and General Comment no. 25 in Recital 3. Unlike in the Commission proposal, but reminiscent of guidelines from the Article 29 Working Party,Footnote 36 the EP put forward a prohibition of ‘[t]argeting or amplification techniques that process, reveal or infer personal data of minors or sensitive categories of personal data for the purpose of displaying advertisements’.Footnote 37 A second amendment relates to ensuring that conditions for and restrictions on the use of a service are explained in a way that minors can understand.Footnote 38 This, however, would only be required for intermediary services that are ‘primarily directed at minors or predominantly used by them’. Along the same lines, other amendments aim to ensure the internal complaint-handling systems are easy to access and user-friendly, including for minors, and that online interfaces and features are adapted to protect minors as a measure to mitigate risks.Footnote 39 A more general obligation is proposed to adapt design features to ensure a high level of privacy, safety, and security by design for minors.Footnote 40 Also potentially important for children’s rights was the suggestion to change the wording of ‘any negative effects’ in Article 26 to ‘any actual and foreseeable negative effects’ on the rights of the child. Arguably, this could trigger the application of the precautionary principle.Footnote 41 Regarding the mitigation measures, the EP also proposed to add ‘targeted measures aimed at adapting online interfaces and features to protect minors’ to Article 27. Finally, the suggested Recital 69 encourages the development of codes of conduct to facilitate compliance with obligations regarding the protection of minors, and the proposed Article 34(1a) refers to the support and promotion by the Commission of the development and implementation of voluntary standards set by the relevant European and international standardisation bodies aimed at the protection of minors.

18.3.4 The Final Text of the DSA

The DSA, formally titled ‘Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market for Digital Services and amending Directive 2000/31/EC’, was adopted on 19 October 2022 and published in the Official Journal on 27 October 2022.Footnote 42 Not only are the references to child, children, and minors in the adopted text vastly more numerous than in the Commission proposal, but the substance of what is proposed is also much more extensive and arguably promising, depending on actual implementation and enforcement.

The recitals and articles that refer to children and minors can be classified into five broad categories: (a) provisions that are related to child sexual abuse material and the measures in place to tackle this type of material,Footnote 43 (b) transparency obligations regarding terms and conditions, (c) obligations for all online platforms to protect minors,Footnote 44 (d) risk assessment and mitigation obligations towards children for VLOPs and very large online search engines (VLOSEs),Footnote 45 and (e) references to implementation tools such as standards and codes of conduct.

In what follows, the final three categories are examined in depth.

18.3.4.1 Obligations for All Online Platforms to Protect Minors

Article 28 (an article not included in the Commission proposal) formulates extensive obligations for online platforms ‘accessible to minors’. First, such platforms must put in place ‘appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service’. This is an obligation that has the potential to safeguard a number of children’s rights (i.e., the right to privacy and the right to protection). One point of contention is the interpretation of which platforms are considered ‘accessible to minors’. This is in part clarified in Recital 71, which states that this includes (a) platforms whose terms and conditions permit minors to use the service, (b) platforms offering services that are directed at or predominantly used by minors, or (c) platforms that are aware that some of the recipients of their service are minors, ‘for example, because it already processes personal data of the recipients of its service revealing their age for other purposes’. In reality, research shows that many children (including very young children) use platforms that are not directed at them and that explicitly state in their terms and conditions that their service is not to be used by children under a certain age (most often set at thirteen years).Footnote 46 It may be expected that certain platforms will try to argue that their services should not be considered ‘accessible to minors’. Independent research into children’s online experiences and use of platforms might be helpful in this regard, both for the platforms and for oversight bodies. Aside from this issue of which platforms fall within the scope of Article 28, it might also be a challenge for platforms to determine what constitutes ‘appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors’, particularly considering that different age groups of children have different privacy and safety needs. In this regard, Recital 71 refers to standards, codes of conduct, and best practices, and Article 28.4 indicates that the Commission (after consulting the European Board for Digital Services, which is established by the DSA in Article 61) may formulate guidelines to support providers of online platforms. Work on such guidelines started in 2024.Footnote 47

Second, Article 28.2 prohibits targeting advertisements based on profiling ‘when they are aware with reasonable certainty that the recipient of the service is a minor’. Such types of advertisements have long been a concern for scholars,Footnote 48 and for organisations such as the Article 29 Working Party, which had previously stated in one of its Guidelines that ‘organisations should, in general, refrain from profiling [children] for marketing purposes’,Footnote 49 even though this was not explicitly prohibited by the GDPR. In this light, it can only be commended that this is now explicitly prohibited in the DSA. Yet the question could be raised whether it would not have made sense to include a broader prohibition of profiling children for commercial purposes rather than just targeted advertising. This would have been more in line with the CRC Committee’s call to ‘prohibit by law the profiling or targeting of children of any age for commercial purposes’ in its General Comment no. 25.Footnote 50 Moreover, profiling may also be used to target harmful types of content (e.g., relating to self-harm or eating disorders). Arguably, this could be covered under the risk assessment provisions, but their scope is limited to VLOPs and VLOSEs (see Section 18.3.4.2). Another doubt that may be raised is how the notion ‘reasonable certainty’ will be interpreted and whether this would require age verification. It would seem from the text of the DSA that this is not necessarily the case, as Article 28.3 states that compliance with the obligations of Article 28 ‘shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor’, and Recital 71 adds that this obligation ‘should not incentivise providers of online platforms to collect the age of the recipient of the service prior to their use’. While this may be in line with the principle of data minimisation laid down in the GDPR,Footnote 51 it is uncertain whether the protective aim of the prohibition, as well as of the measures to ensure a high level of privacy, safety, and security of minors, will be effectively realised if platforms are not incentivised to know which users are actually minors. In any case, the desirability and effectiveness of age verification and age assurance have been the subject of heated debate since the emergence of the internet, and this debate has yet to be settled.

18.3.4.2 Risk Assessment and Mitigation Obligations towards Children for the VLOPs and VLOSEs

A third category of obligations is aimed at VLOPs and VLOSEs. The first VLOPs and VLOSEs were designated by the EC on 25 April 2023. They include platforms and search engines that are hugely popular with children such as TikTok, Snapchat, Instagram, Wikipedia, Google Play, and Google Search.Footnote 52 Article 34 requires VLOPs and VLOSEs to undertake an assessment of the systemic risks in the EU ‘stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services’. There are four categories of risks that are listed, and as mentioned earlier, at least three of these types of risks are directly relevant for children: ‘(a) the dissemination of illegal content through their services’; ‘(b) any actual or foreseeable negative effects for the exercise of fundamental rights, including rights of the child’, and ‘(d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being’. From a children’s rights perspective, it is particularly interesting to observe that in the course of the legislative process, the word foreseeable was added, as this could potentially trigger the precautionary principle. There is still much uncertainty about the occurrence and extent of actual harm when it comes to digital technologies.Footnote 53 There are indications that certain platform services might have an impact on the mental health of children in the short and long term,Footnote 54 and that certain features have an impact on day-to-day life, for instance on sleep. Yet there is still little hard evidence about the impacts on children and their rights in the long run. There is simply not enough research, there are ethical questions that surround research with children, and certain technologies have not existed long enough to draw conclusive results. We have argued before that with respect to delicate issues such as the well-being of children, the precautionary principle might thus come into play.Footnote 55 Simply put, this concept embraces a ‘better safe than sorry’ approach and compels society to act cautiously if there are certain – not necessarily absolute – scientific indications of a potential danger, and not acting upon these indications could inflict harm.Footnote 56 It could of course be up for discussion whether the threshold for triggering the precautionary principle and the threshold for an effect to be foreseeable in the sense of Article 34 are in alignment. From a linguistic point of view, foreseeable does not equate to potential. A foreseeable event is, according to the Cambridge Dictionary, ‘one that can be known about or guessed before it happens’. Whether an effect can be known about will to a large extent depend on research and expert advice from a variety of disciplines. From a children’s rights perspective, however, this notion would need to be interpreted broadly in the best interests of the child, in line with UNCRC Article 3 and CFEU Article 24.

As to the methodology for assessing risks and their effect on the rights of the child, inspiration could be drawn from existing methodologies and best practices for Children’s Rights Impact Assessments (CRIAs).Footnote 57 From a children’s rights perspective, it is in any case crucial that the impact on the full range of children’s rights is assessed, and that rights are not looked at in isolation but as interacting with each other.

Following the assessment of the risks, Article 35 requires VLOPs and VLOSEs to take ‘reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks’. One type of such mitigation measure that is proposed is ‘targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support’. The explicit reference to targeted measures is helpful. Recital 89 seems to indicate that such targeted measures might for instance be needed to protect minors from content that may impair their physical, mental, or moral development. The examples that are given could be helpful as well, although neither age verification nor parental control tools are without difficulties. Regarding age verification, the lack of consensus on desirability and effectiveness has already been pointed to; regarding parental control tools, it has been argued before that these types of empowerment tools should not be used solely to shift the burden of safeguarding the interests and rights of children from platforms to parents.Footnote 58 In addition, other non-child-specific risk mitigation measures that are proposed, such as adapting the design, features, or functioning of services, including online interfaces, may also have a positive impact on children’s rights. Recital 89 explains in that regard that VLOPs and VLOSEs must take the best interests of the child into account when taking such measures, particularly when their services are aimed at minors or predominantly used by them.

18.3.4.3 Standards and Codes of Conduct

Finally, as the formulation of the obligations imposed on VLOPs and VLOSEs remains rather abstract, their actual implementation will be of the utmost importance. The tools that can support platforms in that regard are standards and codes of conduct.

Article 44 states that the Commission, after consultation with the Board, shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, including in respect of ‘targeted measures to protect minors online’. In this regard, it is relevant to note the efforts currently being undertaken by the Institute of Electrical and Electronics Engineers (IEEE) to draft a standard for Age Appropriate Design for Children’s Digital Services.Footnote 59

Recital 104 explains that an area for consideration for which codes of conduct could be drafted (Article 45) is ‘the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities or any adverse effects on minors’. In the Commission’s 2022 BIK+ Strategy, it was announced that the Commission will ‘facilitate a comprehensive EU code of conduct on age-appropriate design, building on the new rules in the DSA and in line with the AVMSD and GDPR. The code aims to ensure the privacy, safety and security of children when using digital products and services. This process will involve industry, policymakers, civil society and children.’Footnote 60 It continues:

[u]nder the DSA, the Commission may invite providers of very large online platforms to participate in codes of conduct and ask them to commit themselves to take specific risk mitigation measures, to address specific risks or harms identified, via adherence to a particular code of conduct. Although participation in such codes of conduct remains voluntary, any commitments undertaken by the providers of very large online platform shall be subject to independent audits.

At the end of 2022, the Commission published a call for members of a Special Group on the EU Code of Conduct on Age-Appropriate Design.Footnote 61 However, work on the Code of Conduct seems to have halted in favour of the drafting of guidelines by the European Commission (supra).

18.4 Children’s Rights in the Proposal for the AIA
18.4.1 The EU Policy Agenda on AI (and Children’s) Fundamental Rights

A second legislative initiative at the EU level that has the potential to significantly impact children’s rights in the digital environment is the AIA. Developing a regulatory framework for AI has been high on the EU policy agenda for some time. Initially, the EC adopted a soft-law approach consisting of non-binding recommendations and ethical guidelines. In June 2018, an independent High-Level Expert Group on Artificial Intelligence (AI HLEG) was established, which was tasked with drafting ethics guidelines for AI practitioners, as well as offering advice concerning the adoption of policy measures.Footnote 62 However, this approach changed in 2021, when the EC explicitly recognised that certain characteristics of AI, such as the opacity of algorithms and the difficulties in establishing causality in algorithmic decision-making, pose specific and potentially high risks to fundamental rights.Footnote 63 As existing legislation failed to address these risks, both the EP and the Council called for legislative action in this area.Footnote 64 Echoing these calls, the AI HLEG also pointed to the need to explore binding regulation to tackle some of the critical issues raised by AI. In particular, the expert group stressed the need for mandatory traceability, auditability, and ex ante oversight obligations for AI systems that have the potential to significantly impact human lives. According to the AI HLEG coordinator, AI is nothing more than an application, system, or tool developed by humans that can be used in different ways: (a) ways that cause harm, (b) ways that cause unintended harm, (c) ways that counteract harm, and (d) ways that cause good.Footnote 65 Therefore, ‘if we are intelligent enough to create AI systems, we must be intelligent enough to ensure appropriate governance frameworks that harness the good use of those systems, and avoid those that lead to (un)intentional harm’.Footnote 66 In its framework for achieving Trustworthy AI, the AI HLEG also pays (limited) attention to children’s rights. More specifically, as part of the key ethics guidelines, the AI HLEG advises paying particular attention to situations involving more vulnerable groups,Footnote 67 such as children, and refers specifically to CFEU Article 24.Footnote 68

The fact that the regulation of AI is a hot topic is evidenced by the responses to the EC’s open consultation on AI in February 2020, which attracted far more feedback submissions than consultations on any other technology Act.Footnote 69 In these submissions, concerns about AI and children were raised by a range of stakeholders, including companies, academic institutions, and non-governmental organisations (NGOs), mostly in relation to education. For instance, the submissions mentioned that the use of AI in education may have serious consequences for a child’s life course and should therefore be considered high risk, as it may lead to discrimination, have a serious negative impact on children’s learning, and their consent might not be properly secured.Footnote 70 More generally, children’s digital rights and AI,Footnote 71 and the use of AI in connection with the evaluation, monitoring, and tracking of children were also mentioned as areas of concern.Footnote 72

18.4.2 The Commission Proposal for an AI Act

In response to these calls for legislative action, in April 2021 the EC unveiled its proposal for the AIA, the first Act of its kind in the world.Footnote 73 It aimed to lay down harmonised rules on the development, deployment, and use of AI systems in the EU,Footnote 74 based on the values and fundamental rights of the EU.Footnote 75 Through a risk-based approach and by imposing proportionate obligations on all participants in the value chain, the proposal aimed to ensure a high level of protection of fundamental rights in general and a positive impact on the rights of special groups, including children.Footnote 76 More specifically, a risk-based categorisation of AI systems was introduced, where different levels of risk correspond to different sets of requirements. The intensity of the risks determines the applicability of the requirements: a lighter regime applies to AI systems with minimal risks, while practices posing unacceptable risks are prohibited. The idea is that (groups of) individuals at risk and vulnerable to health, safety, and rights infringements by new AI developments need a higher level of protection.Footnote 77

One category of prohibited practices as proposed by the Commission that is relevant for children is ‘practices that exploit vulnerabilities of specific vulnerable groups such as children or persons with disabilities in order to materially distort their behaviour in a manner that is likely to cause them or another person psychological or physical harm’, because such systems contradict Union values, for instance, by violating fundamental rights (emphasis added).Footnote 78 This is confirmed in Recital 16 and Article 5.1(b) of the proposal, although the latter does not explicitly refer to children but to ‘a specific group of persons due to their age’. As an example of such an exploitative AI system, the EC referred to a doll with an integrated voice assistant, which, under the guise of a fun or cool game, encourages a minor to engage in progressively dangerous behaviour or challenges.Footnote 79 While this is an extreme example, children’s rights advocates argued that a number of persuasive design features often found in AI systems used by children could fall under this prohibition.Footnote 80 They cited, for example, the autoplay features of social media companies that aim to increase user engagement, and which could be said to affect children’s sleep and education, and ultimately their health and well-being.Footnote 81 Moreover, when such recommender systems promote harmful content, they might even lead to sexual exploitation and abuse.Footnote 82 Nevertheless, the prohibition was criticised by various stakeholders for its limitations, in particular the limitation to physical and psychological harm,Footnote 83 the requirement of malicious intent,Footnote 84 and the lack of references to fundamental rights.Footnote 85

The Commission proposal also mentions children and their rights in the context of the classification of AI systems as high risk and the related requirements for the provision or use of such systems. According to Recital 28 of the proposal, the extent of the adverse impact on fundamental rights caused by AI systems is of particular relevance when classifying them as high risk.Footnote 86 In this regard, Recital 28 contains an explicit reference to CFEU Article 24, which grants children the right to such protection and care as is necessary for their well-being. Moreover, it mentions the UNCRC and the recently adopted General Comment no. 25 on children’s rights in the digital environment,Footnote 87 which sets out why and how State parties should act to realise children’s rights in a digital world. On reflection, however, this does not mean that the proposal considers AI systems that are likely to be accessed by children or to impact upon them to be high risk by default. The Commission proposal also does not impose any real obligation on providers or users of high-risk AI systems to carry out and publish an assessment of the potential impact on children’s rights. Instead, providers of high-risk AI systems will have to conduct a conformity assessment,Footnote 88 to demonstrate compliance with a list of essential requirements, before placing the system on the market or putting it into service.Footnote 89 These requirements include setting up a risk management system;Footnote 90 ensuring that the data sets used comply with data quality criteria; obliging providers to guarantee accuracy, robustness, and data security; preparing technical documentation; logging; and building in human oversight to minimise risks to health, safety, and fundamental rights.Footnote 91 Regarding the latter, human–machine interface tools and measures should be integrated into the design of the AI system.Footnote 92 Users of high-risk AI systems must use the systems in accordance with the provider’s instructions for use. While this seems like a solid set of requirements, it could still be questioned how the full implementation of the risk management system described in Article 9 of the AIA proposal can be ensured without a real obligation first to identify and evaluate risks to children. In addition, the Commission proposal lacks individual rights and remedies against infringements by the provider or user, in contrast to, for instance, data subject rights under the GDPR.Footnote 93

Finally, the Commission proposal also contains transparency requirements that apply to some specific limited-risk AI systems. This category essentially covers systems that can mislead people into thinking they are dealing with a human (e.g., automated chatbots such as the ‘My AI’ tool used by Snapchat).Footnote 94 First, the proposal requires AI providers to design their systems in such a manner that individuals interacting with these systems are informed that they are interacting with a bot (i.e., ‘bot disclosure’) unless it is contextually obvious. Second, the proposal requires users of emotion recognition systems to inform exposed persons of this, and users of AI systems that generate deep fakes to disclose the AI nature of the resulting content. These transparency requirements raised a number of questions (e.g., does this mean that there is a right to an explanation?),Footnote 95 including about the standard of transparency when children are involved. More specifically, do providers have an obligation to offer information in a child-friendly manner – similar to the GDPR transparency obligations – when their AI systems are likely to be accessed by a child? This remained unclear in the Commission proposal.

18.4.3 Amendments Proposed by the IMCO and LIBE Committees

The discussions in the EP were led by the Committee on Internal Market and Consumer Protection (IMCO) and the LIBE under a joint committee procedure.Footnote 96 Additional references to children were added in the IMCO-LIBE draft report.Footnote 97 The first was amendment 208, which proposed a requirement for the future EU advisory body on AI, the so-called AI Board, to provide guidance on children’s rights, in order to ‘meet the objectives of this Regulation that pertain to children’. Second, and perhaps more interesting, was amendment 289, which added to the list of high-risk AI systems ‘AI systems intended to be used by children in ways that have a significant impact on their personal development, including through personalised education or their cognitive or emotional development’. Amendment 23 specified in this context that children constitute ‘an especially vulnerable group of people that require additional safeguards’. Depending on how broadly this category is interpreted (e.g., does it go beyond an educational context?), this could lead to stronger protection. However, the draft report still did not contain a general obligation of specific protection for children in the context of AI.

Furthermore, at a public event following the publication of the Commission proposal, one of the shadow rapporteurs of the IMCO Committee on the proposal for an AIA openly criticised the fact that the Commission proposal contained no obligation to carry out fundamental rights impact assessments.Footnote 98 In this regard, amendment 90 of the draft report specified that the obligation of risk identification and analysis for providers of high-risk AI systems should also include the known and reasonably foreseeable risks to the fundamental rights of natural persons.Footnote 99 In addition, the shadow rapporteur argued that the Commission proposal overlooked the fact that AI systems that are transparent and meet the conformity requirements – and can thus move freely on the market – could still be used in violation of fundamental rights. This criticism was reflected in the draft report, which underlined that ‘users of high-risk AI systems also play a role in protecting the health, safety and fundamental rights of EU citizens and EU values’,Footnote 100 and placed more responsibilities on the shoulders of said users.Footnote 101

18.4.4 Amendments by the Council and the EP

Both the general approach of the Council and the amendments adopted by the EP introduced a number of noteworthy changes.

The Council adopted its common position (General Approach) on 6 December 2022, which includes several noteworthy elements concerning children and their rights.Footnote 102 First, as Malgieri and Tiani argue, it adopted a wider and more commercially relevant definition of vulnerability.Footnote 103 More specifically, the Council proposed to prohibit the exploitation not only of age, but also of vulnerabilities based on disability and on the social or economic situation of the individual. This was an improvement in light of children’s rights concerning accessibility and protection from economic exploitation. The Council also deleted the malicious intent requirement, and included the possibility that harms may be accumulated over time (Recital 16), thereby resolving some of the criticisms mentioned earlier. In addition, more attention was given to fundamental rights more generally in the context of the requirements for providers of high-risk AI systems. Regarding classification, the Council proposed that AI systems that are unlikely to cause serious fundamental rights violations or other significant risks should not be classified as high risk. Regarding the requirements for providers of high-risk systems, the Council text included a requirement for the ‘identification and analysis of the known and foreseeable risks most likely to occur to health, safety, and fundamental rights in view of the intended purpose of the high-risk AI system.’Footnote 104

Following lengthy discussions on the more than 3,000 amendments tabled in response to the draft report by the IMCO-LIBE committees, the EP plenary session adopted its negotiating position (Compromise Text) on 14 June 2023.Footnote 105 However, despite numerous amendments being tabled with the potential to directly impact children and their rights, none of these child-specific amendments were included in the Compromise Text of the EP. Consequently, these amendments were not part of the trilogue negotiations.Footnote 106 In relation to this, children’s rights organisations raised concerns about the level of consideration given to children’s rights during the legislative process.Footnote 107 Nevertheless, the Compromise Text does include several notable amendments that could impact children and their rights. First, it included a ban on AI systems inferring the emotions of a natural person in education institutions, which has implications for school children.Footnote 108 Second, the EP proposed to include as part of the risk management system for providers of high-risk AI systems a requirement to identify, estimate, and evaluate known and reasonably foreseeable risks to fundamental rights (including children’s rights). Third, the introduction of general principles applicable to all AI systems under the proposed Article 4a is noteworthy. This article requires operators of AI systems to make their best efforts to develop and use these systems in accordance with these principles. The principles encompassed various aspects, including the preservation of human agency and oversight, technical robustness and safety, privacy and data governance, transparency, diversity, non-discrimination and fairness, as well as social and environmental well-being. To foster the voluntary application of these principles to AI systems other than high-risk AI systems, the EP proposed the establishment of codes of conduct. These codes should particularly ‘assess to what extent their AI systems may affect vulnerable persons or groups of persons, including children, the elderly, migrants and persons with disabilities or whether measures could be put in place in order to increase accessibility, or otherwise support such persons or groups of persons’.Footnote 109 Finally, a new Article 4d outlined requirements for the EU, its Member States, as well as providers and deployers of AI systems to promote measures fostering AI literacy, which could be beneficial for children. This included teaching basic notions and skills regarding AI systems and their functioning.

18.4.5 The Final Text of the AIA

The final text of the AIA was adopted by the EP on 13 March 2024 and endorsed by the Council on 21 May 2024.Footnote 110 The specific references to children and their rights have been retained, albeit with noteworthy changes.

First, with regard to the prohibited practices, Article 5.1(b) now states:

the placing on the market, the putting into service or the use of an AI system that exploits any of the vulnerabilities of a natural person or a specific group of persons due to their age, disability or a specific social or economic situation, with the objective, or the effect, of materially distorting the behaviour of that person or a person belonging to that group in a manner that causes or is reasonably likely to cause that person or another person significant harm [emphasis added].

Thus, the final text does not contain a malicious intent requirement (i.e., ‘with the objective, or the effect’), and adopts the broader concept of vulnerability (‘disability’, ‘social or economic situation’). Furthermore, Article 5 now covers ‘significant harm’ to ‘that person or another person’, extending beyond physical or psychological harm (infra).Footnote 111 However, a lack of clarity regarding the actual scope of the prohibition remains. A crucial point that needs clarification concerns the threshold for significant harm. For instance, would a violation of children’s rights meet this threshold? According to Recital 29, this may include harms accumulated over time. In addition, this provision is rather broad regarding who suffers harm and seems to cover third-party effects as well.Footnote 112

Second, the references to children’s rights in the context of the classification (AIA Recital 48) and requirements for (AIA Article 9.9) high-risk AI systems have also remained, with some subtle changes. At first glance, these references give the impression that the EU considers that children’s rights and their best interests play an important role in the regulation of high-risk AI systems. Article 9.9 of the AIA, for example, states that ‘when implementing the risk management system as provided for in paragraphs 1 to 7, providers shall give consideration to whether in view of its intended purpose the high-risk AI system is likely to have an adverse impact on persons under the age of 18’ (emphasis added). However, as mentioned, this does not mean that AI systems that are likely to be accessed by children or impact them are considered high risk by default. Notably, the word specific was omitted from the final text, arguably reducing the emphasis compared with the EC proposal. The AIA classifies all systems used within a list of predetermined domains as high risk.Footnote 113 A distinction is made between two sub-categories of AI systems: (a) AI systems that are products or safety components of products that are already covered by EU health and safety legislation, and (b) standalone AI systems used in specific (fixed) areas.Footnote 114 Regarding the latter, one of the areas included that is particularly relevant for children is educational and vocational training – both in terms of determining access to such training and evaluating individuals.Footnote 115 This could include, for example, AI-enabled online tracking, monitoring, and filtering software on educational devices, which could have a chilling effect on children’s right to freedom of expression or violate their right to privacy. This is reminiscent of the Ofqual algorithm debacle in the United Kingdom, where an automated decision-making system was employed to calculate exam grades, leading to discriminatory outcomes for children from lower socio-economic backgrounds.Footnote 116 Such systems can clearly violate children’s right to education, as well as their right not to be discriminated against, and can perpetuate historical patterns of discrimination.Footnote 117 Another area where AI systems are likely to present high risks to children (as well as adults) is in the allocation of public assistance benefits and services.Footnote 118 Recital 37 specifies that, owing to the vulnerable position of persons depending on such benefits and services, AI systems used in this context may have a significant impact on the livelihoods and rights of the persons concerned – including their right to non-discrimination and human dignity. A concrete example is the so-called benefits scandal in the Netherlands, where an AI-enabled decision-making system withdrew and reclaimed child benefits from thousands of families, disproportionately affecting children from ethnic minority groups.Footnote 119 Aside from these two areas, Annex III lists biometric identification, categorisation, and emotion recognition; management and operation of critical infrastructure; employment; law enforcement; migration; and administration of justice and democratic processes as high-risk areas.
The EC can also add sub-areas within these areas (subject to a veto from the EP or Council).Footnote 120 However, other domains where AI systems and automated decision-making are employed would not be considered high risk, even if they are likely to be accessed by children or impact them or their fundamental rights. This leaves out a whole range of AI systems that could affect the daily lives of children, such as online profiling and personalisation algorithms, connected toys, and content recommender systems on social media.Footnote 121

The final text includes only voluntary commitments for low-risk AI systems. Given the rapid development of AI technologies and how difficult it is at this stage to imagine the future impact on children and their rights, this feels like a very light regulatory regime. A more cautious approach – based on the precautionary principle – could have been to include a binding set of general principles (supra), including the best interests of the child (similar to Recital 89 of the DSA), fairness, and non-discrimination for all AI systems in the AIA.Footnote 122

With regard to the transparency requirements for certain AI systems, the final text includes a specific reference to children – or at least to ‘vulnerable groups due to their age’. Article 50 of the AIA states that AI systems should be designed so that individuals are informed that they are interacting with an AI system, unless it is contextually obvious. In relation to this, Recital 132 of the AIA specifies that when implementing this obligation, the characteristics of vulnerable groups owing to their age or disability should be taken into account, if these systems are intended to interact with those groups as well.Footnote 123

Finally, the adopted text also grants rights to individuals (including children) affected by the output of AI systems, including the right to lodge a complaint before a supervisory authority,Footnote 124 and the right to an explanation in certain instances.Footnote 125 However, children are not specifically mentioned in these provisions.

18.5 Discussion: A Children’s Rights Perspective on the DSA and the AIA

It can only be welcomed that both instruments refer to children and their rights. The question, however, is whether they have the potential to ensure that children’s rights will be effectively implemented. For the DSA, the answer is quite clear: it holds great promise for advancing children’s rights. Where references to children were few and far between in the Commission proposal, the final text appears to take children and their interests (finally) seriously by imposing obligations on VLOPs that could make a difference. Moreover, in addition to the provisions that directly refer to children, discussed earlier, there are of course other provisions that will indirectly have an impact on children as a subgroup of individuals in general. Examples are the provisions regarding recommender systems (Articles 27 and 38) and the design of online interfaces (Article 25). While the text of the law indeed provides many opportunities, the actual realisation of children’s rights will depend on implementation and enforcement. The DSA does create enforcement mechanisms and oversight bodies that are responsible for ensuring this.Footnote 126 In 2024, the Commission already launched formal proceedings against, among others, TikTok and Meta (Facebook and Instagram) related to the protection of minors.Footnote 127 The Commission expresses concerns about, for example, age verification, default privacy settings, and behavioural addictions that may have an impact on the rights of the child. It thus seems that the enforcement of the DSA will move forward faster than, for instance, the enforcement of the GDPR.

For the AIA, it remains to be seen whether it will succeed in guaranteeing children’s rights. Children’s rights are mentioned in the Preamble and provisions of the Act – which is laudable – and it clearly acknowledges the potential negative impact of AI systems on their rights. However, whether these acknowledgements are sufficient for protecting and promoting children’s rights in an increasingly AI-driven world is questionable. First, the prohibition of AI systems that exploit vulnerable groups leaves questions about the threshold for significant harm and its interplay with other instruments. Second, while the final text mentions that the impact of AI systems on children’s rights is considered ‘of particular relevance’ when classifying them as high risk,Footnote 128 systems that are likely to be accessed by children or to impact them are still not classified as high risk by default. As a final consideration, the AIA does not explicitly introduce a general obligation of ‘specific protection’ for children when AI systems and automated decision-making infiltrate their lives – in contrast to, for instance, the GDPR when it comes to the processing of their personal data (recital 38) or – arguably – the DSA requirement to ensure a high level of privacy, safety, and security for minors. Introducing an obligation to ensure the best interests of the child for all AI systems that are likely to impact children could have led to more effective rights implementation in practice.

From a children’s rights perspective, a few questions remain. First, the adoption of any legislative initiative that affects children should be accompanied by a thorough Children’s Rights Impact Assessment (CRIA).Footnote 129 Although both proposals were preceded by an impact assessment, it can hardly be said that those assessments would qualify as CRIAs. A true CRIA would assess the impact of the proposed measures on the full range of children’s rights and would balance conflicting rights (both where children’s rights conflict with each other and where children’s rights conflict with the rights of adults, businesses, or other actors). A CRIA is also the way to evaluate whether a proposed measure takes the best interests of the child as a primary consideration. The best interests of the child is a key children’s rights principle, laid down in Article 3 of the UNCRC and Article 24 of the CFEU. This principle should guide all actions and decisions that concern children. It is also very closely linked to another children’s rights principle, laid down in Article 12 of the UNCRC: the right to be heard. Children’s views must be taken on board and given due weight. Although children’s rights organisations had opportunities to share their views and suggestions in the preparatory steps leading towards the adoption of the proposals, this does not replace the actual engagement of children in that process. This is – again – a lesson to be learnt. Moreover, CRIAs might also be helpful for the addressees of the legislative acts when assessing the risks that their services or systems pose to children and their rights.

Second, experiences with other legislative instruments, such as the GDPR, have shown that vague wording often leaves addressees at a loss as to how to implement their obligations (notwithstanding their often well-meant intentions). Hence, fast and concrete guidance,Footnote 130 for instance by means of Commission Guidelines (Article 28.4 DSA), codes of conduct, or guidelines by the newly established European Board for Digital Services or the European Artificial Intelligence Board, will be essential. In addition, whereas enforcement of the GDPR by Data Protection Authorities has been argued to be slow, lacking, or not prioritising children, it will be up to Member States to ensure that the DSA and AIA oversight bodies are well resourced, and up to the Commission to take up its supervisory role when it comes to the VLOPs and VLOSEs.

Finally, both instruments adopt an approach that predominantly focuses on safety and risks. There are few to no obligations for platforms to take measures to support children, to enable them to use the services in ways that fully benefit them, and to explore the opportunities that such services offer. Although the BIK+ Strategy, for instance, pays more attention to this, it is perhaps a missed opportunity to put into practice some of the more positive measures that General Comment No. 25 requires States to take.

18.6 Conclusion

The EU legislator is determined to minimise risks that are posed by platforms and AI-based systems by imposing various obligations on a range of actors. While many of those obligations are not child-specific, some of them are. Children who grow up today in a world where platforms and AI-based systems are pervasive might be particularly vulnerable to certain risks. The EU legislator is aware of this and pays increasing attention to the risks to children and their rights, although this is not necessarily the case to the same extent across different legislative initiatives. While the DSA emphasises the protection of minors quite heavily, the AIA is less outspoken. Both during the legislative process, and in the stages of implementation and enforcement, the rights and principles contained in the UNCRC should be duly taken into account in order to effectively realise children’s rights in the digital environment.

19 Right to Education in Regional or Minority Languages: Invasions, COVID-19 Pandemic, and Other Developments

19.1 Introduction

How can we define linguistic rights and their role in human rights protection? According to the United Nations (UN) Special Rapporteur on minority issues: ‘Linguistic rights can be described as a series of obligations on state authorities to either use certain languages in a number of contexts, or not interfere with the linguistic choices and expressions of private parties.’Footnote 1 Linguistic rights include the right to use one’s language in private and in public and not to be discriminated against for doing so. With respect to education, it is usually the competence of the state to organise schooling and to set rules for public as well as private schools. School curricula prescribe the basic values and goals pursued by education, as well as the language of instruction. Could it be claimed that states have a duty to provide education in the minority languages used by portions of their populations?Footnote 2

The protection of the linguistic rights of minorities, including in education, varies. We shall compare this protection at the level of the UN and the Council of Europe and look into some more recent developments in international law, including those during the COVID-19 pandemic.

19.2 UN: A House Much Divided

Linguistic rights are endorsed in a number of international human rights treaties and standards, especially regarding the prohibition of discrimination, the right to freedom of expression, the right to a private life, the right to education, and the right of linguistic minorities to use their own language with others in their group. These treaties and standards are both universal and regional. The most significant contributor at the universal level is, of course, the UN with its many declarations and opinions. In that respect, the most important is the Declaration on the Rights of Persons Belonging to National or Ethnic, Religious and Linguistic Minorities, adopted in 1992.Footnote 3 Even though the General Assembly’s declarations are not legally binding, they still have a certain legal authority. According to Article 1 of the Declaration, ‘States shall protect the existence and the national or ethnic, cultural, religious and linguistic identity of minorities within their respective territories.’ Consequently, linguistic identity is recognised to be an integral part of the identity of minorities and must be protected. Nevertheless, in Article 4, which addresses education, the provisions on learning of and instruction in the mother tongue (para. 3) ‘are qualified and ambiguous’ when it comes to the teaching of or in the minority language.Footnote 4

However, perhaps not so surprisingly, there is still no binding treaty at the UN level on the rights of minorities – other than the famous Article 27 of the 1966 International Covenant on Civil and Political Rights (ICCPR)Footnote 5 – nor is there a universally accepted definition of minorities. Article 27 refers to the duty of state parties ‘not to deny’ minorities some fundamental (collective and individual) human rights. The monitoring body under this treaty – the Human Rights Committee – has interpreted this article as also requiring positive measures ‘to ensure that the existence and the exercise of this right are protected against their denial or violation. Positive measures of protection are, therefore, required….’Footnote 6 Positive measures by state parties may also be necessary to protect the identity of a minority and the rights of its members to enjoy and develop their culture and language.Footnote 7 The ICCPR does not deal with the right to education but leaves this to the complementary International Covenant on Economic, Social and Cultural Rights, also adopted by the UN General Assembly in 1966.Footnote 8 This treaty recognises the right of everyone to education. Its Article 13, paragraph 4, reserves ‘the liberty of individuals and bodies to establish and direct educational institutions’, which may be construed (!) as including the right of minorities to run their own schools and teach (in) their own language, especially when read in conjunction with Article 2.2 (the non-discrimination clause).Footnote 9 State parties are obliged to establish ‘minimum educational standards’ to which all educational institutions established in accordance with Article 13 (3) and (4) are required to conform. UNESCO – the specialised UN agency for education, science and culture – is more specific about the right to education in one’s own language. The Convention against Discrimination in Education provides for the ‘rights of members of national minorities’ in Article 5.1.c, including the use or the teaching of their own language.Footnote 10 This right is, nevertheless, subject to certain requirements:

(i) That this right is not exercised in a manner which prevents the members of these minorities from understanding the culture and language of the community as a whole and from participating in its activities, or which prejudices national sovereignty; (ii) That the standard of education is not lower than the general standard laid down or approved by the competent authorities; and (iii) That attendance at such schools is optional.Footnote 11

In this context, it is also necessary to look into the UN Convention on the Rights of the Child (CRC), adopted in 1989 and in force in an extraordinary 196 states.Footnote 12 Article 30 of the CRC, following Articles 28 and 29 on education, echoes the formulation of Article 27 of the ICCPR: children belonging to a minority ‘shall not be denied the right, in community with other members of his or her group, to enjoy his or her own culture, to profess and practise his or her own religion, or to use his or her own language’.

Finally, among other relevant UN instruments, we have to mention the 1966 International Convention on the Elimination of All Forms of Racial Discrimination (CERD).Footnote 13 It refers to ‘the right to education and training’ as among the rights to be protected against discrimination (Art. 5 (e) (v)).Footnote 14

CERD provides for the jurisdiction of the International Court of Justice (ICJ), and it was used by Ukraine to bring a case against the Russian Federation concerning, among other issues, education in the Ukrainian and Crimean Tatar languages in the occupied territory of Crimea.Footnote 15 Ukraine alleged that the measures taken by the Russian Federation in the field of education since the invasion in 2014 had led to a dramatic decline in the accessibility and quality of education in these two languages. The Court started from the premise that:

even if Article 5 (e) (v) of CERD does not include a general right to school education in a minority language, the prohibition of racial discrimination under Article 2, paragraph 1 (a), of CERD and the right to education under Article 5 (e) (v), may, under certain circumstances, set limits to changes in the provision of school education in the language of a national or ethnic minority. (para. 354)

The Court then considered that:

Language is often an essential social bond among the members of an ethnic group. Restrictive measures taken by a State party with respect to the use of language may therefore in certain situations manifest a ‘distinction, exclusion, restriction or preference based on … descent, or national or ethnic origin’ within the meaning of Article 1, paragraph 1, of CERD. (para. 355)

Recognising the right of every state to decide on the primary language of instruction in its public schools, the Court warned against such decisions having a discriminatory adverse effect so ‘as to make it unreasonably difficult for members of a national or ethnic group to ensure that their children, as part of their general right to education, do not suffer from unduly burdensome discontinuities in their primary language of instruction’ (para. 357). Finally, the Court established that there was an 80 per cent decline in the accessibility of education in Ukrainian during the first year after 2014 and a further decline of 50 per cent by the following year.Footnote 16 The Russian Federation was not able to provide convincing reasons for such a decline, so the Court concluded that the relevant articles of CERD had been violated.Footnote 17

In conclusion, it seems that there has been some progress at the UN level in recognising education rights for members of linguistic minorities. However, the majority of the instruments analysed either limit their scope to the non-discrimination obligation or impose certain limitations and requirements on such education.Footnote 18

19.3 Council of Europe: General Human Rights versus Special Treaties

The protection of national minorities and their languages is among the core activities of the Council of Europe as part of the protection and promotion of human rights. Today, the Council of Europe has forty-six Member States after expelling the Russian Federation in March 2022 owing to its aggression against and invasion of Ukraine.Footnote 19 All forty-six Member States are bound by the European Convention for the Protection of Human Rights and Fundamental Freedoms, the Council’s flagship treaty, and the jurisdiction of the European Court of Human Rights (ECtHR).Footnote 20 However, the Convention has a somewhat limited scope with regard to minorities’ rights and freedoms.Footnote 21

Post-Second World War Europe has been a champion in many aspects of the protection and promotion of human rights. A case in point: the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR) was adopted as early as 1950, only two years after the adoption of the Universal Declaration of Human Rights by the UN General Assembly.Footnote 22 Nevertheless, the ECHR, and consequently the case law of the ECtHR, has limited application in the field of the protection of minority rights and education in minority languages.

The right to education is protected by Article 2 of Protocol No. 1 to the Convention. In 1968, the ECtHR ruled in the Belgian Linguistic case that states have no obligation to ensure education in one’s own language.Footnote 23 The right to education implied the right to be educated in the national language (in public schools).

This firm position was seemingly challenged in the interstate case between Cyprus and Turkey before the Grand Chamber in 2001.Footnote 24 The case concerned Greek-language education in the occupied part of Cyprus, where the Turkish Republic of Northern Cyprus (TRNC) was in power. Primary education was available, but if children wanted ‘to pursue a secondary education through the medium of the Greek language’, they were obliged to transfer to schools in the south or to continue their education at a Turkish- or English-language school in the north. The Court admits that:

In the strict sense, accordingly, there is no denial of the right to education, which is the primary obligation devolving on a Contracting Party under the first sentence of Article 2 of Protocol No. 1. Moreover, this provision does not specify the language in which education must be conducted in order that the right to education be respected (see the above-mentioned Belgian linguistic judgment, pp. 30–31, § 3).Footnote 25

However, and here comes the twist:

the option available to Greek-Cypriot parents to continue their children’s education in the north is unrealistic in view of the fact that the children in question have already received their primary education in a Greek-Cypriot school there. The authorities must no doubt be aware that it is the wish of Greek-Cypriot parents that the schooling of their children be completed through the medium of the Greek language… [T]he failure of the “TRNC” authorities to make continuing provision for it at the secondary-school level must be considered in effect to be a denial of the substance of the right at issue. (para. 278)

Finally, the Court concluded ‘that there has been a violation of Article 2 of Protocol No. 1 in respect of Greek Cypriots living in Northern Cyprus in so far as no appropriate secondary-school facilities were available to them’ (para. 280).

More recently, the ECtHR adopted two judgments concerning schools teaching in a minority language (Russian) in Latvia. Both cases concerned legislative amendments from 2018 that increased the proportion of subjects to be taught in the state language (Latvian) in public and private schools, thereby reducing the use of Russian as the language of instruction.

The first case, Valiullina and Others v. Latvia,Footnote 26 concerned the use of Latvian in public schools. The Court returned to the conclusions it had reached in the Belgian Linguistic case – that Article 2 of Protocol No. 1 does not include the linguistic preferences of parents – and explained that that judgment, as well as the one in Cyprus v. Turkey and some others, dealt with education in (one of) the national language(s).Footnote 27 Accordingly, ‘the right enshrined by Article 2 of Protocol No. 1 does not include the right to access education in a particular language; it guarantees the right to education in one of the national languages or, in other words, official languages of the country concerned’ (para. 135).

In Džibuti and Others v. Latvia,Footnote 28 the Court examined complaints by Russian speakers concerning the effect of the same 2018 legislative reform on private schools. Prior to the reform, education in the state language had been mandatory only in public schools, but the authorities argued that private schools form part of the state education system and should therefore respect the need to have a certain proportion of classes in the national language.Footnote 29 The Court would not accept that Article 2 of Protocol No. 1 covers education in minority languages and concluded that the claim had to be rejected as incompatible ratione materiae with the provisions of the Convention.

In both Latvian cases, the Court also dismissed allegations of discrimination and concluded that there was no violation of Article 14 taken together with Article 2 of Protocol No. 1.Footnote 30 Furthermore, the applicants tried to rely on the Framework Convention for the Protection of National Minorities (FCNM).Footnote 31 This convention refers to education in several articles, but the Court ‘emphasized that the principle of instruction in one’s mother tongue, which the applicants referred to, is far from being the rule among the Member States of the Council of Europe’.Footnote 32 Ironically, this conclusion was accompanied by praise for Latvia for ensuring the protection of minority rights in its Constitution.Footnote 33

Almost simultaneously, in October 2023, the Advisory Committee (AC) of the FCNM adopted its fourth monitoring report on Latvia. The AC issued a very stark warning about the second education reform legislation, adopted in 2022 following the 2018 reform, stating that it:

will result in the phasing out of teaching in minority languages in most public and private preschools, schools and universities by 2025. [T]he termination of teaching in Belarusian and Russian will affect about 20% of all children of schooling age. With plans also underway to discontinue the teaching of Russian as a foreign language, the offer will be reduced to extracurricular courses of language and culture. Should all these measures be implemented as planned, Latvia’s system of minority education will no longer comply with the Framework Convention’s provisions regarding equal access to education, the right to set up private minority educational establishments, and the right to being taught the minority language or for receiving instruction in this language.Footnote 34

With these two judgments, the Court further confirmed that the ECHR has very limited application concerning minority rights and that their protection is better ensured through the treaties specifically dedicated to the protection of minorities, namely the FCNM and the European Charter for Regional or Minority Languages (ECRML).Footnote 35 However, these treaties apply only to their state parties. So far, the Court has refused to apply some of their principles as customary law or general principles of international law, which would extend their application beyond state parties.Footnote 36

The ECRML was not invoked in these two cases, since Latvia is not a state party to it. In general, the ECRML is hardly ever discussed by the Court. In one of the more recent cases, the Court simply noted that the ECRML expressly recognises that the protection and encouragement of minority languages should not be to the detriment of official languages and the need to learn them.Footnote 37

Nevertheless, the two treaties of the Council of Europe dealing specifically with national minorities (the FCNM and the ECRML), both in force since 1998, have their own peculiarities. The former has thirty-eight state parties,Footnote 38 the latter only twenty-five. Both have specific monitoring mechanisms that include the Committee of Ministers of the Council of Europe: the Framework Convention’s monitoring body is the AC, while the Charter has the Committee of Experts.Footnote 39 The system is based on recommendations by all three bodies. Both special instruments of the Council of Europe dedicated to the protection of national minorities provide for the need, and indeed the obligation, to offer education in minority languages. The state parties to the Framework Convention ‘undertake to promote equal opportunities for access to education at all levels for persons belonging to national minorities’ (Art. 12.3). The Charter is even more precise, as one of its objectives includes ‘the provision of appropriate forms and means for the teaching and study of regional or minority languages at all appropriate levels’ (Art. 7.1.f). In addition, state parties may choose to provide education in the selected languages at all levels, or for a substantial part thereof. The final option is to provide for the teaching of the relevant regional or minority languages (Art. 8).Footnote 40

19.4 COVID-19 Pandemic and Education

During the COVID-19 pandemic, both monitoring mechanisms issued statements and recommendations to state parties that were also valid for a wider circle of states. The statements identified and focused on specific problems that minorities faced during the pandemic and that the majority population did not. Furthermore, as part of their monitoring processes, both bodies now routinely ask questions about measures taken during the pandemic that affected national minorities and the use of their languages.

In addition, the Committee on Anti-discrimination, Diversity and Inclusion (CDADI), a newly established intergovernmental body within the Council of Europe, carried out an investigation specifically to shed light on this worrying problem.Footnote 41 It seems that education in minority languages was the most affected area.

From the start of the pandemic, the Committee of Experts, as the Charter’s monitoring body, included questions about the measures taken during COVID-19 in its monitoring work. The pandemic had a terrible impact on all aspects of our lives – social, mental, financial, political, and educational – and, of course, on our health. It was often heard that the virus does not discriminate, that we are all in it together. Nevertheless, research shows that this is not necessarily true and that some social groups were more affected than others.

These social groups may be called vulnerable, and they represent different kinds of minority populations – mostly ethnic, racial, national, or sexual minorities. Evidence shows that these groups suffered the consequences of the pandemic disproportionately to their share of the population. The consequences ranged from a lack of access to sanitary information and vaccines to exposure to hate speech and physical violence. The latter especially affected LGBTQI people, but also, at the beginning of the pandemic, people of Asian appearance, who were perceived as responsible for the virus. Roma and Travellers also suffered from hate speech, and a trend known as antigypsyism was exacerbated during this period.

These trends were confirmed, for example, by the European Network of Equality Bodies (EQUINET) and by ombudsmen across Europe, who noted increases in xenophobia and racism against virtually every minority group. Those in positions of leadership – politicians at different levels – were often identified as inciting hate speech and discrimination.Footnote 42

Where such trends are concerned, the usual targets, at least in Europe, are the Roma – a community that was already marginalised and discriminated against before the start of the pandemic. According to a study prepared by the Council of Europe for the CDADI,Footnote 43 in addition to the general measures that affected the population as a whole, the Roma were targeted by some measures that directly discriminated against them. For example, in Bulgaria and Slovakia, militarised quarantines were imposed on Roma settlements on the allegation that their residents would not respect the sanitary restrictions. In Bulgaria, the authorities even used drones with thermal sensors to take the temperature of the residents of these settlements and to monitor their movements.

Restrictions on free movement during the pandemic also affected the Roma in a more indirect way. Their way of life and means of making a living often involve travelling and collecting materials for sale. That was obviously not possible during the lockdowns, yet they were not entitled to financial assistance for losing their jobs because, technically, they did not lose their jobs – they had no jobs to lose. A real Catch-22.

Finally, data from many states show that minorities, including the Roma, suffered more infections and COVID-19-related illnesses; belonging to a minority ethnic group was an additional health risk factor. As far as the Roma were concerned, most of them did not have any health insurance and were not entitled to free health services, so many stayed away from hospitals for as long as they could.

Let us think back to those first few weeks of the pandemic. We were receiving hourly and daily information and instructions on how to behave and what measures to take to stay safe. How much of it was given in minority languages, in real time? According to research carried out by the Committee of Experts for regional or minority languages, of which I am a member, many states failed to provide early information in regional or minority languages. Put that in the context of the lack of sanitary facilities in Roma settlements, add the shortage of sanitary materials such as disinfectants and face masks, together with large families living in small houses, and the result was quite devastating for these communities. As early as March 2020, the Committee of Experts warned about the need to use regional or minority languages in providing information to the population about the sanitary measures required to prevent the spread of the disease.Footnote 44 In those early days, people were confused and scared, and the lack of proper information made it even worse. The Committee of Experts invited states to take language-related issues into account when developing further policies and instructions to address this exceptional medical crisis. In its view, the authorities should not forget that national minorities are an integral part of their societies and that, for the measures adopted to have full effect, they should be made available and easily accessible to the whole population.

It is very common to say – well, they all speak the official language. However, this is not always true. There are parts of societies in some countries where a minority language is so dominant that many people do not master the official language of the state. This is especially the case with the Roma, whose social isolation also results in poor language skills in the official language.

A positive example in this context comes from Croatia, where the Institute for Public Health published sanitary instructions in Romani, along with other minority languages, on its webpage in April 2020.Footnote 45 There are, however, at least two problems with this commendable example: (a) how many Roma had access to the internet? and (b) Romani is not the only language spoken by Roma in Croatia, as the majority speak Boyash Romanian, a completely different language. The Red Cross in one of the counties where the Roma community in Croatia lives did, however, distribute sanitary packages with instructions in Croatian and Boyash.Footnote 46

As far as education is concerned, the first measure across the world at the start of the pandemic was the lockdown, affecting all aspects of public life, including schools at all levels of education. According to UNESCO data, 1.6 billion learners were affected by school closures in more than 190 countries.Footnote 47 Realising the importance of education, public authorities responded to the lockdown and the closure of schools by shifting to online education and/or TV classes.Footnote 48 This was a quick fix for an unprecedented situation.Footnote 49 However, the Committee of Experts recognised the downsides, with respect to regional or minority languages, of online education, which quickly became the dominant model. Many states provided for education in the majority official language only, sometimes neglecting the need of speakers of regional or minority languages to access quality education as well. The Committee of Experts made a public statement in July 2020 highlighting this potentially discriminatory treatment.Footnote 50 In addition, bearing in mind the insufficient availability of teaching materials in regional or minority languages noticed during several monitoring cycles in many states, the Committee of Experts encouraged the public financing of the development of quality open access textbooks in all languages protected under the Charter. Such textbooks, registered under open licences, should be made accessible online for the use of pupils, students, teachers, and the larger public. This had been done for the majority languages, so the same standard was necessary for regional or minority languages as well.

In the view of the AC under the Framework Convention, the suspension of classes in schools and pre-school education during the COVID-19 pandemic regrettably often resulted in unequal access to education and discrimination against children belonging to national minorities, particularly those who were not proficient enough in the official languages to benefit from the educational content provided. As a result, children of national minorities were at risk of learning delays and dropping out.Footnote 51

Although some of these inequalities were justified by the sudden onset of the pandemic and the need to deal with such an unforeseen public health situation, it is still not acceptable to treat members of ethnic, national, or linguistic minorities as second-class citizens. Furthermore, it is also clear that some aspects of COVID-19-style education will remain, and it is therefore important to be aware of the needs of the members of national and linguistic minorities.

Let us pause here for another best-practice example from Croatia. The Ministry of Science and Education recognised that Roma children were unable to follow classes online or on television. The classes were in Croatian, since Croatia did not provide for education in Romani or Boyash. In fact, school authorities are obliged to give Roma children enhanced teaching of Croatian to improve their language skills so that they can follow classes in Croatian.Footnote 52 Living in settlements meant that the vast majority of children had no access to the internet; television sets, if available, were not at the children’s disposal or, even more often, had to be shared by many children in the household. The Ministry therefore provided tablets and internet vouchers for all children who could not otherwise afford them, so that they could follow the online classes. Most of these children were Roma. However, some commentators suggested that there was no real control or assistance for these families, so the vouchers were often misused. Almost 60 per cent of Roma pupils barely participated in online education. Among the reasons given, lack of family support and inadequate conditions at home were predominant, but lack of tablets and no access to the internet were also high on the list.Footnote 53 It is clear that the consequences will be difficult to remedy, since Roma children have a very high drop-out rate from school even at the best of times.Footnote 54

In Slovakia, on 28 April 2020, the Ministry of Education issued the first guidelines on the content and organisation of education for primary school pupils during that period and released material prepared by the State Pedagogical Institute, entitled ‘Content of education at primary schools during the extraordinary interruption of teaching at schools’, which also covered minority languages: German, Hungarian, Romani, Russian, Ruthenian, and Ukrainian. However, the Committee of Experts noted that, as in other state parties, significantly more alternative audiovisual educational materials were available in the official language from official and unofficial sources.Footnote 55

In the UK, the authorities undertook a number of measures that considered the needs of regional or minority languages, in particular where this fell within the remit of regional authorities.Footnote 56 The Welsh Government took measures in education by ensuring that children and parents/carers received support during school closures and the subsequent reopening of schools. The Scottish Government, through Bòrd na Gàidhlig, provided funding for resources to help students with distance learning in Scottish Gaelic while schools were closed, and many such resources were produced. Additionally, the BBC published resources for ‘lockdown learning’ in Irish, Scottish Gaelic, and Welsh.

In Austria, a study was carried out on perceptions of a possible decrease in knowledge of the languages of instruction (Slovene and German) in three minority secondary schools in Carinthia. After the overnight shift to online teaching, one favourable fact was that the majority of pupils came from economically solid backgrounds and were able to follow online lessons. Nevertheless, there seems to have been a decline in knowledge of both languages among monolingual speakers, owing to a lack of contact with the languages at school.Footnote 57

19.5 Conclusions

The right to education in minority languages has recently been tested in proceedings before several international courts, with unsatisfying results for minority language speakers. The arguments accepted by the courts were limited to non-discrimination. However, wider arguments should have been advanced in favour of the protection and promotion of minority languages in education. While not every language can be catered for in the school system, states should be persuaded that they have more extensive positive obligations to preserve and promote the languages spoken by parts of their population.

The COVID-19 pandemic has shown just how easy it is to fall into discriminatory patterns in the treatment of minorities and how easy it is to find excuses for this. Instead, the authorities should always keep in mind their responsibility to manage social trends and correct those with negative potential.

Council of Europe treaties specifically dedicated to the protection of minorities and their languages, in particular the ECRML, seem to be better suited to promoting a different approach to education at least for their state parties. It makes sense, therefore, to promote their ratification even more.

In the words of the former UN Special Rapporteur Mr de Varennes:

Minority language education is more of an asset than it is a liability when understood and applied properly. It is not a right which is applicable in every situation where one individual simply demands it. It is rather a result which is consistent with the values of respect for diversity, tolerance and accommodation – rather than rejection – of differences which have become cornerstones of most democratic societies.Footnote 58

20 Technological Acceleration and the Precarisation of Work: Reflections on Social Justice, the Right to Life, and Environmental Education

20.1 Introduction

This chapter aims to provide the reader with a comprehensive overview of contemporary discussions surrounding labour and human rights in the context of technological acceleration, focusing on Brazil. In times of profound digital transformation, labour relations have been shaped by growing platformisation, precarisation, and inequality, demanding a critical analysis of new forms of exploitation and of the social and political responses to these challenges.

The chapter is structured around a bibliographical review of authors who address the intersection of labour, technology, and human rights, including Rafael Grohmann, Ricardo Antunes, Ludmila Costhek Abílio, Trebor Scholz, Paul Singer, and Renato Dagnino. These scholars provide a solid foundation for understanding how platformisation processes and algorithmic management have redefined labour dynamics, exacerbating social and economic inequalities. In particular, it discusses how the gig economy and the ‘uberisation’ of work have transformed labour relations, increasing worker vulnerability while reducing their protections and rights.

Throughout the chapter, we investigate how the regulation of digital work and the recognition of labour rights, especially on digital platforms, have become topics of debate in Brazil, particularly in a scenario of increasing automation and social exclusion. We analyse initiatives such as Fairwork, which emerged from the fight to promote decent working conditions on digital platforms. Additionally, we highlight the difficulties faced by workers in this digital economy, such as lack of security, income instability, and the opaque control exercised by companies through algorithmic management.

Finally, we bring environmental education to these discussions, showing how it can significantly contribute to addressing the socio-economic and environmental inequalities associated with platform capitalism. Environmental education, especially from the perspective of environmental justice, offers a critical lens that not only addresses ecological issues but also reflects on the social and political conditions that affect the most vulnerable workers. The connection between education, human rights, and technology thus becomes an important avenue for building more just socio-environmental relations.

20.2 Contextualising the Discussion: Digital Transformations and Labour Inequalities in Brazil

In recent years, the rapid advancement of digital technologies has profoundly transformed labour markets worldwide. As Brazil navigates this wave of digitalisation, a critical issue arises: that of the inequalities faced by vulnerable workers in this evolving landscape. Although digitalisation offers remarkable opportunities for efficiency and innovation, it also introduces new forms of disparity and exclusion. Vulnerable workers – often those in low-paying jobs, informal sectors, or with limited access to technology – are particularly susceptible to the adverse effects of this transformation, which directly impacts their well-being and economic stability.

Globally, and especially in Brazil, the intensification of labour exploitation has been used as a measure to revitalise and stabilise capitalist accumulation.Footnote 1 According to Neves, processes of precarisation, outsourcing, and informal labour are essential for the expansion of capitalism. The shift towards an increasingly flexible model of labour organisation is strongly marked by the platformisation of work.Footnote 2 The accelerated advancement of digital technologies and the growing automation of production processes have generated profound transformations in the world of work, while simultaneously accentuating socio-economic inequalities.

In this context, workers face the risk of alienation and exclusion as technological development advances at an ever-increasing pace without adequate social protection mechanisms and adaptation to the new labour realities. According to the Economic Commission for Latin America and the Caribbean, technological progress has been accompanied by a series of socially negative outcomes, such as the exclusion of a significant portion of the population from the benefits of digitalisation, mainly owing to insufficient incomes that limit access to quality connectivity, suitable devices, and reliable domestic connections. Additional problems include the proliferation of fake news, the increase in cyberattacks, growing privacy risks, and the issue of electronic waste.Footnote 3

The COVID-19 pandemic exacerbated these issues, bringing negative impacts on jobs, wages, and efforts to combat poverty and inequality, especially in countries such as Brazil, where structural constraints, such as connectivity problems and social inequalities, further limit the benefits of digital technologies.Footnote 4 While common elements can be identified in the digitalisation process across different countries, it is crucial to recognise the particularities of each location and region. The social dynamics and inequalities that characterise each context are accentuated by digitalisation, which often does not allow collective struggles to take shape or rights to be strengthened.

Rafael Grohmann, in his book Os Laboratórios do Trabalho Digital (The Digital Labor Laboratories), argues that contextualising the geopolitics of platform labour also means understanding the different meanings of work in the economies of each country, distinguishing experiences between the global North and South.Footnote 5 Grohmann and Abílio et al. highlight that while the term ‘gig economy’ emerged in the global North to describe the platform work landscape, this nomenclature does not apply in the same way in Brazil, as the Brazilian economy has always been characterised by a management of survival for the working class, now intensified by the transition to the digital under a liberal rationality.Footnote 6 Thus, platform-mediated subordinate labour is embedded in contemporary dilemmas related to mapping and recognising worker exploitation and its centrality in current forms of capitalist accumulation.Footnote 7

Ricardo Antunes, one of Brazil’s most prominent labour sociologists, notes that before 2020 more than 40 per cent of the Brazilian working class was in informal employment, a situation that worsened further with the COVID-19 pandemic.Footnote 8 According to him, ‘we are living in a new level of real subordination of labour to capital under algorithmic governance, with the working class living between the disastrous and the unpredictable’.Footnote 9 In our view, this scenario reinforces Grohmann’s analysis of the gig economy, in which workers, placed in precarious and unstable conditions, struggle to secure even the bare minimum for their survival.Footnote 10 Constantly pressured by low wages and volatile conditions, these workers cannot get beyond the subsistence barrier: they manage only to cover basic expenses, without the possibility of reaching an income that provides any kind of stability or economic progress.

Despite the over-exploitation of labour being a constant in Brazil and Latin America, it is evident that technological advances are transforming the ways in which the working class faces precarisation and exploitation.Footnote 11 In this context, Fairwork emerges as a relevant initiative both in Brazil and globally.Footnote 12 This project, based at the Oxford Internet Institute and the WZB Berlin Social Science Centre, works closely with workers, platforms, lawyers, and legislators in various parts of the world to think about and develop a fairer future for work. The Brazilian team is co-ordinated by professors Rafael Grohmann, Julice Salvagni, Roseli Figaro, and Rodrigo Carelli. Additionally, we highlight the efforts of researchers such as Ludmila Costhek Abílio, Abílio et al., Grohmann, Rebechi et al., and many others who are fighting for the recognition and defence of the labour rights of workers on digital platforms.Footnote 13

In 2023, the second Fairwork Brazil report was published, continuing to examine how the major digital labour platforms in the country align with Fairwork’s decent work principles amid intense disputes and debates about platform labour regulation.Footnote 14 The document highlights that, after the election of Luiz Inácio Lula da Silva to his third term as president of Brazil, a working group was established to discuss the regulation of platform labour in the country, involving companies, workers, and government representatives. Another important finding concerns how digital platforms lobby to influence legislation and public policies, often employing subtle tactics and data manipulation to distort and contest notions of decent work.

In this regard, the phenomenon of uberisation exemplifies the adverse conditions faced by digital platform workers in Brazil, as described by Abílio and Santiago in the ‘Dossiê das violações dos Direitos Humanos no Trabalho Uberizado’ (‘Dossier on Human Rights Violations in Uberised Work’).Footnote 15 Uberisation, as the dossier defines it, refers to the growing precarisation of labour relations promoted by digital platforms such as Uber and iFood. In Brazil, workers face structural problems, including the lack of basic labour rights, unsafe working conditions, and the absence of formal recognition of their activities as regular employment. Platform-mediated work has given rise, worldwide and in Brazil, to various terms that attempt to describe the specific forms of precarisation associated with this reality.

Besides the previously mentioned terms ‘gig economy’ and ‘uberisation’, we highlight new vocabularies that have been incorporated into research on the world of work. Among them are the concepts of the ‘just-in-time worker’,Footnote 16 ‘crowdwork’, and ‘work on demand’.Footnote 17 These terms reflect the different dimensions of the gig economy, which encompasses both crowdwork (work performed through online platforms) and on-demand work managed by apps (‘work on demand via apps’).

Based on De Stefano’s contributions, we understand that crowdwork involves the performance of tasks through online platforms that connect clients and workers globally, ranging from simple microtasks to more complex jobs.Footnote 18 On-demand work via apps includes traditional activities such as transportation and cleaning, managed by apps that set quality standards. While crowdwork is global in character and on-demand work responds to local contexts, the two share features such as payment and management methods. These terms are more complex than presented here, but the goal is not to exhaust the concepts; rather, it is to highlight how these practices play out in the world of work. Additionally, differences between platforms can have legal implications, for example for the validity of contracts and the applicable legislation.

Therefore, the problems faced by platform workers include income instability, with earnings varying significantly and often failing to cover living costs. Additionally, these workers do not have access to benefits such as health insurance, pensions, or protection against work-related accidents, and they bear the full cost of work tools, such as vehicles and smartphones, exacerbating their financial vulnerability.Footnote 19 Another critical aspect is algorithmic management, which subjects workers to opaque control, with no transparency in the relationship.Footnote 20 Abílio et al. add that algorithmic management is based on automated instructions that process large volumes of data, influencing both the workers’ daily actions and consumer dynamics.Footnote 21 This model of work organisation generates instability and a lack of clarity in the rules.

20.3 Forms of Resisting the Deepening of Platform Capitalism Inequalities

The book Platform Cooperativism: Challenging the Corporate Sharing Economy by Trebor Scholz, translated into Portuguese and annotated by Rafael Zanatta, emphasises that platform capitalism deepens labour precarisation, offering unstable and rights-deprived conditions while concentrating wealth and power in the hands of a few, thereby intensifying economic inequality.Footnote 22 The lack of regulation allows these companies to operate without social responsibility, exploiting workers under the false promise of autonomy and flexibility. Furthermore, this model contributes to the erosion of traditional labour rights, such as paid vacation and health insurance, aggravating deregulation and worker vulnerability – a point already addressed by other authors throughout this publication. However, Scholz’s main contribution, in our view, lies in his proposal for platform cooperativism, which offers a fairer and more democratic alternative to this exploitation.Footnote 23

Scholz’s proposal opposes the logic of the sharing economy, advocating the creation of worker-controlled labour platforms as a more equitable and democratic alternative to the exploitation inherent in current corporate models.Footnote 24 In the same book, Rafael Zanatta discusses the origins of cooperativism in Brazil, which are linked to the early days of the Republic and to the immigration process aimed at replacing slave labour and adapting to urbanisation and changes in productive structures. Cooperativism in Brazil, however, long followed a business logic, only giving way to more solidarity-based proposals during the Lula administration (2003–10), with the creation of the National Secretariat for Solidarity Economy within the Ministry of Labour and Employment.

In this context, it is essential to highlight the importance of the solidarity economy and of social technologies, which have been gaining strength in Brazil since the 1980s and 1990s. These initiatives, associated with the names of Paul Singer and Renato Dagnino respectively, aim to promote solidarity-based and democratic forms of labour organisation, serving as resistance to capitalist methods of exploitation. As previously mentioned, Brazil has faced a legacy of social inequality and exploitation of the working class since its origins. We therefore believe that platform cooperativism shares the same goals as social technology and the solidarity economy.

Social technology emerges as a tool that promotes collaboration, inclusion, and social transformation, designed to meet the specific needs of communities. By characterising technology as ‘social’, we recognise that it is not neutral and that its applications can have varying impacts.Footnote 25 This understanding challenges the traditional view of technology, which often prioritises profit over social and environmental well-being. It is within this context that the movement for social technology arises; according to Dagnino, it constitutes a rejection of conventional technology, seeking alternatives that prioritise the collective and sustainability.Footnote 26

Paul Singer, in turn, argues that the solidarity economy presents itself as an alternative to the neoliberal model, seeking fairer forms of production and trade.Footnote 27 Singer discusses the solidarity economy as a response to the context of inequality, highlighting its capacity to generate income and empower communities.Footnote 28 Singer argues that ‘there is no way to ignore that the solidarity economy is an integral part of the capitalist social formation, in which capital concentration incorporates technical progress and thus determines the conditions of competitiveness in each market’.Footnote 29

Singer adds that the formation of cooperatives or cooperative complexes reveals an organisational strategy aimed at strengthening cooperativism in the face of capitalist dynamics.Footnote 30 In a scenario where capital is concentrated in the hands of a few, technological advances tend to favour this concentration, resulting in growing inequalities. Therefore, while the solidarity economy tries to mitigate the negative effects of capital concentration, it is also influenced by these conditions, revealing the interdependence between the two systems.

In this context, platform cooperativism emerges as an alternative that enhances this logic of collaboration and coordination among cooperatives. Social technology aligns with this perspective of platform cooperativism by proposing technological alternatives to facilitate collaboration between cooperatives and their members, promoting an organisational model that values community participation and autonomy. These platforms not only offer tools for management and commercialisation but also foster the exchange of knowledge and experiences, essential for strengthening cooperativism and the solidarity economy. Alvear et al. argue that:

Among the numerous difficulties faced by cooperatives and solidarity economy enterprises, one of them is access to technology, particularly technologies that are suited to their organizational structures and values. Authors such as Dagnino (2004; 2019) and Varanda and Bocayuva (2009) emphasize how conventional technologies reinforce capitalist values and organizational forms, and thus, Social Technology would be the appropriate technology for solidarity enterprises.Footnote 31

Social technology and the solidarity economy, when integrated into platform cooperativism, can help build more robust networks where cooperatives from different sectors can come together and develop solutions adapted to their local realities. This is especially relevant in contexts of vulnerability, where communities need support to overcome economic and social challenges. All these theoretical and methodological efforts share a common denominator – building foundations to achieve decent work.

20.4 Human Rights and Worker Protection in the Era of Technological Acceleration

The concept of fair and decent work has its roots in the labour struggles of the twentieth century and was formally defined in 1999 by the International Labour Organization (ILO). In the twenty-first century, it remains a central demand within the United Nations (UN) 2030 Agenda, as part of the Sustainable Development Goals (SDGs).Footnote 32 The concept encompasses working conditions that ensure fundamental rights, social protection, and equal opportunities. SDG 8 emphasises decent work and sustainable economic growth, recognising that promoting proper working conditions is essential not only for eradicating poverty but also for fostering prosperity and social well-being.

Achieving this goal involves ensuring labour rights, combating unemployment, promoting job security, and encouraging social dialogue. In an increasingly globalised and constantly changing world, the challenge of ensuring fair and dignified working conditions becomes even more relevant, requiring collective efforts from governments, businesses, and civil society. Amid these changes, human rights emerge as a crucial anchor for safeguarding the dignity and working conditions of millions of people around the world.

The right to work, enshrined in Article 23 of the Universal Declaration of Human Rights (UDHR), constitutes a central principle for ensuring that, even in times of intense technological transformation, everyone has access to dignified employment opportunities. The growing automation of work – especially in industry and the service sector – puts millions of jobs at risk, producing a paradox between technological progress and the increasing precarisation of work.

Hartmut Rosa, in Alienation and Acceleration: Towards a Critical Theory of Late-Modern Temporality, describes ‘time compression’ and ‘technical expansion’ as central features of a world driven by the imperatives of growth and speed. As the economy accelerates, technology not only transforms production dynamics but also redefines social relations and the experience of time and space.Footnote 33 Rosa argues that we are living in a ‘late modernity’, marked by a process of social acceleration in three dimensions: ‘technological acceleration, acceleration of social changes, and acceleration of the pace of life’.Footnote 34

According to Rosa, technological acceleration ‘constantly displaces the spaces of security’ that were once guaranteed by stable jobs and continuous careers, creating new forms of insecurity and alienation.Footnote 35 This acceleration intensifies the pressures on workers, who face both the insecurity of losing their jobs to machines and algorithms and the difficulty of adapting their skills to new contexts.Footnote 36 In this sense, it is crucial to ensure that workers have not only access to jobs but also fair and equitable working conditions, along with opportunities for reskilling and professional development.

The ILO advocates that the protection of fundamental labour rights must include job security and access to decent working conditions. Technological acceleration, while bringing innovation, also exacerbates social inequalities. As Rosa points out, the dynamics of acceleration tend to benefit those already in privileged positions, widening the gap between the rich and the poor.

This issue is reflected in the ‘Life Stories’ section of the Fairwork Report, where we meet workers such as João.Footnote 37 João’s story clearly illustrates that, in scenarios of labour precarisation such as those he faces, the principles of Article 23 of the UDHR, which establishes the right to decent work, are violated. This, in turn, impedes the fulfilment of what is guaranteed by Article 25, which ensures an adequate standard of living. In light of this scenario, human rights such as the right to a safe working environment, fair wages, and protection against unemployment must be reaffirmed in contexts of digitalised labour. The regulation of platforms and the inclusion of informal workers in social security networks are necessary measures to combat exploitation and to ensure that technology is used to promote social well-being rather than to deepen inequalities.

20.5 Environmental Education as a Response to Inequalities and the Defence of Human Rights

Environmental education, according to Reigota, emerges as a response to the need to address the environmental problems generated by the predatory and unsustainable capitalist economic model.Footnote 38 The starting point for the environmental debate was the First World Conference on the Human Environment, held in Stockholm, Sweden, in 1972. This meeting resulted in agreements among the UN signatory countries, emphasising the importance of educating people to solve environmental issues. From this conference onwards, global environmental concern gained prominence, followed by other significant events – the Belgrade Conference (1975), Tbilisi (1977), Moscow (1987), Rio (1992), and Rio+10 (2002), held in Johannesburg – all of which contributed to the implementation of public policies on environmental education at the international level.

The concerns that gave rise to environmental education were primarily conservationist in nature and often resembled a ‘manual of etiquette’,Footnote 39 with proposals that were more behavioural than critical of the capitalist system. Initially, environmental education was a concern of ecological movements; however, as the debate deepened, works by authors such as Layrargues and Carvalho became essential in expanding the field’s discourse.Footnote 40

In this regard, it is worth noting that environmental education is a constantly evolving field, shaped by socio-environmental issues: ‘Refounding the historical, anthropological, philosophical, sociological, ethical, and epistemological foundations of Environmental Education means providing new representations to the signs that these sciences come to symbolise within the horizon of a plurality of knowledge within a unity of meanings.’Footnote 41 We live in a time when crises seem to overlap at a frenetic pace – something we can call a polycrisis, as reported by Pinheiro and Pasquier.Footnote 42 The necessary debate in the field of environmental education is therefore one that seeks to comprehend the conditions surrounding the emergence of epidemics and pandemics, particularly COVID-19, of climate catastrophes, and of wars spreading across different parts of the planet. In all cases, a segment of society is always disproportionately affected by the damages and negative consequences of these processes.

From this perspective, Isabelle Stengers, in her book In Catastrophic Times, discusses the relationship between the economic crisis in the US and the devastation caused by Hurricane Katrina.Footnote 43 The author asserts that economic and climate crises share a common denominator. Similarly, Henri Acselrad argues that the COVID-19 pandemic, which emerged in 2020, cannot be understood in isolation but rather as an intrinsic product of neoliberal capitalism.Footnote 44 The health crisis emerged in a context already marked by an impending financial crisis, resulting in a general collapse of economic activities. For Acselrad, the notions of environmental crisis and disaster must be analysed in light of the processes of capitalist reproduction and crisis.

Carvalho and Ortega support Stengers and Acselrad by pointing to the intertwining of the pandemic, environmental issues, and the climate crisis.Footnote 45 In the same passage, the authors also reflect on the war between Russia and Ukraine. The pandemic, environmental issues, and geopolitical tensions are thus deeply intertwined, revealing an interconnected global system in which crises not only accumulate but mutually intensify.

Based on Abílio et al., Grohmann, and the Fairwork Report, we add that platform capitalism is an emergent factor within these crises, part of a complex global system in which each crisis amplifies the others.Footnote 46 This reality, described as a polycrisis, demands the relational approach already discussed, to which we aim to contribute through the field of environmental education. From this perspective, environmental education must address not only environmental degradation but also the social and political conditions that contribute to these emergencies. It must therefore include a critical analysis – grounded in critical thinking – of epidemics, climate disasters, and conflicts, recognising that their consequences disproportionately affect vulnerable groups.Footnote 47 This is an environmental education for environmental justice.Footnote 48

Environmental education provides a conducive space for strengthening the fight against the socio-economic and environmental inequalities faced by workers, especially in the context of environmental crises and technological acceleration. By expanding its boundaries beyond environmental preservation, environmental education becomes a field of study aimed at promoting educational projects in which individuals are engaged in the fight for life in its entirety.

In this sense, Carlos Frederico Loureiro, in his book Environmental Education: Questions of Life, places life at the centre of the debate, highlighting the urgency of a utopia that allows the overcoming of the limiting situations imposed by an exclusionary, oppressive system that destroys nature, including humans.Footnote 49 Loureiro’s view of environmental education is anchored in a broad understanding of the struggle for life. For him, this struggle is not limited to the environmental field but involves transforming the social structures that perpetuate inequalities and the exploitation of workers. He emphasises the need for hope and the imagination of other possible worlds, resisting the logic of a system that generates human misery and environmental destruction.Footnote 50

This conception resonates with discussions on the centrality of life as a fundamental right within the framework of human rights. Rossane Bigliardi and Ricardo Cruz, in their article, reinforce this view by stating that the right to life, beyond mere biological existence, includes access to the minimum conditions necessary for the constitution of a healthy and dignified subjectivity, allowing human beings to fully and equitably develop their potential.Footnote 51 In this sense, environmental education aligns with human rights education, promoting a praxis oriented towards justice, equality, solidarity, and freedom.

Amorim et al. provide another key reading for environmental education by discussing the need for environmental educators to reflect on the temporal dynamics of contemporary society.Footnote 52 In their article ‘A resonance of time: the contemporary challenges of environmental education’, the authors point out that contemporary time is marked by an acceleration imposed by neoliberal dynamics, which creates challenges for the full development of humanity. According to the authors, environmental education should engage in a resonance of time, recovering the importance of educational practices that consider the multiple and complex temporalities of human existence and life on the planet, and challenging the utilitarian view of time promoted by a society focused on consumption and productivity.

Moreover, they suggest that environmental education needs to reformulate its foundations, taking into account temporal dynamics and how they affect human and environmental relationships. They propose a critical analysis of social temporalities, articulating individual and collective time, and point to the need for new ‘synchronisers’ that enable formative practices more suited to the complexity of life. This critical reflection on time is also directly connected to the inequalities faced by Brazilian workers in a context of technological hyper-acceleration.Footnote 53

New technologies, by accelerating production rhythms, impose increasing demands on workers, deepen labour precarisation, and exacerbate social inequalities. To address this challenge, environmental education must adopt a perspective that problematises the impact of temporal dynamics on labour relations and society as a whole, promoting a critique of neoliberal models that alienate individuals and fragment their temporal experiences.

Articulating environmental education with human rights is not only possible but also necessary, given that both fields share fundamental principles such as human dignity, social justice, and the right to life. Bigliardi and Cruz argue that environmental education, when oriented towards human rights, fosters a civic education that promotes solidarity and cooperation, essential elements for building a more just and sustainable society.Footnote 54 Moreover, this education provides workers with a critical understanding of structural inequalities and prepares them to face the challenges imposed by a system that commodifies life and destroys nature.

By incorporating human rights principles and placing life at the centre of the debate, environmental education becomes a field of reflection capable of building, together with workers, the tools necessary to confront socio-environmental inequalities and to participate in constructing a more just society. As Loureiro, Bigliardi and Cruz, and Amorim et al. argue, this education goes beyond environmental preservation and involves transforming the social structures that perpetuate exploitation and the destruction of life in all its forms.Footnote 55 Environmental education is therefore an education for life and human rights, preparing us for the struggle for a more just and equitable world.

20.6 Conclusion

This chapter has sought to provide a critical analysis of technological transformations and their implications for labour in Brazil, particularly for the most vulnerable workers. Throughout the chapter, we have discussed how platformisation, precarisation, and algorithmic management – topics emphasised by authors such as Rafael Grohmann, Ricardo Antunes, and Ludmila Costhek Abílio – are reconfiguring labour dynamics, exacerbating inequalities, and excluding millions of people from basic conditions of dignity at work. The phenomena of the gig economy and the uberisation of work have emerged as symbols of this new landscape, in which workers face financial instability, a lack of legal protections, and the invisible control of digital platforms.

The chapter has also highlighted the importance of initiatives such as Fairwork, which aim to regulate platform labour and promote fairer working conditions. Regulation and the recognition of digital workers’ rights are essential in a context where technological acceleration has deepened exploitation, necessitating new forms of protection and worker participation in decisions that affect their lives.

However, the discussion is not limited to formal labour rights. By connecting labour issues with environmental education, this text broadens the reflection to encompass the right to life in its entirety, integrating social, economic, and ecological dimensions. The struggle for environmental and social justice is intimately connected to the fight for decent working conditions, as both involve the right to a full and sustainable life. Environmental education, when grounded in principles such as justice and solidarity, proposes a critical perspective that goes beyond environmental preservation, addressing the roots of the inequalities that perpetuate the exploitation of workers and the destruction of the environment.

In addition, one of the central issues discussed throughout the chapter has been the acceleration of time, a theme explored by Hartmut Rosa. Late modernity is characterised by acceleration in technological, social, and life dimensions, transforming not only labour relations but also workers’ experience of time. This acceleration, driven by the neoliberal logic of productivity, fragments temporal experiences and intensifies pressures on individuals, calling for a response that embraces more human and balanced rhythms. In this sense, environmental education can also be seen as a proposal to reclaim a different rhythm of life, one that considers the complexity of natural, social, and individual temporalities, in contrast to the alienation promoted by technological acceleration.Footnote 56

Consequently, this chapter aims to contribute to discussions about technological transformations in the workplace by proposing an integrated approach that links the right to decent work with the right to life and environmental justice. By recognising that the accelerated pace of contemporary life affects not only work but also human and ecological relations, this chapter suggests that any response to the challenges of digitalisation must include a critical analysis of the foundations of environmental education. Only through a broad perspective that comprehends life in its fullness will it be possible to build alternatives that promote a more just, equitable, and sustainable future.

Footnotes

17 The Digital Divide: Reinforcing Vulnerabilities

1 European Parliament resolution of 13 December 2022 on the digital divide: the social differences created by digitalisation (2022/2810(RSP)), para. D. Also see UN Secretary-General, Roadmap for Digital Cooperation (New York: United Nations, 2020), p. 2.

2 Overview of early discussions is provided in P. K. Yu, ‘Bridging the digital divide: equality in the information age’ (2002) 20 Cardozo Arts and Entertainment Law Journal 1, 1–52, at 2.

3 UN General Assembly Resolution 70/125, Outcome document of the high-level meeting of the General Assembly on the overall review of the implementation of the outcomes of the World Summit on the Information Society, 1 February 2016, para. 21.

4 European Parliament resolution of 13 December 2022 on the digital divide: the social differences created by digitalisation (2022/2810(RSP)), para. H.

5 M. N. Cooper, ‘Inequality in the digital society: why the digital divide deserves all the attention it gets’ (2002) 20 Cardozo Arts and Entertainment Law Journal 1, 73–134, at 73–4.

6 The relationship of the digital divide and social inequality is aptly demonstrated in a recent paper: K. Baraka, ‘Digital divide and social inequality’ (2024) 3 International Journal of Humanity and Social Sciences 3, 30–45.

7 UN Secretary-General, Roadmap for Digital Cooperation, 10.

8 European Parliament resolution of 13 December 2022 on the digital divide: the social differences created by digitalisation (2022/2810(RSP)).

9 OECD, Understanding the Digital Divide (Paris: OECD Publications, 2001), p. 5.

10 This is also highlighted in K. Taylor (reviewed by E. Rasure), ‘The digital divide: what it is and what’s being done to close it’, 28 April 2024, Investopedia, www.investopedia.com/the-digital-divide-5116352.

11 See further, e.g., K. Bagchi, ‘Factors contributing to global digital divide: some empirical results’ (2005) 8 Journal of Global Information Technology Management, 3, 47–65, and Yu, ‘Bridging the digital divide’, 16.

12 V. Rajam, A. Bheemeshaw Reddy, and S. Banerjee, ‘Explaining caste-based digital divide in India’ (2021) 65 Telematics and Informatics, 101719.

13 For a more detailed consideration of geographical and linguistic aspects, see, e.g., R. Cullen, ‘Addressing the digital divide’ (2001) 25 Online Information Review 5, 311–20.

14 H. Milner, ‘The digital divide’ (2006) 39 Comparative Political Studies 2, 176–99.

15 ITU, ‘Facts and Figures 2023. The gender digital divide. Digital gender parity is still a distant prospect in regions with low Internet use’, www.itu.int/itu-d/reports/statistics/2023/10/10/ff23-the-gender-digital-divide/.

17 UN Women, ‘Statement: from clicks to progress – equality in digital access advances rights for young women and girls’, 9 August 2024, www.unwomen.org/en/news-stories/statement/2024/08/statement-from-clicks-to-progress-equality-in-digital-access-advances-rights-for-young-women-and-girls.

18 See further, e.g., A. Antonio and D. Tuffley, ‘The gender digital divide in developing countries’ (2014) 6 Future Internet 4, 673–87; M. Hilbert, ‘Digital gender divide or technologically empowered women in developing countries? A typical case of lies, damned lies and statistics’ (2011) 34 Women’s Studies International Forum 34, 479–89; C. Kularski and S. Moller, ‘The digital divide as a continuation of traditional systems of inequality’ (2012) 5151 Sociology, 1–23.

19 For more detail on this, see Antonio and Tuffley, ‘The gender digital divide in developing countries’.

20 See, e.g., P. Banerjee, ‘Gender digital divide – examining the reality’ (2019) 8 International Journal of Innovative Technology and Exploring Engineering 11S, 214–19.

21 E. Abu-Shanab and N. Al-Jamal, ‘Exploring the gender digital divide in Jordan’ (2015) 19 Gender, Technology and Development 1, 91–113.

22 M. Vimalkumar, J. B. Singh, and S. K. Gouda, ‘Contextualising the relationship between gender and computer self-efficacy: an empirical study from India’ (2021) 58 Information and Management 4, Article 103464.

23 UDHR, GA Res. 217A (III), UN Doc. A/810, at 71 (1948) Art. 12. Also, European Convention on Human Rights, Rome, 4 November 1950, Council of Europe, ETS No. 5, Art. 8; International Covenant on Civil and Political Rights (ICCPR), New York, 16 December 1966, 999 UNTS 171, Art. 17; American Convention on Human Rights, San Jose, 22 November 1969, 1144 UNTS 123, Art. 11; Convention on the Rights of the Child, New York, 20 November 1989, 1577 UNTS 3, Art. 16.

24 See, e.g., S. Mas’udah et al., ‘Gender-based cyber violence: forms, impacts, and strategies to protect women victims’ (2024) 26 Journal of International Women’s Studies 4, Article 5; E. L. Backe, P. Lilleston, and J. McCleary-Sills, ‘Networked individuals, gendered violence: a literature review’ (2018) 5 Violence and Gender 3, 135–46; N. Henry and A. Powell, ‘Technology-facilitated sexual violence: a literature review of empirical research’ (2016) 19 Trauma, Violence and Abuse 2, 195–208.

25 For more on this issue in the Latin American context, see L. Camacho Gutiérrez, ‘Addressing the Digital Divide among students at risk of school dropout in Latin America’ (2024), Global Campus Policy Briefs 2024, https://repository.gchumanrights.org/server/api/core/bitstreams/fc5e90b1-f8c6-428f-8d84-81acf7258080/content.

26 For more on the gender digital divide in education, see, e.g., I. C. Peláez-Sánchez, C. E. G. Reyes, and L. D. Glasserman-Morales, ‘Gender digital divide in education 4.0: a systematic literature review of factors and strategies for inclusion’ (2023) 1 Future in Educational Research 2, 129–46.

27 E.g., women occupy only 22 per cent of all tech roles across European companies: S. Blumberg et al., ‘Women in tech: the best bet to solve Europe’s talent shortage’, 24 January 2023, McKinsey Digital, www.mckinsey.com/capabilities/mckinsey-digital/our-insights/women-in-tech-the-best-bet-to-solve-europes-talent-shortage.

28 UDHR, Art 23, para. 1.

29 As recognised, e.g., in ICCPR, Art. 25, and UDHR, Art. 21.

30 S. Singh, ‘Bridging the gender digital divide in developing countries’ (2017) 11 Journal of Children and Media 2, 245–7.

31 The prohibition of non-discrimination is contained in both general international human rights instruments and specialised treaties such as the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW), New York, 18 December 1979, 1249 UNTS 13, which mandates state parties to eliminate gender-based disparities in all areas of life.

32 See further, e.g., M. P. Treuthart, ‘Connectivity: the global gender digital divide and its implications for women’s human rights and equality’ (2019) 23 Gonzaga Journal of International Law 1, 1–53.

33 UN Fourth World Conference on Women, Beijing Declaration and Platform for Action, September 1995, www.un.org/womenwatch/daw/beijing/pdf/BDPfA%20E.pdf, strategic objective J.1. and J.2.

34 UN General Assembly, Sustainable Development Goals and targets, 25 September 2015, GA Res. A/RES/70/1.

35 UN General Assembly, Outcome document of the high-level meeting of the General Assembly on the overall review of the implementation of the outcomes of the World Summit on the Information Society, 1 February 2016, GA Res. A/RES/70/125, para. 27.

36 UN Secretary-General, Roadmap for Digital Cooperation.

37 UN Women, ‘Commission on the Status of Women’, www.unwomen.org/en/how-we-work/commission-on-the-status-of-women.

38 CSW, ‘Innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls. Agreed conclusions’, 20 March 2023, ECOSOC Resolution E/CN.6/2023/L.3.

39 ITU, ‘Gender and telecommunication policy in developing countries’, 1998, www.itu.int/en/ITU-D/Digital-Inclusion/Women-and-Girls/Documents/Resolutions/WTDC%20Valetta%20Res-7.pdf.

40 E.g., ITU, ‘Mainstreaming a gender perspective in ITU and promotion of gender equality and the empowerment of women through telecommunications/information and communication technologies’, 2018, www.itu.int/en/ITU-D/Digital-Inclusion/Documents/Resolutions/RESOLUTION%2070%20(REV.%20DUBAI,%202018).pdf.

41 Broadband Commission for Sustainable Development, ‘Working Group on the gender digital divide. How can we bridge the gender digital divide?’, https://broadband.itu.int/working-groups/digital-gender-divide-2017/.

42 European Commission, ‘Shaping Europe’s digital future. Women in digital’, https://digital-strategy.ec.europa.eu/en/policies/women-digital.

43 European Commission, ‘Shaping Europe’s digital future. EU countries commit to boost participation of women in digital’, 9 April 2019, https://digital-strategy.ec.europa.eu/en/news/eu-countries-commit-boost-participation-women-digital.

45 European Parliament, the Council and the Commission, European Declaration on Digital Rights and Principles for the Digital Decade (2022), https://ec.europa.eu/newsroom/dae/redirection/document/94370, preamble.

46 European Commission, ‘2030 Digital Compass: the European way for the Digital Decade’, 9 March 2021, COM(2021)118 final.

47 African Union, ‘The digital transformation strategy for Africa (2020–2030)’ https://au.int/sites/default/files/documents/38507-doc-dts-english.pdf, p. 16. The acronym STEAM stands for Science, Technology, Engineering, Art, and Mathematics.

48 ECLAC, Buenos Aires Commitment, LC/CRM.15/6/Rev.1, (Santiago: United Nations, 2023).

49 UN Economic and Social Council, Resolution 2021/28. Assessment of the progress made in the implementation of and follow-up to the outcomes of the World Summit on the Information Society, 22 July 2021, UN Doc. E/RES/2021/28.

50 ITU, ‘Measuring digital development: facts and figures 2024’, www.itu.int/itu-d/reports/statistics/facts-figures-2024/.

51 Digital Economy and Society Index (DESI) 2022, ‘Human capital’, https://ec.europa.eu/newsroom/dae/redirection/document/88765

52 FRA, Fundamental Rights of Older Persons: Ensuring Access to Public Services in Digital Societies (Luxembourg: Publications Office of the European Union, 2023), p. 7.

53 For more on the age-based digital divide, see, e.g., B. Mikołajczyk, ‘Universal human rights instruments and digital literacy of older persons’ (2022) 27 The International Journal of Human Rights 3, 403–24; T. N. Friemel, ‘The digital divide has grown old: determinants of a digital divide among seniors’ (2016) 18 New Media and Society 2, 313–31; M. Sourbati, ‘“It could be useful, but not for me at the moment”: older people, internet access and e-public service provision’ (2009) 11 New Media and Society 7, 1083–100; B. Jæger, ‘Trapped in the digital divide? Old people in the information society’ (2004) 17 Science Studies 2, 5–22.

54 See, e.g., I. Mannheim et al., ‘Ageism in the discourse and practice of designing digital technology for older persons: a scoping review’ (2023) 63 The Gerontologist 7, 1188–1200.

55 P. K. Yu, ‘Bridging the Digital Divide: Equality in the Information Age’ (2002) 20 Cardozo Arts and Entertainment Law Journal 1, 1–52, at 15.

56 FRA, Fundamental Rights of Older Persons, p. 6.

57 Charter of Fundamental Rights of the European Union, 26 October 2012, OJ C 326.

58 FRA, Fundamental Rights of Older Persons, p. 8.

59 See further, e.g., M. Sànchez-Valle, ‘The perception of older adults regarding socio-political issues disseminated on social networks’ (2023) 11 Communication for Seniors’ Inclusion in Today’s Society 3, 112–23.

60 R. M. Tappen et al., ‘Digital health information disparities in older adults: a mixed methods study’ (2021) 9 Journal of Racial and Ethnic Health Disparities 1, 82–92.

61 Article 25 of the EU Charter of Fundamental Rights states that EU ‘recognises and respects the rights of the elderly to lead a life of dignity and independence and to participate in social and cultural life – Charter of Fundamental Rights of the European Union, 26 October 2012, OJ C 326.

62 For more on this, see, e.g., A. Seifert, S. R. Cotton, and B. Xie, ‘A double burden of exclusion? Digital and social exclusion of older adults in times of COVID-19’ (2021) 76 The Journals of Gerontology: Series B 3, e99–e103.

63 Y. Wang et al., ‘Digital exclusion and cognitive impairment in older people: findings from five longitudinal studies’ (2024) 24 BMC Geriatrics 406.

64 G. Karaoglu, E. Hargittai, and M. H. Nguyen, ‘Inequality in online job searching in the age of social media’ (2021) 25 Information, Communication and Society 12.

65 UN, Second World Assembly on Ageing, Madrid 8–12 April 2002, Political Declaration and Madrid International Plan of Action on Aging (New York: UN, 2002).

66 E.g., Footnote ibid., para. 40(b).

67 UNDESA, ‘Open-ended Working Group on Ageing for the purpose of strengthening the protection of the human rights of older persons’, https://social.un.org/ageing-working-group/.

68 2022 Rome Ministerial Declaration, ‘A sustainable world for all ages: joining forces for solidarity and equal opportunities throughout life’, https://unece.org/sites/default/files/2022-06/Rome__Ministerial_Declaration.pdf, para. 25.

69 HRC, Resolution 24/20. ‘The human rights of older persons’

70 HRC, ‘Report of the Independent Expert on the enjoyment of human rights by older persons. Robots and rights: the impact of automation on the human rights of older persons’, UN Doc. A/HRC/36/48, 21 July 2021.

71 UN Secretary-General, Roadmap for Digital Cooperation.

72 ITU, ‘Connect 2030 – An agenda to connect all to a better world’, www.itu.int/en/mediacentre/backgrounders/Pages/connect-2030-agenda.aspx, target 2.1.

74 Council of the EU, ‘Human rights, participation and well-being of older persons in the era of digitalisation. Council Conclusions’, EU doc. 11717/2/20 REV 2, 9 October 2020.

75 Web Accessibility Directive: Directive (EU) 2016/2102/EC of the European Parliament and of the Council of 26 October 2016 on the accessibility of the websites and mobile applications of public sector bodies, OJ 2016 L 327. European Accessibility Act: Directive (EU) 2019/882/EC of the European Parliament and of the Council of 17 April 2019 on the accessibility requirements for products and services, OJ 2019 L 151.

76 FRA, Fundamental Rights of Older Persons.

77 See further, e.g., P. Tsatsou, ‘Vulnerable people’s digital inclusion: intersectionality patterns and associated lessons’ (2021) 25 Information, Communication & Society 10, 1475–94.

78 UN Secretary-General, Roadmap for Digital Cooperation (2020), at p. 7.

79 European Parliament resolution of 13 December 2022 on the digital divide: the social differences created by digitalisation (2022/2810(RSP)), para 1.

80 UN Secretary-General, Roadmap for Digital Cooperation (2020), at p. 10.

81 UN Economic and Social Council, Resolution 2021/28. Assessment of the progress made in the implementation of and follow-up to the outcomes of the World Summit on the Information Society, 22 July 2021, UN Doc. E/RES/2021/28.

82 European Parliament resolution of 13 December 2022 on the digital divide: the social differences created by digitalisation (2022/2810(RSP)), para 7.

83 Footnote Ibid., para 15.

84 AGE Platform Europe, ‘Digitalisation and older people: our call to EU policy makers’, 28 June 2024, www.age-platform.eu/content/uploads/2024/07/AGE_Paper-on-Digitalisation-and-Older-People_June-2024_FINAL-1.pdf.

18 How the EU Safeguards Children’s Rights in the Digital Environment An Exploratory Analysis of the EU Digital Services Act and the Artificial Intelligence Act

1 The digital environment is understood as ‘encompassing information and communication technologies (ICTs), including the internet, mobile and associated technologies and devices, as well as digital networks, databases, content and services’, Council of Europe, ‘Recommendation CM/Rec(2018)7 of the Committee of Ministers to Member States on guidelines to respect, protect, and fulfil the rights of the child in the digital environment (2018)’, https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016808b79f7, 12. At the level of the Council of Europe, children enjoy all human rights attributed to individuals under the European Convention on Human Rights. The Steering Committee on the Rights of the Child guides the Council of Europe’s work by advising the Committee of Ministers on appropriate action and proposals concerning the overall priorities regarding children’s rights.

2 UN Committee on the Rights of the Child, ‘General Comment no. 25 on the rights of the child in the digital environment’, CRC/C/GC/25 (2021), https://tbinternet.ohchr.org/_layouts/15/treatybodyexternal/Download.aspx?symbolno=CRC/C/GC/25&Lang=e.

3 European Union Agency for Fundamental Rights, ‘Handbook on European law relating to the rights of the child’ (2015), http://fra.europa.eu/en/publication/2015/handbook-european-law-child-rights, 30.

4 See: that ‘1. Children shall have the right to such protection and care as is necessary for their well-being. They may express their views freely. Such views shall be taken into consideration on matters which concern them in accordance with their age and maturity. 2. In all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration. […]’.

5 European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions EU strategy on the rights of the child’, COM/2021/142 final, (2021), https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A52021DC0142.

6 See also CJEU Case C-490/20, V.M.A. v. Stolichna obshtina, rayon “Pancharevo” [2021], para. 59: ‘Since Article 24 of the Charter, as the Explanations relating to the Charter of Fundamental Rights note, represents the integration into EU law of the principal rights of the child referred to in the Convention on the rights of the child, which has been ratified by all the Member States, it is necessary, when interpreting that article, to take due account of the provisions of that convention.’

7 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

8 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act); Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act).

10 AIA, Recital 28.

11 UNICEF, ‘The state of the world’s children 2017: children in a digital world’ (2017) www.unicef.org/reports/state-worlds-children-2017, 58.

12 For instance, TikTok’s For You feed, see https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you.

13 OECD, ‘Children in the digital environment – revised typology of risks’ (2021), www.oecd-ilibrary.org/docserver/9b8f222e-en.pdf?expires=1637159913&id=id&accname=guest&checksum=5C9D56C713550043EBF2F5EAACD4990D,5.

14 M. Schaake, ‘The European Commission’s Artificial Intelligence Act’ (2021) https://hai.stanford.edu/sites/default/files/2021-06/HAI_Issue-Brief_The-European-Commissions-Artificial-Intelligence-Act.pdf, 2.

17 G. Wells, J. Horwitz, and D. Seetharaman, ‘Facebook knows Instagram is toxic for teen girls, company documents show’, The Wall Street Journal, 14 September 2021.

18 For instance, in the context of smart toys, see E. Fosch-Villaronga et al., ‘Toy story or children story? Putting children and their rights at the forefront of the artificial intelligence revolution’ (2021) 38 AI and Society 1, 133–52.

19 B. Kidron, A. Evans, and J. Afia, ‘Disrupted childhood – the cost of persuasive design’ (2018) 5Rights Foundation, https://5rightsfoundation.com/static/5Rights-Disrupted-Childhood.pdf.

20 Fosch-Villaronga et al., ‘Toy story or children story?’, p. 139.

22 In this regard, UNICEF has also warned that an overfocus on the opportunities of AI-systems for children (e.g., for educational purposes) could lead to an underestimation of the risks and challenges such systems may hold for this group. M. Penagos, S. Kassir, S. Vosloo, ‘National AI strategies and children, Reviewing the landscape and identifying windows of opportunity’ (UNICEF, 2020), www.unicef.org/innocenti/documents/national-ai-strategies-and-children.

23 European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions EU strategy on the rights of the child’ (2021) COM/2021/142 final, https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A52021DC0142.

24 Schaake, ‘The European Commission’s AIA’, p. 2. This effect entails that companies end up complying with EU regulations even in other countries because it is more practical to have one global approach. Hence, this de facto extends EU laws internationally.

25 European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council on a single market for digital services (Digital Services Act) and amending Directive 2000/31/EC’ (2020) COM/2020/825 final, https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM%3A2020%3A825%3AFIN, hereafter: Proposal for a DSA.

26 E. Lievens, ‘Is self-regulation failing children and young people? Assessing the use of alternative regulatory instruments in the area of social networks’, in S. Simpson, M. Puppis, and H. van den Bulck (eds.), European Media Policy for the Twenty-First Century: Assessing the Past, Setting Agendas for the Future (New York, Abingdon: Routledge, 2016), pp. 77–94; Recital 75 Digital Services Act.

27 Emphasis added by the authors.

28 Proposal for a DSA, Recital 54.

29 Emphasis added by the authors.

30 Emphasis added by the authors.

31 5Rights Foundation, ‘The Digital Services Act must deliver for children’ (2021), https://5rightsfoundation.com/in-action/the-digital-services-act-must-deliver-for-children.html.

32 European Parliament Committee on Civil Liberties, Justice and Home Affairs, ‘Amendments no. 126–910 Draft Opinion’ (2021) PE692.898v01-00, www.europarl.europa.eu/doceo/document/LIBE-AM-693830_EN.pdf.

33 European Parliament Committee on Civil Liberties, Justice and Home Affairs, ‘Opinion’ (2020) 2020/0361(COD), www.europarl.europa.eu/doceo/document/LIBE-AD-692898_EN.pdf.

34 Council of the European Union, ‘Proposal for a Regulation of the European Parliament and of the Council on a single market for digital services (Digital Services Act) and amending Directive 2000/31/EC’ – general approach’ (2021), 13203/21, https://data.consilium.europa.eu/doc/document/ST-13203-2021-INIT/en/pdf.

35 European Parliament, ‘Amendments adopted by the European Parliament on 20 January 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC’ (2022) COM(2020)0825 – C9-0418/2020 – 2020/0361(COD), www.europarl.europa.eu/doceo/document/TA-9-2022-0014_EN.html.

36 Article 29 Working Party, ‘Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679’ (2018), https://ec.europa.eu/newsroom/article29/items/612053, 29.

37 European Parliament, ‘Amendments adopted by the European Parliament on 20 January 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC’ (2022) COM(2020)0825 – C9-0418/2020 – 2020/0361(COD), www.europarl.europa.eu/doceo/document/TA-9-2022-0014_EN.html, proposed Recital 52 (‘Online platforms should also not use personal data for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising of minors[; t]he online platform should not be obliged to maintain, acquire or process additional information in order to assess the age of the recipient of the service’) and proposed Article 24.1.b.

38 Footnote Ibid., proposed Article 12.1.c.

39 Footnote Ibid., proposed Article 17.2. and Article 27.1.

40 Footnote Ibid., proposed Article 13.3.

41 E. Lievens (2021) ‘Growing up with digital technologies: how the precautionary principle might contribute to addressing potential serious harm to children’s rights’, 39 Nordic Journal of Human Rights 2, 128–45.

42 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act).

43 Footnote Ibid., Recitals 12, 61, 64, 80, 119 and Article 34.

44 Footnote Ibid., Recital 46 and Article 14.3.

45 In the final text of the DSA, the obligations for VLOPs were also extended to very large online search engines, see Footnote ibid., Recital 41, Article 33.

46 H. Vanwynsberghe et al., ‘Onderzoeksrapport Apestaartjaren: de digitale leefwereld van kinderen en jongeren’ (2022), https://assets.mediawijs.be/2022-05/apestaartjaren_2022_210x210_issuu.pdf.

47 European Commission, Third meeting of the European Board for Digital Services, 29 April 2024, https://digital-strategy.ec.europa.eu/en/news/third-meeting-european-board-digital-services. A public consultation and workshops took place in May- June 2025, with the final version expected soon after. For more information, seehttps://digital-strategy.ec.europa.eu/en/library/commission-seeks-feedbackguidelines-protection-minors-online-under-digital-services-act.

48 S. van der Hof et al., ‘The child’s right to protection against economic exploitation in the digital world’ (2020) 28 The International Journal of Children’s Rights 4, 833–59.

49 Article 29 Working Party, ‘Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679’ (2018), https://ec.europa.eu/newsroom/article29/items/612053.

50 UN Committee on the Rights of the Child, ‘General Comment no. 25 on the rights of the child in the digital environment’ (2021) CRC/C/GC/25, https://tbinternet.ohchr.org/_layouts/15/treatybodyexternal/Download.aspx?symbolno=CRC/C/GC/25&Lang=, para. 42.

51 GDPR, Article 5(1)c.

52 European Commission, ‘DSA: very large online platforms and search engines’ (2023), https://digital-strategy.ec.europa.eu/en/policies/dsa-vlops.

53 Lievens, ‘Growing up with digital technologies’.

54 Wells et al., ‘Facebook knows Instagram is toxic’; S. Livingstone et al., ‘Young people experiencing internet-related mental health difficulties: the benefits and risks of digital skills’ (2022), https://doi.org/10.5281/zenodo.7372552.

55 Lievens, ‘Growing up with digital technologies’.

56 E. Lievens, Protecting Children in the Digital Era: The Use of Alternative Regulatory Instruments. International Studies in Human Rights (Leiden, Boston: Brill/Nijhoff, 2010).

57 S. Mukherjee, K. Pothong, and S. Livingstone, ‘Child rights impact assessment: a tool to realise child rights in the digital environment’ (2021), https://digitalfuturescommission.org.uk/wp-content/uploads/2021/03/CRIA-Report.pdf. Additionally, inspiration could also be drawn from methodologies for Data Protection Impact Assessments: see Article 35 GDPR, Article 29 Working Party, ‘Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679’ (2017), https://ec.europa.eu/newsroom/article29/items/611236. However, such impact assessments should take into account the impact on the full range of children’s rights and not be restricted to the impact on the right to privacy or the right to data protection.

58 E. Lievens, ‘The rights of the child in the digital environment: from empowerment to de-responsibilisation’, in 5Rights Foundation, Freedom, Security and Privacy: The Future of Childhood in the Digital World (2020), pp. 152–7.

59 IEEE, ‘IEEE publishes new standard to address age appropriate design for children’s digital services’ (2021), https://standards.ieee.org/news/ieee-2089/.

60 European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions – a digital decade for children and youth: the new European strategy for a better internet for kids (BIK+)’ (2022) COM/2022/212 final, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=COM:2022:212:FIN.

61 See European Commission, ‘Special group on the EU Code of conduct on age-appropriate design’ (2023), https://digital-strategy.ec.europa.eu/en/policies/group-age-appropriate-design.

62 N. A. Smuha, ‘The EU approach to ethics guidelines for trustworthy artificial intelligence’ (2019) SSRN Scholarly Paper ID 3443537, https://papers.ssrn.com/abstract=3443537.

63 European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on fostering a European approach to artificial intelligence’ (2021), https://digital-strategy.ec.europa.eu/en/library/communication-fostering-european-approach-artificial-intelligence.

64 See, for instance, European Parliament, ‘Resolution of 20 October 2020 with recommendations to the Commission on a framework of ethical aspects of artificial intelligence, robotics and related technologies (2020/2012(INL))’, (2020), www.europarl.europa.eu/doceo/document/TA-9-2020-0275_EN.html; Council of the European Union, ‘Council conclusions on shaping Europe’s digital future’ (2020) https://data.consilium.europa.eu/doc/document/ST-8711-2020-INIT/en/pdf.

65 Smuha, ‘The EU approach’, p. 19.

67 AI HLEG, ‘Ethics guidelines for trustworthy AI’ (2019) https://ec.europa.eu/futurium/en/ai-alliance-consultation, 38. Regarding vulnerable persons and groups, HLEG mentions that ‘there is no commonly accepted or widely agreed legal definition of vulnerable persons, due to their heterogeneity. Vulnerability is considered context-specific, and dependent on temporary life events, economic factors, identity. A vulnerable group is a group of persons who share one or several characteristics of vulnerability.’

69 L. Clarke, ‘Here are the most contentious issues in the AI Act debate’, Tech Monitor, 24 September 2021, https://techmonitor.ai/policy/meps-are-preparing-to-debate-europes-ai-act-these-are-the-most-contentious-issues. The submissions mentioned in the footnotes below are available at https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12527-Artificial-intelligence-ethical-and-legal-requirements/public-consultation_nl.

70 W. Holmes, Submission by Nesta (academic/research institution), reference F530284, 14 June 2020.

71 V. Hilmann, Submission by the informal working group, affiliates and alumni of Berkman Klein Centre for Internet & Society at Harvard University (academic/research institution), reference F514778, 27 April 2020.

72 G. Hasselback, Submission by DataEthics.eu (NGO), reference F530283, 14 June 2020.

73 European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain union legislative acts (2021) COM(2021)’, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52021PC0206, 206 (hereafter ‘proposal for an AIA’); Schaake, ‘The European Commission’s AIA’. The proposal for an AIA is part of a number of proposals that shape the digitalisation in the EU – including the DSA analysed earlier – which should be viewed together. In addition, there are also the Digital Markets Act, the draft Machinery Regulation, and the Data Governance Act.

74 It is based on article 114 TFEU, which allows the EU to regulate those elements of private law which create obstacles to trade in the internal market; R. Manko, European Parliament, and European Parliamentary Research Service, EU Competence in Private Law: The Treaty Framework for a European Private Law and Challenges for Coherence: In-Depth Analysis (European Parliament, 2015), p. 5.

75 Proposal for an AIA, 1.

76 Proposal for an AIA, Preamble, 11.

77 B. Townsend, ‘Decoding the proposed European Union Artificial Intelligence Act’ (2021) 25 American Society of International Law Insights 20, 7.

78 Proposal for an AIA, Explanatory Memorandum, section 5.2.2, 12–13.

79 As cited by M. Veale and F. Zuiderveen Borgesius, ‘Demystifying the draft EU Artificial Intelligence Act’ (2021) 22 Computer Law Review International 4, 97–112.

80 5Rights Foundation, ‘AI systems that put children at risk to be banned under EU’s AI Act’ (2021), https://5rightsfoundation.com/in-action/ai-systems-that-put-children-at-risk-to-be-banned-under-eus-ai-act.html.

81 Such features are particularly appealing to children, as research shows that they reduce children’s autonomy and the likelihood of them self-regulating their media consumption, leading to longer video-viewing times; A. Hiniker et al., ‘Coco’s videos: an empirical investigation of video-player design features and children’s media use’, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Association for Computing Machinery, 2018), https://doi.org/10.1145/3173574.3173828.

82 E. Gómez, V. Charisi, and S. Chaudron, ‘Evaluating recommender systems with and for children: towards a multi-perspective framework’ (2021), Fifteenth ACM Conference on Recommender Systems, https://ceur-ws.org/Vol-2955/paper2.pdf, 6.

83 Scholars argued that such harms may be difficult to prove or may accumulate without a single event exceeding a threshold for severity of harm; Veale and Zuiderveen Borgesius, ‘Demystifying the draft EU AIA’, 5; D. Keats Citron and D. J Solove, ‘Privacy harms’ (2021) SSRN Scholarly Paper ID 3782222, https://papers.ssrn.com/abstract=3782222. Furthermore, consumer protection organisations have emphasised that the prohibition left out important consumer harms that AI systems can facilitate, such as monetary loss or economic discrimination; BEUC (2021) ‘Regulating AI to protect the consumer: position paper on the AI Act’, www.beuc.eu/publications/beuc-x-2021-088_regulating_ai_to_protect_the_consumer.pdf, 11. This is also problematic in light of children’s right to protection from economic exploitation under Article 32 UNCRC and related rights; S. van der Hof et al., ‘The child’s right to protection’, 833.

84 BEUC, ‘Regulating AI to protect the consumer: Position Paper on the AI Act’ (2021), www.beuc.eu/publications/beuc-x-2021-088_regulating_ai_to_protect_the_consumer.pdf., 11; Veale and Zuiderveen Borgesius, ‘Demystifying the draft EU AIA’.

85 European Economic and Social Committee, ‘Working document section for the single market, production and consumption proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative Acts [COM(2021) 206 Final – 2021/106 (COD)]’ (2021) https://allai.nl/wp-content/uploads/2021/09/INT940-%E2%80%93-EESC-2021-02484-00-00-DT-TRA-EN-Draft-Opinion-final.pdf. ‘The EESC is a consultative body that gives representatives of Europe’s socio-occupational interest groups and others a formal platform to express their points of view on EU issues’; for more information see www.eesc.europa.eu/en/about.

86 Proposal for an AIA, Recital 28.

87 UN Committee on the Rights of the Child, General Comment no. 25.

88 With regard to these conformity assessments, it is important to note the role of standardisation organisations in the process. There are currently three European standardisation organisations that can mandate the development of harmonised standards for such assessments, which providers of AI systems can follow rather than having to interpret the essential requirements themselves. By following the standard, providers can then enjoy a presumption of conformity. For a critical analysis of this mechanism, see Veale and Zuiderveen Borgesius, ‘Demystifying the draft EU AIA’, pp. 14–17.

89 Proposal for an AIA, Article 3(2).

90 Article 9 proposal for an AIA specifies that ‘the risk management system shall consist of a continuous iterative process run throughout the entire lifecycle of a high-risk AI system, requiring regular systematic updating’. It shall comprise the identification and analysis of known and foreseeable risks, evaluation of potentially arising risks and adopting suitable risk management measures.

91 Proposal for an AIA, Article 14(2).

92 In contrast, for users of high-risk AI systems, there are no direct obligations in the proposal for an AIA regarding human oversight. Users have discretion to organise their own resources and activities for the purpose of implementing the human oversight measures indicated by the manufacturer (Article 29 (2), Proposal for an AIA).

93 This has been criticised by the EDPB and EDPS. EDPB and EDPS, ‘Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)’ (2021) https://edps.europa.eu/system/files/2021-06/2021-06-18-edpb-edps_joint_opinion_ai_regulation_en.pdf., 9.

94 For a discussion of the potential impact of such an AI chatbot on children and young people, see Childnet, ‘Snapchat’s new AI chatbot and its impact on young people’ (2023), www.childnet.com/blog/snapchats-new-ai-chatbot-and-its-impact-on-young-people/#:~:text=What%20is%20“My%20AI”%20and,information%2C%20and%20formulates%20a%20response.

95 For an overview, see Veale and Zuiderveen Borgesius, ‘Demystifying the draft EU AIA’, pp. 17–20.

96 At the time of writing, the IMCO-LIBE Committees had published their draft report, and a vote was scheduled for the first quarter of 2023.

97 Committee on the Internal Market and Consumer Protection and Committee on Civil Liberties, Justice and Home Affairs, ‘Draft report on the proposal for a regulation of the European Parliament and of the Council on harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative Acts’ (2022) COM(2021)0206 – C9-0146/2021 – 2021/0106(COD) (hereafter ‘IMCO-LIBE Draft Report’).

98 K. van Sparrentak, IMCO shadow rapporteur, during the BEUC seminar ‘The AI Act: machine rule or consumer rules?’; for more information, see www.beuc.eu/ai-act-machine-rule-or-consumer-rules.

99 As part of the risk management system under Article 9 of the proposal for an AIA.

100 IMCO-LIBE Draft Report, 160.

101 For instance, amendment 136 on human oversight requirements; amendment 220, containing a requirement for users to enter data in the EU database; and amendment 227 on reporting duties for serious incidents or malfunctioning of high-risk AI systems. IMCO-LIBE Draft Report, ibid.

102 Council of the European Union, ‘Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts – general approach’ (2022), https://data.consilium.europa.eu/doc/document/ST-14954-2022-INIT/en/pdf, 5 (hereafter ‘Council general approach’).

103 G. Malgieri and V. Tiani, ‘How the EU Council is rewriting the AI Act’ (2021), https://brusselsprivacyhub.eu/publications/how-the-eu-council-is-rewriting-the-ai-act.

104 Article 9(2)(a), Council general approach.

105 European Parliament, Amendments on the proposal for a regulation of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts (2023), www.europarl.europa.eu/doceo/document/TA-9-2023-0236_EN.pdf (hereafter ‘EP compromise text’).

106 L. Bertuzzi, ‘AI regulation filled with thousands of amendments in the European Parliament’, 2 June 2022, www.euractiv.com/section/digital/news/ai-regulation-filled-with-thousands-of-amendments-in-the-european-parliament/. E.g., amendments 862 and 1454, which categorised AI systems that are likely to interact with children as high-risk, and, linked to these, amendments 2710 and 2711, which proposed applying the precautionary principle in this context. There were also amendments clarifying that a ‘child’ means any person below the age of eighteen years (amendments 1114 and 1119).

107 Children’s rights organisations have called for the inclusion of children as a specific group that needs protection under the ban on AI systems exploiting vulnerabilities, as per the EC proposal. 5Rights Foundation, ‘AI Act trilogues: the EU’s last chance to protect children’ (2023), https://5rightsfoundation.com/uploads/Coalition-statement_Childrens-rights-in-AI-Act-Trilogues.pdf?_cchid=f599ca7d73d2d3bf41d7204ecc95dabd.

108 EP compromise text, Article 5, para. 1, point d c.

109 Ibid., Article 69, para. 2; emphasis added.

110 Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act).

111 In relation to this, Recital 29 of the AIA specifies that Article 5 is complementary to the provisions contained in the Unfair Commercial Practices Directive (Directive 2005/29/EC, hereafter UCPD), ‘in particular unfair commercial practices leading to economic or financial harms to consumers are prohibited under all circumstances, irrespective of whether they are put in place through AI systems or otherwise’. The AIA and UCPD will apply alongside each other, and thus the AIA will only have an independent function for practices that are not already covered by the UCPD; C. Bush and A. Fletcher, ‘Harmful online choice architecture’ (2024), Centre on Regulation in Europe, https://cerre.eu/wp-content/uploads/2024/05/CERRE-Final-Report_Harmful-Online-Choice-Architecture.pdf.

112 By analogy, see Bush and Fletcher, ‘Harmful online choice architecture’.

113 Townsend, ‘Decoding the proposed European Union Artificial Intelligence Act’.

114 Enumerated in AIA, Annex III.

115 AIA, Recital 56.

116 A. Kelly, ‘A tale of two algorithms: the appeal and repeal of calculated grades systems in England and Ireland in 2020’ (2021) 47 British Educational Research Journal 3, 725–41.

117 AIA, Recital 56.

118 AIA, Recital 58.

119 Amnesty International, ‘Xenophobic machines – discrimination through unregulated use of algorithms in the Dutch childcare benefits scandal’ (2021), www.amnesty.nl/content/uploads/2021/10/20211014_FINAL_Xenophobic-Machines.pdf?x42580.

120 AIA, Articles 7 and 97.

121 BEUC, ‘Regulating AI to protect the consumer: position paper on the AI Act’ (2021), www.beuc.eu/publications/beuc-x-2021-088_regulating_ai_to_protect_the_consumer.pdf, 17.

122 Such an approach is also advocated by BEUC, not in a child-specific manner but more generally in the context of consumer protection: ibid.

123 Article 50 of the AIA also mentions codes of practice to facilitate the effective implementation of these transparency requirements.

124 Article 85 of the AIA grants ‘any natural or legal person having grounds to consider that the AI Act has been infringed’ a right to lodge a complaint with a market surveillance authority.

125 AIA, Article 86.

126 See AIA, chapter 4 on implementation, cooperation, penalties, and enforcement.

127 European Commission, ‘Commission opens formal proceedings against TikTok under the Digital Services Act’, 19 February 2024, https://digital-strategy.ec.europa.eu/en/news/commission-opens-formal-proceedings-against-tiktok-under-digital-services-act; European Commission, ‘Commission opens formal proceedings against Facebook and Instagram under the Digital Services Act’, 30 April 2024, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2373; European Commission, ‘Commission opens formal proceedings against Meta under the Digital Services Act related to the protection of minors on Facebook and Instagram’, 16 May 2024, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2664.

128 AIA, Recital 48.

129 UN Committee on the Rights of the Child, General Comment no. 14 on the right of the child to have his or her best interests taken as a primary consideration (Art. 3, para. 1), CRC/C/GC/14 (2013) www2.ohchr.org/english/bodies/crc/docs/gc/crc_c_gc_14_eng.pdf; UN Committee on the Rights of the Child, General Comment no. 25, para. 24; Mukherjee, Pothong and Livingstone, ‘Child rights impact assessment’.

130 The European Data Protection Board announced in its 2019/2020 Work Programme that it would adopt guidelines on children’s data. Yet, as of June 2025, these guidelines had not been published.

19 Right to Education in Regional or Minority Languages: Invasions, COVID-19 Pandemic, and Other Developments

1 UN Special Rapporteur on minority issues, Language Rights of Linguistic Minorities – A Practical Guide for Implementation (Geneva: OHCHR, 2017), p. 5.

2 De Varennes claims that at least with respect to private schools ‘there is widespread recognition of this right in legal and political documents, despite some differences in the way it is formulated’. F. de Varennes, The Right to Education and Minority Language (2004), www.researchgate.net/publication/242595391_The_right_to_education_and_minority_language.

3 Declaration on the Rights of Persons Belonging to National or Ethnic, Religious or Linguistic Minorities, GA res. 47/135, annex, 47 U.N. GAOR Supp. (No. 49), 210.

4 P. Thornberry and M.A.M. Estébanez, Minority Rights in Europe (Strasbourg: Council of Europe, 2004), p. 15. The relevant provision reads: ‘States should take appropriate measures so that, wherever possible, persons belonging to minorities may have adequate opportunities to learn their mother tongue or to have instruction in their mother tongue.’

5 International Covenant on Civil and Political Rights, 16 December 1966, 999 UNTS 171. De Varennes and Kuzborska observe: ‘Oddly, UN documents, with the exception of the recent 2017 UN Special Rapporteur on Minority Issues’ Language Rights of Linguistic Minorities: A Practical Guide for Implementation, had never fully explored or explicated this rather fundamental dimension of language rights.’ F. de Varennes and E. Kuzborska, ‘Minority language rights and standards: definitions and applications at the supranational level’, in G. Hogan-Brun and B. O’Rourke (eds.), The Palgrave Handbook of Minority Languages and Communities (London: Palgrave Macmillan, 2019), pp. 21–72, at 40.

6 Human Rights Committee, General comment No. 23(50) (art. 27), CCPR/C/21/Rev.1/Add.5.

8 International Covenant on Economic, Social and Cultural Rights, 16 December 1966, 993 UNTS 3.

9 However, there is no specific mention of the word ‘language’ even in the commentary made by the Committee on Economic, Social and Cultural Rights: General comment No. 13: The right to education (Art. 13), E/C.12/1999/10. The opposite claim is made in Z. Bayat, R. Kircher, and H. Van de Velde, ‘Minority language rights to education in international, regional, and domestic regulations and practices: the case of Frisian in the Netherlands’ (2023) 24 Current Issues in Language Planning 1, 81–101.

10 UNESCO Convention against Discrimination in Education, 14 December 1960, 429 UNTS 93.

11 See also comparison in UNESCO, UNESCO Convention against Discrimination in Education (1960) and Articles 13 and 14 (Right to Education) of the International Covenant on Economic, Social and Cultural Rights: A Comparative Analysis (Paris: UNESCO, 2006).

12 UN Convention on the Rights of the Child, 20 November 1989, G.A. res. 44/25, annex, 44 U.N. GAOR Supp. (No. 49). Only the US has not ratified this convention; see https://indicators.ohchr.org/.

13 International Convention on the Elimination of All Forms of Racial Discrimination, 7 March 1966, 660 UNTS 195.

14 P. Thornberry, ‘Article 12. Education’, in M. Weller (ed.), The Rights of Minorities in Europe: A Commentary on the European Framework Convention for the Protection of National Minorities (Oxford: Oxford University Press, 2005), pp. 397–412.

15 Application of the International Convention for the Suppression of the Financing of Terrorism and of the International Convention on the Elimination of All Forms of Racial Discrimination (Ukraine v. Russian Federation), ICJ Judgment of 31 January 2024, available at: www.icj-cij.org/sites/default/files/case-related/166/166-20240131-jud-01-00-en.pdf.

16 The claim regarding education in the Crimean Tatar language concerned the quality of such education. The Court concluded that such a claim was not covered by CERD.

17 Education in the Ukrainian language was still available, so the ICJ rejected the claims that the Russian Federation had violated the Court’s Order of 19 April 2017, which required the Russian Federation to ensure that education in the Ukrainian language remained ‘available’.

18 De Varennes and Kuzborska observe: ‘The prevailing consensus would seem to indicate that while currently human rights do not require the funding of private minority schools unless there is a situation which might be discriminatory, authorities must not prevent the establishment of such schools’ (De Varennes and Kuzborska, ‘Minority language rights and standards’, p. 44). Similarly, M. Paz, ‘The failed promise of language rights: a critique of the international language rights regime’ (2013) 54 Harvard International Law Journal 1, 157–218.

19 Committee of Ministers Resolution CM/Res(2022)2 on the cessation of the membership of the Russian Federation to the Council of Europe (Adopted by the Committee of Ministers on 16 March 2022 at the 1,428th meeting of the Ministers’ Deputies), available at: https://search.coe.int/cm/Pages/result_details.aspx?ObjectID=0900001680a5da51.

20 Complete information available at the Court’s web page, www.echr.coe.int/.

21 Council of Europe, ‘Selection of judgments of the European Court of Human Rights relevant for the protection of national minorities’, www.coe.int/en/web/minorities/judgments-of-the-european-court-of-human-rights.

22 Universal Declaration of Human Rights, 10 December 1948, GA res. 217 A.

23 Case ‘Relating to certain aspects of the laws on the use of languages in education in Belgium’ v. Belgium, Application no 1474/62 et al., Judgment of 23 July 1968. The case concerned a specific linguistic situation in Belgium, where three languages enjoy official status. According to the national legislation, the language of education shall be Dutch in the Dutch-speaking region, French in the French-speaking region, and German in the German-speaking region. However, in some Dutch-speaking districts there were schools open to French-speaking children, depending solely on their residence. The Court concluded, by a narrow margin of eight votes to seven, that such practice was discriminatory.

24 Cyprus v. Turkey [GC], Application no. 25781/94, Judgment of 10 May 2001.

25 Ibid., para. 277. Cyprus has two national languages in the government-controlled area – Greek and Turkish.

26 Valiullina and Others v. Latvia, Applications nos. 56928/19, 7306/20 and 11937/20, Judgment of 14 September 2023.

27 E.g., in Catan and Others v. the Republic of Moldova and Russia [GC], Applications nos. 43370/04, 8252/05 and 18454/06, Judgment of 19 October 2012.

28 Džibuti and Others v. Latvia, Application no. 225/20 and 2 others, Judgment of 16 November 2023. At the time, the number of private schools in Latvia was insignificant and they mainly operated in Latvian (out of fifty-eight private schools active in 2018/2019, only eleven were teaching in Russian and eighteen were teaching only in Latvian). S. Ganty and D. V. Kochenov, ‘Hijacking human rights to enable punishment by association: Valiullina, Džibuti and outlawing minority schooling in Latvia’, 23 November 2023, Strasbourg Observers, https://strasbourgobservers.com/2023/11/23/hijacking-human-rights-to-enable-punishment-by-association-valiullina-dzibuti-and-outlawing-minority-schooling-in-latvia/.

29 The authorities did not dispute the right to start and run a private school, but submitted that such an educational establishment had to comply with the requirements set by domestic law, which, in the present case, also included requirements as to how much the state language should be used in education (paragraph 86). According to the reform, Latvian was to be used for 50 per cent of instruction in lower grades, rising to 100 per cent in final grades, with the exception of subjects related to the minority language, culture, and identity, and, obviously, foreign languages.

30 Article 14 of the Convention connects allegations of discrimination with other articles of the Convention and relevant protocols. It requires that all of the rights and freedoms set out in the Convention must be protected and applied without discrimination. Discrimination occurs when a person is treated less favourably than another person in a similar situation and this treatment cannot be objectively and reasonably justified.

31 Latvia has been a state party since 2005.

32 The applicants further relied on Article 13 of the FCNM that speaks of the right ‘to set up and to manage their own private educational and training establishments’. However, the Court emphasised that ‘the Framework Convention does not provide for an obligation to finance private schools (see Article 13 § 2 of the Framework Convention)’ and concluded ‘that no positive obligation for the States to subsidise private schools arises from the Convention or its Protocols’ (Džibuti v. Latvia, para. 145).

33 The concept of ‘constitutional identity’ was already accepted in the Grand Chamber judgment in the case of Savickis and Others v. Latvia [GC], Application no. 49270/11, Judgment of 9 June 2022.

34 Advisory Committee on the FCNM, ‘Fourth opinion on Latvia’, 9 October 2023, ACFC/OP/IV(2023)1, https://rm.coe.int/4th-op-latvia-en/1680ae98f6.

35 Council of Europe, National Minorities, ‘The Framework Convention for the Protection of National Minorities’, www.coe.int/en/web/minorities/home; Council of Europe, ‘European Charter for Regional or Minority Languages’, www.coe.int/en/web/european-charter-regional-or-minority-languages/home.

36 In the Chapman case, the Court observed

that there may be said to be an emerging international consensus amongst the Contracting States of the Council of Europe recognising the special needs of minorities and an obligation to protect their security, identity and lifestyle […] However, the Court is not persuaded that the consensus is sufficiently concrete for it to derive any guidance as to the conduct or standards which Contracting States consider desirable in any particular situation. The Framework Convention, for example, sets out general principles and goals but the signatory States were unable to agree on means of implementation. This reinforces the Court’s view that the complexity and sensitivity of the issues involved in policies balancing the interests of the general population, in particular with regard to environmental protection, and the interests of a minority with possibly conflicting requirements renders the Court’s role a strictly supervisory one.

Chapman v. UK [GC], Application no. 27238/95, Judgment of 18 January 2001, para. 94.

37 Ádám and others v. Romania, Applications nos. 81114/17 and 5 others, Judgment of 13 October 2020.

38 The Russian Federation remained a state party after being expelled from the Council of Europe, but denounced it in 2024. The withdrawal took effect on 1 August 2024.

39 For complete information about their results, visit the web pages noted in footnote 35.

40 A description of the ECRML can be found in V. Crnić-Grotić and A. Oszmiańska-Pagett, ‘The European Charter for Regional or Minority Languages: its origins, structure, and process’ (2022), Linguistic Minorities in Europe Online, www.degruyter.com/database/LME/entry/lme.16303552/html.

41 Council of Europe, Steering Committee on Anti-discrimination, Diversity and Inclusion (CDADI), ‘National minorities’, www.coe.int/en/web/committee-antidiscrimination-diversity-inclusion/national-minorities.

42 EQUINET, ‘Recommendation for a fair and equal Europe: rebuilding our societies after Covid-19’ (2020), https://equineteurope.org/wp-content/uploads/2020/06/equinet_rebuilding-recommendation_A4_03-web.pdf.

43 S. C. Marsal, C. Ahlund, and R. Wilson for CDADI, COVID-19: An Analysis of the Anti-Discrimination, Diversity and Inclusion Dimensions in Council of Europe Member States (Strasbourg: Council of Europe, 2020).

44 Committee of Experts of the European Charter for Regional or Minority Languages, ‘Communication in RMLs of utmost importance in global medical crises’ (2020), https://go.coe.int/siybY.

45 Institute for Public Health, ‘Preporuke za pripadnike romske nacionalne manjine [COVID-19]’ (2020), https://stampar.hr/hr/novosti/preporuke-za-pripadnike-romske-nacionalne-manjine-covid-19.

46 Ibid. According to some data, the Boyash Romanian language is spoken by 80 per cent of Roma in Croatia. F. Radonić Mayr, ‘Bajaški i/ili ćhib, pitanje je sad…’ (2021), https://h-alter.org/kultura/tko-je-romski-jezik/.

47 The closures also had long-term effects. Some gains already made towards the goals of the 2030 Education Agenda were lost: UNESCO, ‘UNESCO’s education response to COVID-19’ (2023), www.unesco.org/en/covid-19/education-response/initiatives. In particular, girls were affected even more by the consequences: UNESCO, ‘When schools shut: gendered impacts of COVID-19 school closures’ (2021), https://unesdoc.unesco.org/ark:/48223/pf0000379270.

48 Online teaching is a well-known concept, but in this case it was ‘emergency online teaching’: there was no time or capacity to fully prepare the technical and human resources for such a sudden switch. UNESCO, ‘Distance learning strategies in response to COVID-19 school closures’ (2020), https://unesdoc.unesco.org/ark:/48223/pf0000373305.

49 Soon, however, concerns were raised about the shortcomings of this approach and its negative effects on children’s mental health, in addition to economic inequalities. Z. Blaskó and S. Schnepf, ‘Educational inequalities in Europe and physical school closures during Covid-19’ (2020), European Commission Joint Research Centre, https://knowledge4policy.ec.europa.eu/publication/educational-inequalities-europe-physical-school-closures-during-covid-19_en.

50 Council of Europe, European Charter for Regional or Minority Languages, ‘COMEX statement on RMLs in online education in the context of the COVID-19 pandemic’, 3 July 2020, https://go.coe.int/z7aut.

51 Advisory Committee on the Framework Convention for the Protection of National Minorities, ‘Statement on the COVID-19 pandemic and national minorities’, 28 May 2020, https://rm.coe.int/acfc-statement-covid-19-and-national-minorities-28-05-2020-final-en/16809e8570.

52 Compare the case of Oršuš and Others v. Croatia [GC], Application no. 15766/03, Judgment of 16 March 2010.

53 Reyn Hrvatska, ‘Djeca Romi koja su nastavom na daljinu ostala na obrazovnoj distanci’, 23 April 2020, www.reyn-hrvatska.net/index.php/2020/04/23/djeca-romi-koja-su-nastavom-na-daljinu-ostala-na-obrazovnoj-distanci/.

54 UNICEF Moldova, ‘Roma children – inclusion and reintegration of Roma children into the education system’, www.unicef.org/moldova/en/what-we-do/roma-children.

55 Evaluation on the recommendations for immediate action from the fifth evaluation report on the Slovak Republic, 22 March 2021, https://hudoc.ecrml.coe.int/?i=SK-5th-ERIA-CM-2021-61E-0.

56 Evaluation on the recommendations for immediate action from the fifth evaluation report on the United Kingdom, 22 March 2021, https://hudoc.ecrml.coe.int/?i=UK-5th-ERIA-CM-2021-60E-0.

57 S. Zorčič, ‘Dimensions of remote learning during the Covid-19 pandemic in minority language schools (the case of Austrian Carinthia)’ (2020) 85 Treatises and Documents, Journal of Ethnic Studies, 223–52.

58 De Varennes, The Right to Education and Minority Language.

20 Technological Acceleration and the Precarisation of Work: Reflections on Social Justice, the Right to Life, and Environmental Education

1 D. Neves, ‘A exploração do trabalho no Brasil contemporâneo’ (2022) 25 Revista Katálysis 1, 11–21.

2 R. Grohmann, ‘Plataformização do trabalho: entre dataficação, financeirização e racionalidade neoliberal’ (2020) 22 Revista EPTIC 1, 106–22.

3 Comisión Económica para América Latina y el Caribe (CEPAL), Tecnologías Digitales para un Nuevo Futuro (LC/TS.2021/43) (Santiago: CEPAL, 2021).

5 R. Grohmann (ed.), Os Laboratórios do Trabalho Digital: Entrevistas (São Paulo: Boitempo Editorial, 2021).

6 Ibid.; L. C. Abílio, H. Amorim, and R. Grohmann, ‘Uberização e plataformização do trabalho no Brasil: conceitos, processos e formas’ (2021) 23 Sociologias 57, 26–56.

7 Abílio et al., ‘Uberização e plataformização do trabalho no Brasil’.

8 R. Antunes, ‘Capitalismo de plataforma e desantropomorfização do trabalho’, in R. Grohmann (ed.), Os Laboratórios do Trabalho Digital: Entrevistas (São Paulo: Boitempo Editorial, 2021), pp. 33–8.

9 Ibid., p. 33.

10 Grohmann, Os Laboratórios do Trabalho Digital.

11 Antunes, ‘Capitalismo de plataforma e desantropomorfização do trabalho’.

12 See Centro de Pesquisa em Comunicação e Trabalho (CPCT), ‘Relatório Fairwork Brasil 2023’ (2023), https://comunicacaoetrabalho.eca.usp.br/publicacoes_cpct/relatorio-fairwork-brasil-2023/.

13 L. C. Abílio, ‘Uberização e juventude periférica. Desigualdades, autogerenciamento e novas formas de controle do trabalho’ (2020) 39 Novos Estudos. CEBRAP 3, 579–97; L. C. Abílio, ‘Uberização: a era do trabalhador just-in-time?’ (2020) 34 Revista Estudos Avançados – IEA – USP 98, 111–26; L. C. Abílio, ‘Uberização: manicures, motoboys e a gestão da sobrevivência’, in L. Marques (ed.), Trajetórias da Informalidade no Brasil Contemporâneo (São Paulo: Fundação Perseu Abramo, 2021), pp. 173–91; Abílio et al., ‘Uberização e plataformização do trabalho no Brasil’; Grohmann, ‘Plataformização do trabalho’; Grohmann, Os Laboratórios do Trabalho Digital; and C. N. Rebechi et al., ‘Trabalho decente no contexto das plataformas digitais: uma pesquisa-ação do Projeto Fairwork no Brasil’ (2023) 74 Revista do Serviço Público 2, 370–89.

14 CPCT, ‘Relatório Fairwork Brasil 2023’.

15 L. C. Abílio and S. M. Santiago, Dossiê das Violações dos Direitos Humanos no Trabalho Uberizado: O Caso dos Motofretistas na Cidade de Campinas (Campinas, SP: UNICAMP/Diretoria Executiva de Direitos Humanos, 2024).

16 Abílio, ‘Uberização’.

17 V. De Stefano, ‘The rise of the “just-in-time workforce”: on-demand work, crowdwork and labour protection in the “gig-economy”’ (2016) 37 Comparative Labor Law Journal 3, 471–504.

18 De Stefano, ‘The rise of the “just-in-time workforce”’.

19 Abílio and Santiago, Dossiê das Violações dos Direitos Humanos no Trabalho Uberizado.

20 T. A. C. Moreira, ‘Gestão algorítmica’, in A. S. P. Oliveira and P. Jerónimo (eds.), Liber Amicorum Benedita Mac Crorie. Volume II (Braga: UMinho Editora, 2022), pp. 551–68.

21 Abílio et al., ‘Uberização e plataformização do trabalho no Brasil’.

22 T. Scholz, Cooperativismo de Plataforma (São Paulo: Elefante, 2016).

25 R. Dagnino, Tecnologia Social: Contribuições Conceituais e Metodológicas (Campina Grande, PB: EDUEPB, 2014).

26 R. Dagnino, Tecnociência Solidária: Um Manual Estratégico (Marília, SP: Lutas Anticapital, 2019).

27 P. Singer, ‘Economia solidária: geração de renda e alternativa ao liberalismo’ (1997) 26 Proposta 72, 7–13.

29 P. Singer, ‘Economia solidária versus economia capitalista’ (2001) 16 Sociedade e Estado 1–2, 100–12, at 109.

31 C. A. Alvear, R. Neder, and D. Santini, ‘Economia solidária 2.0: por um cooperativismo de plataforma solidário’ (2023) 9 P2P e Inovação 2, 42–61, at 50.

32 Rebechi et al., ‘Trabalho decente no contexto das plataformas digitais’.

33 H. Rosa, Alienação e Aceleração: Por Uma Teoria Crítica da Temporalidade Tardo-Moderna (São Paulo: Editora Vozes, 2022).

34 Ibid., p. 20.

37 CPCT, ‘Relatório Fairwork Brasil 2023’, p. 26.

38 M. Reigota, O Que é Educação Ambiental (São Paulo: Brasiliense, 2004).

39 J. S. Leite Lopes, A Ambientalização Dos Conflitos Sociais: Participação e Controle Público da Poluição Industrial (Rio de Janeiro: Relume Dumará, 2004).

40 P. P. Layrargues, ‘Para onde vai a educação ambiental? O cenário político ideológico da educação ambiental brasileira e os desafios de uma agenda política crítica contra-hegemônica’ (2012) 7 Revista contemporânea de educação 14, 388–411; I. C. M. Carvalho, ‘A perspectiva das pedras: considerações sobre os novos materialismos e as epistemologias ecológicas’ (2014) 9 Pesquisa em Educação Ambiental 1, 69–79.

41 C. R. S. Machado, H. Calloni, and G. K. Adomilli, ‘Pensares e fazeres sobre e na Educação Ambiental: reflexões sobre/desde os fundamentos ao campo atual Brasileiro’ (2016) 21 Ambiente e Educação 1, 3–25, at 11.

42 S. L. Pinheiro and F. Pasquier, ‘Consciousness and environmental education: transdisciplinary urgencies from the post-pandemic context’ (2023) 14 Transdisciplinary Journal of Engineering & Science.

43 I. Stengers, No Tempo Das Catástrofes: Resistir à Barbárie que Se Aproxima (São Paulo: Cosac Naify, 2015).

44 H. Acselrad, ‘Os desastres e a ambientalidade crítica do capitalismo’ (2021) 45 Ciência & Trópico 2, 89–103.

45 I. C. M. Carvalho and M. A. A. Ortega, ‘Aprendizagens em tempos de fim de um mundo e de abertura de múltiplos mundos. Reflexões desde a educação ambiental’ (2024) 23 Revista Cocar; Stengers, No Tempo Das Catástrofes; Acselrad, ‘Os desastres e a ambientalidade crítica do capitalismo’.

46 Abílio et al., ‘Uberização e plataformização do trabalho no Brasil’; Grohmann, ‘Plataformização do trabalho’; CPCT, ‘Relatório Fairwork Brasil 2023’.

47 M. Sawada and F. Pasquier, ‘Discourses of Japanese history textbooks: from Doxa to critical thinking’, in M. Sawada and F. Andres (eds.), Impacts of Museums on Global Communication (Hershey, PA: IGI Global, 2024), pp. 45–86.

48 C. F. Santos, L. D. Gonçalves, and C. R. Da Silva Machado, ‘Educação ambiental para justiça ambiental: dando mais uns passos’ (2015) 32 REMEA-Revista Eletrônica do Mestrado em Educação Ambiental 1, 189–208.

49 C. F. Loureiro, Educação Ambiental: Questões de Vida (São Paulo: Cortez, 2019).

51 R. V. Bigliardi and R. G. Cruz, ‘As (im)possibilidades de uma sociedade sustentável e o inextricável embricamento entre educação ambiental e direitos humanos’ (2013) 23 Revista Eletrônica do Mestrado em Educação Ambiental 1, 22–35.

52 F. V. Amorim, S. L. Pinheiro, and H. Calloni, ‘Uma ressonância do tempo: os desafios contemporâneos da educação ambiental’ (2019) 14 Revista Pesquisa em Educação Ambiental 1, 48–57.

54 Bigliardi and Cruz, ‘As (im)possibilidades de uma sociedade sustentável’.

55 Loureiro, Educação Ambiental; Bigliardi and Cruz, ‘As (im)possibilidades de uma sociedade sustentável’; Amorim et al., ‘Uma ressonância do tempo’.

56 F. Pasquier, D. Galeffi, and J. Collado-Runao, ‘Educação transdisciplinar para um mundo complexo’, in B. Letellier et al., IIIe Congrès Mondial de la Transdisciplinarité – Adopter un Langage Transdisciplinaire Commun Face à la Complexité du Monde (Paris: Rencontres Transdisciplinaires-PlasticitéS, 2024), Vol. IV, pp. 53–98.
