
Gender, work, and the anthropomorphisation of technology

Published online by Cambridge University Press:  06 November 2025

Ashwin Varghese*
Affiliation:
Centre of Governance and Human Rights, University of Cambridge , Cambridge, UK
Aishwarya Rajeev
Affiliation:
Liverpool John Moores University, Liverpool, UK
Corresponding author: Ashwin Varghese; Email: av690@cam.ac.uk

Abstract

The incorporation of technology, and more recently AI (Artificial Intelligence), into our everyday lives has been progressing at an unprecedented pace. Siri, Alexa, Cortana, and various other digital assistants and chatbots populate our everyday interactions for most service-related matters. Acknowledging that technology, work, and social relations are deeply entangled with each other, this paper combines a literature review of the anthropomorphisation of AI and emerging technology, focused on gender and work, with empirical examples drawn from real-world applications and chatbots in the service industry in India, to critically analyse the gendering of technology. We unpack the tendency to ascribe a feminine identity to assistive technology and argue that the gendering of emergent assistive technology is performative and relational. It materialises through particularistic manifestations drawing from the sociocultural context. Furthermore, this gendering of technology is co-constituted by the sexual division of labour and gendered norms of work.

Information

Type
Contested Terrains
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The University of New South Wales

Introduction

Gendered robots, virtual voice assistants, and chatbots are commonplace today. Siri, Alexa, and Cortana, with their default feminine identities, have become common interfaces in personal devices like computers, mobile phones, and home devices. Various chatbots in the digital sphere, from healthcare and hospitality to finance and telecom, are also increasingly being attributed feminine characteristics. Virtual assistants are proliferating across digital interactions in the domestic and public spheres, and these interactions are increasingly mediated through programmed gendered personas. Janet from the sitcom The Good Place and Samantha from the movie Her appear to be prototypes of ‘good’ voice assistants – subservient, docile, cheerful, and patient. Wajcman (2000) argues that research has largely focussed on how technology shapes gender relations, even though it is well understood that gender relations and technology are mutually constitutive. We turn the gaze to the other facet of this relationship in our exploration and note how the proliferation of assistive technologies is innately shaped by underlying gender relations.

This paper offers a review of the literature on the anthropomorphisation of AI and emerging technology, with a focus on gender and work, to ascertain how it conceptualises gender and assistive technology. We place this analysis in the empirical context of chatbots used by websites, through a case study of the service sector in India. Commercial and service-oriented websites across sectors such as railways, banking, finance, jewellery, and hospitality in India are overwhelmingly adopting chatbots on their homepages. While each website is designed to be unique, a common feature on the bottom-right corner of the homepage is usually either a virtual assistant characterised as feminine – greeting us with folded hands and a smile, offering her services of assistance in navigating the website – or a gender-neutral bot.

We unpack the tendency to ascribe a feminine identity to assistive technology and argue that the gendering of emergent assistive technology is performative and relational. It materialises through particularistic manifestations drawing from the sociocultural context. Furthermore, this gendering of technology is co-constituted by the sexual division of labour and gendered norms of work.

Anthropomorphisation and the gendering of assistive technology

Feminist researchers working in the interstices between science and technology studies on the one hand, and the theorising of gender, work, and organisations on the other, have, since the early stages of computerisation, explored the power relationships embedded in emerging and changing patterns of embodied work. Acker (1990) focused on the erasure of women in the apparently disembodied office job. Haraway (1990) made ironic use of the image of the cyborg – the hybrid human/animal/machine body – to challenge gender essentialism, including that of radical feminism. Wajcman (2000, 2010) has critiqued the embodied impacts of technology design and explored how processes of technological change both shape and are shaped by gender power relations. In analysing causalities and relationships between gender relations and AI, Wajcman and Young (2023) argue that its gender impacts will depend not merely on recruiting marginalised women into IT careers but on achieving an upheaval in the culture and politics of tech companies. Thus, the interstices of science and technology studies and feminist research have been spaces for exploring the relationship between technology and gender relations. Many forms of technology have been found to perpetuate stereotypical gender identities. For instance, Feine et al (2020), through a review of 1375 chatbots, note that there is a clear and demonstrable gender bias in their design, with a majority being gendered female.
In the context of anthropomorphisation and embodiment, Sutko (2020) points out that the gendering of AI technology marks a key moment in the technological embodiment of gender relations (Wajcman 2010), in line with an understanding that ‘if technologies are inscribed with gender relations in their design, then the culture of computing is predominantly the culture of the white American male’ (Wajcman 2000, 459).

In addition, anthropomorphisation is not merely a reflection of social norms, but often a projection of fantasy, where gendered technology is programmed to function in a subservient or authoritative manner depending on the task assigned. In this vein, Manasi et al (2023) note that the humanisation of technology entails a simultaneous dehumanisation and objectification of women. Nass et al (1997) found that even with all visible gender cues removed, users gender-stereotype machines merely on the basis of gendered verbal cues, arguing that any suggestion of gender triggers such behaviour on the part of users. This highlights the role that social relations play in human-machine interactions.

Where, then, does the need to anthropomorphise and gender non-human actors emerge from? The literature notes that anthropomorphising technology makes it more acceptable for commercial use and enhances the user experience. Gibbons et al (2023) show that two key features drive this behaviour: the notion that AI will function better (functionality) and that the experience will be more pleasant (connection). However, in this process of anthropomorphisation, one must question why the emphasis on functionality and connection lends itself to a projection of AI as feminine.

The demands of capitalism, with its profit incentive, are overdetermined by dominant social modes of production and the sexual division of labour. It is the confluence of these that attributes characteristics like authority to masculinity and assistance to femininity. We argue that the social systems underlying these associations determine the gendering of technology as well. These social systems are also resistant to the changes that a more gender-representative workforce may be able to bring. The gendering of assistive technology follows dominant patriarchal norms, but is also reflective of the conditions of capitalism. It is amply demonstrated that assistive AI technologies, when performing reproductive work – like domestic activities through Alexa – are predominantly gendered female.

Scholars argue that while domestic AI renders such gendered reproductive labour visible, its attribution to a friendly robot is not necessarily emancipatory (Schiller and McMahon 2019). Drawing from Schiller and McMahon (2019), we note that the gendering of assistive technology mixes labour with affect. The gendering of assistive technology beyond the domestic sphere, such as in hospitality, health, and administration, demonstrates Hochschild’s (1983) observation that emotional labour in the public sphere embodies the attempts by circuits of capitalism to assign the warmth, comfort, hospitality, and intimacy expected of women in the home to customers in the public sphere (Schiller and McMahon 2019). Crucially, the subservient and cheerful demeanour attributed to assistive technology, even in the face of abuse and violence (UNESCO and EQUALS Skills Coalition 2019), reinforces fantasies of domination and subordination in a gendered configuration of assistive technology.

Janet from The Good Place is almost the prototype for assistive technology: responding only when spoken to, docile, subservient, cheerful, patient, and feminine. As Schiller and McMahon (2019) note, interestingly, she is not a domestic AI but rather a ‘social server’, shouldering the burden of managing everything. We see a similar tendency emerging in the gendering of assistive technology across the domestic and public spheres, wherein managerial and assistive tasks are visualised as feminine, deemed mechanical, and perpetuated in the fantasy of docility.

In an analysis of the movies Eve and Her, both of which feature female AI characters, Sutko (2020) shows how the embodiment of technologies and their naturalisation is built on their being assigned a marginalised gender (female) and ‘the cultural baggage of domesticity, docility, compliance, caring, nurturing, and sexual availability attached to that subjectivity’ (581). Evolving from the menacing HAL of 2001: A Space Odyssey to the kind, docile, and patient Samantha of Her, the cultural imagination of assistive technology has also transformed to make it more palatable to everyday use. Fryxell (2020) highlights that the feminine gendering of AI has historically assuaged fears about technological encroachment on our everyday lives while simultaneously idealising forms of femininity that reinstate women’s essentialised social and sexual roles. Woods (2018) argues that the feminisation of AI virtual assistants plays an important role in making the proliferation of surveillance technology palatable to the masses, as Alexa, Siri, Cortana, and Google Assistant simultaneously collect massive amounts of personal data. Woods (2018) further notes that the feminised programmed persona of domestic AI virtual assistants is used to entice users to dispense their personal data with intimate details, thereby, in a double movement, making surveillance palatable and reinforcing patriarchal gender norms. This further underscores the mutually constitutive relationship between gender relations and assistive technology.

Assistive technology and gendered work relations

Since assistive technology is necessarily interactive, we may draw parallels to care work and emotional labour, which are also inherently relational as well as gendered. A key element overdetermining such forms of work is the sexual division of labour, which has transformed in many ways over the years. According to Hartmann (1976), occupational segregation allowed the subordination of women in a capitalist system, enabling men to wield power through their ‘superior’ status in both the labour market and the household. Women’s subordination and exploitation, across various forms of paid, underpaid, and unpaid work, have been carried out in myriad ways, including occupational segregation and gender wage gaps, all rooted in the sexual division of labour (Ghosh 2012; Hartmann 1976; Benería 1979). Women were seen as secondary earners, and this socialisation into the sex-role theory began from the site of the family (Benería 1979). In today’s reality, we see that unpaid work is primarily considered to be the woman’s responsibility, whether or not she engages in paid work outside the household (Folbre 2021). In fact, as Bittman et al (2003) have shown, in situations when women earn more than their male partners, they may take on a larger share of household work, while men may take on less, owing to cultural and traditional norms of what is ‘masculine’ and ‘feminine’. Employment and unpaid work in the household have had a constant push-and-pull relationship for women: even though employment holds out the possibility of reducing the burden of unpaid work, it is not able to free women from the stranglehold of the sexual division of labour, resulting in the double/triple burden of work (Folbre 2021; Rajeev and Sinha 2023).

As Federici (2004) rightly states, the sexual division of labour is not merely a difference in the nature of work that men and women are meant to do; it is a marked difference in their lives, experiences, and thoughts as well; it is a power relation. Moreover, public policy and development discourses and state interventions have, across the world, propagated or retained the sex-role theory and the sexual division of labour, further cementing these social relations (Kabeer 1994; Elson 1991). Stereotypically, the public sphere is still considered to be the male sphere, while the female domain is considered to be the household. Associated with this is the notion that women are more adept at care work and activities that involve emotions; that women have a ‘comparative advantage’ in activities involving certain levels of affect, emotion, empathy, and love. In fact, embedded within care work is this emotional labour. As Folbre and Nelson (2000) point out, ‘the word “care” has a dual meaning, on the one hand referring to caring activities, like changing diapers or providing a listening ear, and on the other hand to caring feelings, like those of concern or affection on the part of a caregiver’.

The literature has shown that labour related to affective care and emotional support has largely been undertaken by women, and it has been argued that such activities also contribute to the ability of the recipient to participate in the production process, thereby forming a set of processes that can be called ‘emotional reproduction’ (Gotby 2019). While the term emotional labour originally referred to how people manage their emotions to fit into a workplace (Hochschild 1983), it has subsequently come to underscore the gendered connotations of this kind of labour, which is predominantly done by women. It has been argued that emotional labour is stereotypically associated with women because they are seen to be more emotional, expressive, and capable of care and empathy (Taylor and Tyler 2000). Women have therefore also been expected to take up tasks related to affect and emotions better, designating them as better suited for caregiving work as well. Butler (1990) has argued that this forms the performative aspect of gender roles, wherein women (and men) repeatedly perform acts which are expected of their respective genders, buttressed by cultural conditioning. As Brennan (1992) notes, social reality imposes a feminine identity which conditions an external image towards it. This social reality, we note, is underpinned by the sexual division of labour. Drawing from this understanding of gendered work, we argue that the gendered identification of emotional labour, or the sexual division of emotional labour, manifests in the imagination of assistive technology as female.

Particularistic manifestations

Having outlined the emerging tendency of anthropomorphisation and gendering of assistive technology, and unpacked the links between assistive technology and the sexual division of labour, we now draw from empirical examples to explore particularistic manifestations of assistive technology. Chatbots in service industries such as the health, hospitality, banking, and telecom sectors are frequently anthropomorphised into culturally specific forms. These manifestations function in interesting combinations. For our analysis, we reviewed 65 service sector websites in India, spanning banking, finance and insurance, airlines and railways, hospitals and healthcare, hotels, telecom, jewellery, and automobiles, to assess the anthropomorphic qualities and gendered personas of the chatbots (see Table 1). Out of the 65 websites, 43 had chatbots, and these were selected for analysis. Many companies retain the identity of the chatbot as either a bot or merely the brand logo. But the dominant trend, wherever gendering was taking place, was to anthropomorphise and gender the persona as female, often adorned in a saree, with hair neatly tied up into a bun, greeting users with folded hands and a smile. Of the 43 chatbots identified, 18 were designed as female, 23 as gender-neutral, and only 2 as male. With the exception of the banking, finance and insurance sector, no other sector had a masculine chatbot. Even in that sector, only 2 of the 15 chatbots reviewed were male, with 8 being feminine and 5 being gender-neutral bots. Thus, the gendering of assistive technology in the service industry in India either takes on specific cultural manifestations or evades anthropomorphisation completely.
We note that such gendering of emergent technology with cultural manifestations both represents and reinforces cultural norms of femininity, while simultaneously projecting women into assistive roles in the logic of capitalism, embodying the sexual division of labour. In short, the assistive role is imagined and projected as either mechanical or feminine, but rarely masculine.

Table 1. The gendered nature of assistive technology

Such culture-specific manifestations, in particularistic contexts, align with global observations of a gendered configuration of assistive technology. A UNESCO and IRCAI (2024) report noted a strong bias in large language models (LLMs), which associated gendered names with traditional career and family roles: female names were associated with ‘home’, ‘family’, ‘children’, and ‘marriage’, while male names were associated with ‘business’, ‘executive’, ‘salary’, and ‘career’. The results indicated a partiality towards stereotypical gender roles, where the model was significantly more likely to link gendered names with traditional roles, underlining a deep-seated bias in how LLMs represent gender in relation to careers. One may note that these biases are overdetermined by social norms, which feature heavily in the anthropomorphisation of assistive technology. Studies have found that AI-generated images and image search results for professionals in high-paying careers like law, medicine, and programming, versus those for nurses and teachers, reflect a proportionately higher number of male and female depictions respectively, indicating the permeation of norms built on the sex-segregation of occupations (Gorska and Jemielniak 2023; Feng and Shah 2022).

Companies like Apple and Google suggest that gendering derives from customer preference, citing academic literature indicating this preference. In these arguments, the logic is profit-oriented, obfuscating the inherent gender dynamics of commercialising female attributes for the profit motive. Research into questions of preference has shown that people prefer a low-pitched masculine voice for assistive technology when it is making ‘authoritative’ statements and a female voice when it is being ‘helpful’ (UNESCO and EQUALS Skills Coalition 2019). People’s preference for a particular gender was linked to a desire to use assistive technology in a manner where they could be ‘bosses of it’, wanting the device to be ‘supportive, helpful and humble’. This shows that the preference for gender in the context of assistive technology is relational and has more to do with the nature of assistive work. That this is an intentional marketing strategy hints at the sexual division of labour overdetermining the commercial practices of anthropomorphising emerging technology; it is not determined by ‘consumer preferences’ alone. Nor is the gendering of technology as feminine universally accepted: for instance, the GPS navigation system with a female voice in the 1990s BMW 5 Series was recalled because drivers complained about receiving authoritative instructions from a woman. On the other hand, IBM’s Watson computer that won Jeopardy! was unambiguously male (UNESCO and EQUALS Skills Coalition 2019). It has been argued that these biases emerge because the narrow subset of people who design these systems is overwhelmingly male. While this is true, we argue that it is imperative to locate these forms of technology within the larger context of the sexual division of labour and gendered work relations.

It has also been argued that constantly representing digital assistants as female by default ‘hard-codes’ a connection between women and subservience (UNESCO and EQUALS Skills Coalition 2019). However, as we have shown, this hard-coding is much more than an unintended consequence. Non-human voice assistants like Siri and Alexa have become the most ‘recognised’ women globally (UNESCO and EQUALS Skills Coalition 2019), and they are in their own ways socialising gendered behaviours. Digital assistants may then be seen as simultaneous projections of patriarchy and capitalism because not only do they reproduce patriarchal gender relations by their sheer scope and scale, but they also reinforce patriarchal gendered norms of work and labour.

Conclusion

We can see a clear synergy in the gendering of virtual assistants and emerging technology, which is increasingly characterised by anthropomorphisation. Gender is performed through particularistic, interactive social relations. These social relations overdetermine the gendering of technology, where the gender of digital assistants, predominantly female, emerges not only from the sound or female-appearing attributes, but from the mode of conversation and, more importantly, the tasks assigned to these feminised technologies, i.e., assistive tasks stereotypically ‘suitable’ for women. To counter the gender bias in assistive technologies, a gender-representative workforce designing these technologies – as the UNESCO and EQUALS Skills Coalition (2019) report recommends – is a necessary starting point. But that alone may not efface the biased and problematic gendering of technology, which draws from the sexual division of labour.

In this article, we reviewed scholarly literature which highlights that anthropomorphisation has made technology acceptable for commercial use. The gendering of this technology is a deliberate process of anthropomorphisation. Drawing from the literature on the gendering of assistive technology as well as on gendered labour and work, we showed that deliberate gendering for commercial interests intends to make emergent technology palatable and enticing, while simultaneously reflecting and reinforcing gendered patriarchal norms of work. This gendering manifests differently in particularistic contexts and may take various shapes, as we demonstrate through the example of India, where chatbots are ‘Indianised’ in their attire and modes of greeting to reflect more indigenous performances of assistance. By linking these particularistic manifestations with global observations, we unpacked how the gendering of technology is co-constituted by the sexual division of labour and gendered norms of work. Utilising the framework of the sexual division of labour and foregrounding the context of gendered work relations may, therefore, allow us to fully comprehend the particularistic manifestations of gendered assistive technology.

References

Acker, J (1990) Hierarchies, jobs, bodies: A theory of gendered organizations. Gender &amp; Society 4(2), 139–158. https://doi.org/10.1177/089124390004002002
Benería, L (1979) Reproduction, production and the sexual division of labour. Cambridge Journal of Economics 3(3), 203–225.
Bittman, M, England, P, Sayer, L, Folbre, N and Matheson, G (2003) When does gender trump money? Bargaining and time in household work. American Journal of Sociology 109(1), 186–214. https://doi.org/10.1086/378341
Brennan, T (1992) The Interpretation of the Flesh: Freud and Femininity. London and New York: Routledge.
Butler, J (1990) Gender Trouble: Feminism and the Subversion of Identity. New York and London: Routledge.
Elson, D (1991) Male bias in the development process: An overview. In Elson, D (ed), Male Bias in the Development Process. Manchester: Manchester University Press, 1–15.
Federici, S (2004) Caliban and the Witch: Women, the Body and Primitive Accumulation. Brooklyn, NY: Autonomedia.
Feine, J, Gnewuch, U, Morana, S and Maedche, A (2020) Gender bias in chatbot design. In Følstad, A, Araujo, T, Papadopoulos, S, Law, EL, Granmo, OC, Luger, E and Brandtzaeg, PB (eds), Chatbot Research and Design. CONVERSATIONS 2019. Lecture Notes in Computer Science (Vol. 11970). Cham: Springer. https://doi.org/10.1007/978-3-030-39540-7_6
Feng, Y and Shah, C (2022) Has CEO gender bias really been fixed? Adversarial attacking and improving gender fairness in image search. Proceedings of the AAAI Conference on Artificial Intelligence 36(11), 11882–11890. https://doi.org/10.1609/aaai.v36i11.21445
Folbre, N (2021) The Rise and Decline of Patriarchal Systems: An Intersectional Political Economy. London: Verso.
Folbre, N and Nelson, J (2000) For love or money - or both? The Journal of Economic Perspectives 14(4), 123–140. https://doi.org/10.1257/jep.14.4.123
Fryxell, A (2020) Artificial Eve: The modernist origins of AI’s gender problem. https://doi.org/10.17863/CAM.57402
Ghosh, J (2012) Women, labour and capital accumulation in India. Monthly Review. https://monthlyreview.org/2012/01/01/women-labor-and-capital-accumulation-in-asia/ (accessed 1 May 2025). https://doi.org/10.14452/MR-063-08-2012-01_1
Gibbons, S, Mugunthan, T and Nielsen, J (2023) The 4 degrees of anthropomorphism of generative AI. NN/g. https://www.nngroup.com/articles/anthropomorphism/ (accessed 25 May 2025).
Gotby, A (2019) They call it love: Wages for housework and emotional reproduction. PhD dissertation, University of West London.
Gorska, AM and Jemielniak, D (2023) The invisible women: Uncovering gender bias in AI-generated images of professionals. Feminist Media Studies 23(8), 4370–4375. https://doi.org/10.1080/14680777.2023.2263659
Haraway, D (1990) Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge.
Hartmann, H (1976) Capitalism, patriarchy and job segregation by sex. Signs 1(3), 137–169. https://doi.org/10.1086/493283
Hochschild, AR (1983) The Managed Heart. Berkeley: University of California Press.
Kabeer, N (1994) Reversed Realities: Gender Hierarchies in Development Thought. London: Verso.
Manasi, A, Panchanadeswaran, S and Sours, E (2023) Addressing gender bias to achieve ethical AI. IPI Global Observatory. https://theglobalobservatory.org/2023/03/gender-bias-ethical-artificial-intelligence/ (accessed 25 May 2025).
Nass, C, Moon, Y and Green, N (1997) Are machines gender neutral? Gender-stereotypic responses to computers with voices. Journal of Applied Social Psychology 27(12), 864–876. https://doi.org/10.1111/j.1559-1816.1997.tb00275.x
Rajeev, A and Sinha, D (2023) Elucidating the women’s work continuum in India using time-use data. Area Development and Policy 9(2), 181–203. https://doi.org/10.1080/23792949.2023.2270044
Schiller, A and McMahon, J (2019) Alexa, alert me when the revolution comes: Gender, affect and labor in the age of home-based artificial intelligence. New Political Science 41(2), 173–191. https://doi.org/10.1080/07393148.2019.1595288
Sutko, DM (2020) Theorizing femininity in artificial intelligence: A framework for undoing technology’s gender troubles. Cultural Studies 34(4), 567–592. https://doi.org/10.1080/09502386.2019.1671469
Taylor, S and Tyler, M (2000) Emotional labour and sexual difference in the airline industry. Work, Employment and Society 14(1), 77–95. https://doi.org/10.1177/09500170022118275
UNESCO and IRCAI (2024) Challenging systematic prejudices: An investigation into gender bias in large language models. https://unesdoc.unesco.org/ark:/48223/pf0000388971
UNESCO and EQUALS Skills Coalition (2019) I’d blush if I could: Closing gender divides in digital skills through education. https://doi.org/10.54675/RAPC9356
Wajcman, J (2000) Reflections on gender and technology studies: In what state is the art? Social Studies of Science 30(3), 447–464. https://doi.org/10.1177/030631200030003005
Wajcman, J (2010) Feminist theories of technology. Cambridge Journal of Economics 34(1), 143–152. https://doi.org/10.1093/cje/ben057
Wajcman, J and Young, E (2023) Feminism confronts AI: The gender relations of digitalisation. In Browne, J, Cave, S, Drage, E and McInerney, K (eds), Feminist AI: Critical Perspectives on Algorithms, Data, and Intelligent Machines. Oxford: Oxford Academic. https://doi.org/10.1093/oso/9780192889898.003.0004
Woods, HS (2018) Asking more of Siri and Alexa: Feminine persona in service of surveillance capitalism. Critical Studies in Media Communication 35(4), 334–349. https://doi.org/10.1080/15295036.2018.1488082