Mobility has a “hallowed place” in the liberal democratic tradition, providing us with what Blomley has described as “one means by which we can examine the uses to which spaces are put in political life and political relations” (Blomley 2009, 206). Mobility is, accordingly, inherently tied to governance and provides a window on the expansion and contraction of the fundamental rights, such as gender equality (Walsh 2015), that shape the human experience.
In this chapter, we look carefully at how people will move through a fully digitized society. We start by examining how turn-by-turn navigation technologies are automating the human task of driving, and, in doing so, have quietly established a footing for algorithmically controlled mobility systems. Whoever controls the algorithms that route mobility within a system gains de facto control over people and their mobility rights, determining who gets access to mobility, how they access it, and numerous decisions about associated benefits (e.g. quality of service, comfort, and time to destination) and risks (e.g. exposure to noise and other pollution, traffic congestion, and discrimination). In other words, whoever controls those algorithms can deliberately and effectively shape our experience of mobility in ways that were previously unheard of, significantly shifting the experience of being human in the digital age.
Early indicators of this shift seem clear. Turn-by-turn navigation is ubiquitous thanks to the proliferation of smartphones, and the algorithms that power it have become increasingly capable of responding to real-time changes in traffic patterns in order to minimize the time it takes to get to our destinations. That ruthless efficiency – the narrow emphasis on saving time – tempts us to rely on turn-by-turn navigation to get us where we want to be, even when driving in familiar places along familiar routes. Turn-by-turn navigation also powers ride-hailing services like Uber and Lyft, and will eventually power more fully automated vehicles, thus restricting the set of navigational decisions available to human drivers and human passengers. These trends demonstrate how we are increasingly delegating navigational decision-making to technologies that, in turn, are (partially) automating the actual person behind the wheel.
We argue that we are currently in the early days of algorithmically controlled mobility systems, but that, even if it is nascent in its form and reach, mobility shaping – the act of deliberately and effectively controlling mobility patterns using an algorithmically controlled mobility system – is raising a set of unresolved ethical, political, and legal issues that have significant consequences for shaping human experience in the future. The specific subset of questions we focus on in this chapter considers the extent to which the people travelling, the vehicles they use, and the geographic spaces through which they move, ought to be treated neutrally in the algorithmically controlled mobility system. By way of analogy, we argue that these emerging normative questions in mobility echo those that have been asked in the more familiar context of net neutrality. We seek to apply some of the ethical and legal reasoning surrounding net neutrality to the newly relevant algorithmically controlled mobility space, while adding some considerations unique to mobility. We also suggest extending some of the legal and regulatory framework around net neutrality to mobility providers, for the purpose of establishing and ensuring a just set of principles and rules for shaping mobility in ways that promote human flourishing.
Section 11.1 provides a brief historical survey of turn-by-turn navigation and contextualizes the current socio-technical landscape. Section 11.2 examines the net neutrality controversy and legal rationales designed to ensure technical infrastructure creates political and economic relationships that are fair to people as citizens and rights-holders. Section 11.3 provides a comparative analysis between information networks (e.g. the Internet) and mobility networks, to demonstrate the extent to which the analogy helps us anticipate issues of fairness in algorithmically controlled mobility systems. Finally, Section 11.4 raises an additional set of ethical issues arising from mobility shaping, including the uneven distribution of mobility benefits and risks, the values underpinning navigational choices, and the enclosure of public concerns in private data.
11.1 A Brief History of the Automation of Driving Navigation
Prior to the widespread availability of turn-by-turn navigation apps on smartphones, most drivers navigated via a combination of memory, instinct, road markers, oral directions, and paper maps. In most cases, each driving and navigation decision was shaped by two forces: the decisions of the person driving the vehicle (e.g. what speed to travel, whether to turn, change lanes, or come to a sudden stop) and the decisions of democratic institutions and administrative bodies that are both populated with people who administer the rules (e.g. to build roads, establish speed limits, and place road signs), and accountable to people as electors. Though these two forces remain relevant, new technologies are changing how people conceptualize mobility navigation. Incredibly detailed digital maps, satellite connectivity, and, most crucially, the enormous uptake of smartphones and other smart devices, have enabled the near-ubiquity of turn-by-turn navigation systems. This section briefly examines the history of automating in-car navigation.
11.1.1 In-Car Navigation, Then
In-car turn-by-turn navigation systems predate the First World War. In the early days of road travel, motorists could purchase after-market devices such as the “Chadwick Road Guide” to aid in the complex task of navigation (French 2006). This mechanical invention, first available in 1910, featured an interchangeable perforated metal disc (each corresponding to a specific route) intricately connected to one of the vehicle’s wheels. As the vehicle drove, the disc would turn and trigger instructions or warnings, such as “continue straight ahead” or “turn sharply to the left.” The driver could thus be “guided over any highway to [their] destination,” with the device “instructing [them] where to turn and [in] which direction” (French 2006, 270). However, there were obvious drawbacks to the Chadwick Road Guide and its contemporaries. Most significantly, these devices could only offer a limited number of predetermined routes. They were also complicated, delicate, and relatively expensive.
In-car navigation devices continued to evolve slowly over the next several decades (French 2006). Despite improvements, these systems could still not provide real-time information about current driving conditions and lacked accuracy over long distances.
11.1.2 In-Car Navigation, Now
More recently, a suite of technologies, including GPS, digital cameras, cloud computing, vision systems driven by artificial intelligence, and the widespread adoption of smartphones, has enabled the rapid adoption of much more effective navigation systems. Satellite imagery and computer vision techniques enable the creation of maps so detailed that the fan blades inside rooftop HVAC units can be seen on some buildings in downtown Los Angeles (O’Beirne 2017). Additionally, the advent of multiple satellite positioning systems – collectively, the Global Navigation Satellite System (GNSS) – coupled with the widespread adoption of wireless communication systems, allows for far more accurate development, update, deployment, and use of maps.
Smartphones have likely driven the most significant recent changes in the automation of in-car navigation. Worldwide, approximately 63 per cent of adults owned smartphones in 2017 (Molla 2017). In the United States, that number is significantly higher, at 81 per cent as of June 2019 (Pew Research Center 2019). There are now more mobile phones (8.58 billion) than people (7.95 billion) on the planet (Richter 2023). According to another recent study, 77 per cent of US smartphone users “regularly” use navigation apps (Panko 2018). Eighty-seven per cent of those respondents primarily use the apps for driving directions (as opposed to walking, cycling, public transit, or simply as maps), and 64 per cent use the apps while driving (Panko 2018). Additionally, anecdotal experience suggests that drivers use the apps even in neighbourhoods they know, along routes they often travel. The “nudges” they receive from navigation systems can alert them to poor traffic conditions and work out alternative routes if something goes wrong. As drivers incorporate turn-by-turn navigation into their daily driving routines, and delegate navigation decisions to those apps, they are ushering in the age of algorithmically controlled mobility systems.
11.1.3 The Navigation Marketplace
Despite the growing popularity of turn-by-turn navigation, the cost of up-front investment in mapping infrastructure means that relatively few companies compete in the turn-by-turn navigation market. Alphabet (Google’s parent company) is by far the most significant player in both mapping and turn-by-turn navigation. The Google Maps app, on which 67 per cent of US navigation app users rely, dominates turn-by-turn navigation, far outstripping both Apple Maps and the Israeli-founded system Waze, at 11 per cent and 12 per cent, respectively (Panko 2018). Moreover, Alphabet purchased Waze in 2013 (Cohan 2013), so Waze and Google Maps now share the same base map data. Alphabet thus controls nearly 80 per cent of smartphone-assisted turn-by-turn navigation. As a further sign of Alphabet’s dominance, the Google Maps API is currently embedded in more than five million websites, far more than any of its competitors (BuiltWith 2019).
Alphabet has another advantage in the field: the sheer amount of its accumulated data. Google Maps was launched in 2005 and has been collecting mapping data ever since, using aerial photography, satellite images, land vehicles, and individual smartphone data. In 2012, Google had more than 7,000 employees and contractors working on its mapping projects, including the Street View cars (Carlson 2012). Google Maps has more than one billion active users worldwide (Popper 2017); at the time of writing, Waze has more than 151 million (Porter 2022). Apple Maps, Alphabet’s closest US competitor in the navigation space, has only been active since 2012, and buys its mapping data second-hand, mostly from the Dutch navigation company TomTom (Reuters 2015). Though expanding, the Europe-based Here WeGo, founded by Nokia and currently owned by a consortium of German car manufacturers, does not yet threaten Alphabet’s dominance (Here Technologies 2019). Likewise, although Uber is conducting mapping projects (Uber 2019), as are Ford and other traditional car manufacturers (Luo 2017), Alphabet’s current advantage is undisputed.
Whether or not Alphabet continues to dominate the in-car navigation landscape, turn-by-turn navigation systems will only become more important as connectivity and functionality improve. For example, turn-by-turn navigation has evolved to include other modes of mobility, including walking, cycling, and public transportation, positioning it as the go-to technology for getting around. Fully autonomous vehicles, should they ever come to fruition, will rely on navigation systems to a far greater extent than even the most obedient driver, as they will undoubtedly move within the mobility system according to the rules designed into routing algorithms. Thus, decisions we make now about how to develop, implement, and regulate turn-by-turn navigation systems will fundamentally shape algorithmically controlled mobility systems in the coming decades.
11.2 A Brief History of Net Neutrality
In anticipation of the evolution of mobility towards algorithmically controlled mobility systems, it is useful and instructive to reflect on the net neutrality debates that have shaped our algorithmically controlled information system – also known as the Internet – over the past two decades. These debates matter because they have been an ongoing site of political struggle over the need to infuse technology policy with human-centric principles. We consider net neutrality a useful metaphor for thinking about the ethics of algorithmically controlled mobility, primarily because information networks and mobility networks each contain: their own unique units of analysis – packets of information, and packets of people (or mobility); their own paths through the network – wires and roads; and their own control and routing algorithms that determine how to get the packets to their destination. The parallels, and distinctions, between information networks and algorithmically controlled mobility systems can help anticipate and inform ethical design and regulatory responses in the mobility context. They provide a roadmap that encourages designers and policymakers to develop socio-technical design specifications and to consider the ways that regulation can promote or constrain human mobility. This section provides a brief overview of the main technical, political, and ethical issues in net neutrality, including the concepts of “discrimination,” “non-discrimination,” and “neutrality,” technical and ethical concerns related to Deep Packet Inspection, and the legal concept of “common carriage.”
11.2.1 What Are Data Packets?
Data packets, generally consisting of a header and a payload, can be thought of as the basic units of Internet communication. All information sent over the Internet (e.g. emails, movies, cat memes, Instagram posts, and TikToks) is broken up into smaller chunks of data that are packaged as one or more data packets. If multiple packets are needed to carry the transmitted information, as they usually are, the divisions between packets are made automatically. Each individual packet is then sent to its destination separately, along whatever route is most convenient at the time (Indiana University 2018). Packet headers include high-level routing information, such as the packet’s source and destination IP addresses, and information instructing how to correctly reassemble multiple packets when they reach their destination (Indiana University 2018). The remainder of the packet is referred to as the payload, containing chunks of the transmitted information.
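As a purely illustrative sketch in Python (the field names, chunk size, and addresses are invented; real protocols such as IP and TCP define headers in far greater detail), the following fragment shows how a message might be split into packets, each carrying routing metadata in a header and a chunk of content in its payload:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    """A toy data packet: routing metadata (the 'header') plus a chunk of content (the 'payload')."""
    source: str        # source IP address
    destination: str   # destination IP address
    sequence: int      # position used to reassemble the packets at the destination
    total: int         # total number of packets carrying this message
    payload: bytes     # one chunk of the transmitted content

def packetize(message: bytes, source: str, destination: str, chunk_size: int = 8) -> list[Packet]:
    """Split a message into fixed-size chunks, each wrapped in its own packet."""
    chunks = [message[i:i + chunk_size] for i in range(0, len(message), chunk_size)]
    return [Packet(source, destination, seq, len(chunks), chunk)
            for seq, chunk in enumerate(chunks)]

def reassemble(packets: list[Packet]) -> bytes:
    """Restore the original message regardless of the order in which packets arrived."""
    return b"".join(p.payload for p in sorted(packets, key=lambda p: p.sequence))

packets = packetize(b"an email travelling across the network", "192.0.2.1", "198.51.100.7")
assert reassemble(list(reversed(packets))) == b"an email travelling across the network"
```

Because each packet carries its own header, every packet can be routed independently and the message still reassembles correctly at the destination.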
11.2.2 Packet Discrimination and the Emergence of Net Neutrality
Early in the Internet’s history, communications between people were divided into packets of information that travelled from one place to another with very little oversight. In this early “network of Eden” (Parsons 2013, 14), packets were only subjected to Shallow Packet Inspection (SPI) techniques. As the name implies, SPI allows network routers to access only high-level information about a packet’s delivery instructions; that is, it limits inspection to the packet headers (Parsons 2013). Thus, an Internet Service Provider (ISP), using network routers that base routing decisions on SPI, might examine the source IP address of the packet, the packet identification number, or the kind of protocol the specific packet uses, but would not typically have access to the packet content itself. SPI is used primarily as a routing tool, much like addresses on envelopes travelling through the post.
Because SPI allows for examining destination and source IP addresses, it enables only relatively crude forms of information discrimination, such as blacklisting, firewalling, and others based solely on IP addressing. “Discrimination” in this sense refers to the choices made in routing one packet compared to another. These choices might be automated and algorithmic, or they might be human-driven. Algorithmic discrimination in this sense might be as simple as the automatic “decision” to route a packet along a specific path with no human oversight. Human-driven discrimination in this context, for example, could include a corporate policy of treating packets originating from a source IP that is known to spread viruses as “blacklisted” in the corporate network – that is, preventing untrusted packets from reaching the corporate server as a security measure. SPI-enabled discrimination may have political or moral dimensions; for instance, some corporate firewalls block all packets from social media websites, while government firewalls could prevent citizens from accessing content that challenges the state.
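A hypothetical sketch of such header-only discrimination might look like the following (the addresses and rules are invented, and real routers implement such policies in dedicated filtering hardware and configuration rather than application code):

```python
# Shallow Packet Inspection: the router sees only header fields, never the payload.
BLACKLISTED_SOURCES = {"203.0.113.66"}   # e.g. an address known to spread malware

def spi_route(header: dict) -> str:
    """Decide what to do with a packet using header information alone."""
    if header["source"] in BLACKLISTED_SOURCES:
        return "drop"        # blacklisting: refuse packets from an untrusted origin
    if header.get("protocol") == "smtp" and not header.get("trusted_relay", False):
        return "quarantine"  # a crude policy based only on the protocol field
    return "forward"         # otherwise send the packet on toward its destination

print(spi_route({"source": "203.0.113.66", "destination": "192.0.2.1", "protocol": "http"}))  # drop
```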
Clearly, these kinds of restrictions could have a significant impact on the humans using the system to communicate. An ISP’s decision to block particular packets likely has far-reaching implications for a broad swath of citizens, and an ISP could also manipulate packet routing across the Internet to suit its own purposes. For instance, an ISP could prevent its customers from accessing certain websites or could delay packets from one website so that they do not reach their destination as quickly as other websites’ packets. Thus, ISP packet discrimination has the potential to preference certain corporate and state interests over others. When it became clear that many ISPs were using packet discrimination to further their own corporate interests, a public controversy erupted over the role of packet discrimination in anti-competitive market manipulation, precisely because it unevenly, and thus unfairly, constrained communication opportunities for people using the Internet to share content and information.
Concerns about the anti-competitive nature of ISP packet discrimination led Tim Wu to propose “the principle of network neutrality or non-discrimination” (Wu 2002, 1). Net neutrality, as Wu imagines it, is a principle that “distinguish[es] between forbidden grounds of discrimination – those that distort secondary markets, and permissible grounds – those necessary to network administration and to [avoid] harm to the network” (Wu 2002, 5). For Wu, forbidden grounds are those based on “internetwork criteria”: “IP addresses; domain names; cookie information; TCP port; and others” that can lead to unfair outcomes for (classes of) individuals (Wu 2002, 5). The permissible grounds are limited to local network integrity concerns, in particular, bandwidth and quality of service. As Wu describes, rather than blocking access to bandwidth-intensive applications like online gaming sites, and thus distorting information flow in favour of non-blocked applications, an ISP concerned with net neutrality “would need to invest in policing bandwidth usage” as a means of nudging consumers (Wu 2002, 6). The result would be a more even playing field for all network applications, shaped primarily by human communication choices, instead of an artificially influenced market sphere set up for the benefit of those controlling the flow of information.
Wu’s (2002) concept and coinage took off, and was discussed at the highest levels of the US government (Madrigal and LaFrance 2014). Moreover, net neutrality is now a proxy for deeper ethical and political issues fundamentally tied to the values of human communication, privacy, surveillance, consumer rights, and freedom of speech. This has direct political consequences. Access to information, an important principle at the core of net neutrality, is recognized in Canada as an implied constitutional right (Ontario (Public Safety and Security) v. Criminal Lawyers’ Association 2010). Further, the ability to lawfully access information is a cornerstone of modern democracy: without a well-informed electorate, the health of a democracy is imperilled (Canada (Information Commissioner) v. Canada (Minister of National Defence) 2011). Globally, content discrimination on the Internet is perhaps “the [free speech] issue of our time” (Hattem 2014), creating a space for political action and resistance.
11.2.3 The Rise of Deep Packet Inspection
Further changes to packet inspection technology amplified a broader set of net neutrality concerns. Deep Packet Inspection (DPI) technology, made possible in 2003 by changes in network router design, enables access to the content of the message itself in real time. Some DPI equipment can monitor hundreds of thousands of packets simultaneously, in effect looking over the shoulder of the people communicating and reading the text of their emails and other communications (Anderson 2007).
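By contrast with SPI, a DPI-style classifier reads the payload itself, and its verdict can then drive how the traffic is treated. The sketch below is hypothetical (the keywords and policy table are invented, although the leading bytes of the BitTorrent handshake shown are real):

```python
def dpi_classify(header: dict, payload: bytes) -> str:
    """Deep inspection: header fields remain available, but the router also reads the payload."""
    text = payload.decode("utf-8", errors="ignore").lower()
    if payload.startswith(b"\x13BitTorrent protocol") or "bittorrent" in text:
        return "peer-to-peer"   # signature matching inside the content of the message
    if any(word in text for word in ("invoice", "password", "ssn")):
        return "sensitive"      # content-based categorisation of what people are saying
    return "ordinary"

# The classification can then drive discriminatory treatment of the traffic:
POLICY = {"peer-to-peer": "delay", "sensitive": "log", "ordinary": "forward"}
print(POLICY[dpi_classify({"source": "198.51.100.7"}, b"\x13BitTorrent protocol...")])  # delay
```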
The US telecom corporation Comcast provided a striking example of DPI-enabled algorithmic discrimination. In 2007, several public interest groups filed a complaint with the US Federal Communications Commission (FCC), citing Comcast’s practice of secretly “delaying” the transmission of packets from peer-to-peer file-sharing sites (FCC 2008). Comcast argued that severely delaying traffic from these sites was necessary to manage bandwidth requirements, and that earlier rulings and statements from the FCC had merely prohibited outright blocking. The FCC disagreed, holding that the delays in this case were so extreme that they amounted to blocking (FCC 2008). In any event, the FCC noted, “Comcast selectively targeted and terminated the upload connections of … peer-to-peer applications and … this conduct significantly impeded consumers’ ability to access the content and use the applications of their choice” (FCC 2008, para. 44). The FCC ordered Comcast to end its blocking practices in the interest of “the open character and efficient operation of the Internet” (FCC 2008, para. 51).
Although later rulings invalidated the FCC’s order and called the Commission’s jurisdiction into question, Comcast adjusted its network management practices so that no specific application, or category of applications, was targeted by its routing algorithms. Rather, network congestion is now managed by slowing down the connections of specific individuals (heavy bandwidth users) during peak usage periods (Comcast Corporation 2008). Although these practices are still discrimination of a sort, they have become commonplace and seem to fall within the permissible grounds identified by Wu (2002).
11.2.4 Common Carriage
The principle of net neutrality is partly based on the idea of “common carriage,” a legal concept with deep roots in the common law. Common carriage speaks to the need to ensure that infrastructural systems serve the interests of citizens in ways that are recognized and acknowledged as fair.
Common carriage itself arose from the “common calling”: people engaged in what might be called public service professions, such as innkeepers, barbers, and farriers, could be found liable for refusing service to an individual without reasonable justification (Burdick 1911). Those with a common calling made a “general undertaking” to serve the public at large “in a workmanlike manner,” and any failure to do so left them open to legal action under the law of contract (Burdick 1911, 518).
As technology advanced, the common calling expanded to include “common carriers,” particularly railroads, shipping lines, and other transportation organizations. One defining feature of common carriers, as opposed to common callings, is the up-front infrastructure investment that the former requires. Building a railroad requires massive amounts of start-up capital, time, and (typically) political goodwill. These factors make it difficult for competitors to enter the market, thus limiting both competition and consumer options. If someone wishes to travel but does not wish to pay a certain price for a train ticket, their options are limited. They may find alternative means of transport or choose not to go, but (except in the most exceptional circumstances) they cannot build themselves a railroad. Thus, railroads and other common carriers operate as “virtual monopol[ies]” (Wyman 1904, 161) and, though they are often private companies, they are “in the exercise of a sort of public office, [with] public duties to perform” (New Jersey Steam Navigation Co. v. Merchants’ Bank 1848, 47). As a result, their service should be agnostic with respect to the cargo (and people) they transport.
Though the Internet shares features of common carriers, whether the Internet is considered a common carrier depends on national jurisdiction. Canadian policy, for example, is firmly behind the common carrier model, and the need to ensure all people have fair access to infrastructural services. Because of this political commitment to equal treatment of people, the Canadian Radio-television and Telecommunications Commission (CRTC), which assumed telecommunications control from a variety of bodies in 1968, strongly supports the equal treatment of the data those people communicate “regardless of its source or nature” (CRTC 2017, para. 3).
11.3 Net Neutrality and the Ethics of Mobility Shaping
Net neutrality debates, and the discussion of common carriage principles, alert us to many related problems in algorithmically controlled mobility systems. An algorithmically controlled mobility system recalls the distinction between “forbidden” and “permissible” discrimination of information packets, where people are reduced to mobility packets composed of specific vehicles and the goods and people within them. Like common carriers, algorithmically controlled mobility systems require substantial up-front investment in both publicly and privately owned and operated infrastructure, creating virtual monopolies (as evidenced by the very few global players in the space and Alphabet’s overwhelming dominance in the market). Indeed, their public–private nature raises complex questions about the governance of algorithmically controlled mobility systems as a public good. Thus, there are many similarities between neutrality in the Internet context and mobility neutrality in the context of algorithmically controlled mobility systems, though there are important distinctions to be drawn as well. This section will examine the applicability of net neutrality concepts to the mobility context in more detail.
11.3.1 Traffic Shaping Is to Information Packets as Mobility Shaping Is to People Packets
To a routing algorithm, there is little difference between a packet of digital information moving through a digital information network (e.g. the Internet) and a packet of people (or goods) moving through a physical mobility network. In an important sense, a map of a digital information network is very similar to a map of a mobility network, with origins, pathways, routing decision points, destinations, and rules about how a packet can move through the system. Just as algorithms control and shape the flow of packets in information systems, often referred to as traffic shaping, algorithms control and shape the flow of people packets through mobility systems, which we refer to as mobility shaping. Thus, there is an ethics of mobility shaping that must be considered when designing the set of rules that govern an algorithmically controlled mobility system. Mobility shaping algorithms, for example, could be designed to move people packets from source to destination according to principles of fairness.
There is a clear analogy to both SPI and DPI in the mobility shaping context. Mobility shaping decisions could be relatively neutral, based only on origin and destination information. Mobility shaping could also be complex, intended to support new models of mobility service delivery, and based on detailed information about who (or what) is in the vehicle, such as their socioeconomic status, age, political leaning, gender, purchase history, customer rating, and driving experience preferences, among endless other data breadcrumbs. Indeed, whole new categories of information could be invented to accommodate new forms of mobility shaping. One can imagine different mobility service levels, such as virtual fast lanes, made available to the wealthy (or inaccessible to the poor) or perhaps available to those subscribing to particular loyalty programs (themselves designed to collect more of individuals’ data).
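The analogy can be made concrete with a deliberately simplified sketch. The place names, link weights, and “premium” rule below are all invented, and production routing engines are vastly more sophisticated, but the point stands: the same shortest-path routine can route a data packet over network links or a “people packet” over roads, and it becomes a mobility-shaping instrument the moment the cost of a road segment is allowed to depend on who is travelling.

```python
import heapq

def route(graph: dict, origin: str, destination: str, cost) -> list[str]:
    """Dijkstra's shortest path; cost(edge_data) prices each link or road segment."""
    queue, seen = [(0.0, origin, [origin])], set()
    while queue:
        total, node, path = heapq.heappop(queue)
        if node == destination:
            return path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, edge in graph.get(node, {}).items():
            heapq.heappush(queue, (total + cost(edge), neighbour, path + [neighbour]))
    return []

# The same routine serves both networks; only the graph and the cost function differ.
internet = {"A": {"B": {"latency_ms": 5}, "C": {"latency_ms": 20}},
            "B": {"D": {"latency_ms": 5}}, "C": {"D": {"latency_ms": 1}}, "D": {}}
roads = {"home": {"main_st": {"minutes": 4, "toll": False}, "highway": {"minutes": 2, "toll": True}},
         "main_st": {"office": {"minutes": 6, "toll": False}},
         "highway": {"office": {"minutes": 3, "toll": True}}, "office": {}}

print(route(internet, "A", "D", cost=lambda e: e["latency_ms"]))   # ['A', 'B', 'D']

def shaped_cost(traveller: dict):
    # Hypothetical mobility shaping rule: non-subscribers are priced out of toll "fast lanes".
    return lambda e: e["minutes"] + (100 if e["toll"] and not traveller["premium"] else 0)

print(route(roads, "home", "office", cost=shaped_cost({"premium": True})))   # via the highway
print(route(roads, "home", "office", cost=shaped_cost({"premium": False})))  # kept on Main St
```

The routine itself is indifferent to what it routes; the politics enter through the graph it is given and the cost function it is told to minimize.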
DPI for mobility shaping is already a fact of everyday life, and new possibilities for mobility shaping are emerging as more mobility algorithms are designed to take individuals’ data into account. Individual user preferences that shape mobility, say the choice of avoiding tolls or highways, are commonplace features in turn-by-turn navigation apps. But mobile devices and online platforms linking single user accounts together over multiple services (e.g. Google Search, Gmail, Docs, Photos, Maps, and smartphone location data) are enabling the collection and curation of massive datasets applicable to mobility shaping, significantly upping the ante when it comes to the potential for lived discrimination and unfairness.
Highly granular geo-physical records of a person’s location and movement can reveal many aspects of their life, especially when linked to other data about that person. For example, Waze collects location data and repackages that data as insights into consumer behaviour. A chart on the “Waze For Brands” website displays “Driving Patterns,” showing “when drivers are most likely to visit different business categories” (Waze n.d.). The categories shown are “Auto,” “Coffee,” “Fast Food,” “Fuel,” and “Retail” and can be sorted by day or by hour. From the chart, we learn that the more than 90 million Waze users worldwide are most likely to visit coffee shops between 8 am and 10 am, and are least likely to go to retail stores on Sundays. These particular facts are not earth-shattering revelations, but they signal the trend toward DPI-based mobility shaping models designed to serve private interests rather than the public good. In addition to access to its entire global database, Waze also provides local marketers with multiple advertising strategies. Among other tools, Waze offers Branded Pins with Promoted Search (large branded corporate symbols that appear on the map when the user is within a certain distance of the promoted location) and Zero-Speed Takeovers (large banner ads that cover the Waze interface if the user stops nearby) (Waze 2019). Thus, the “map” shown to Waze users doubles as a promotional engine intended to shape navigation decisions by nudging a driver in a particular direction.
Mobility shaping, though in its infancy, is on the rise; today’s navigational nudges will be tomorrow’s strategies for absolute control over people’s mobility. However, the principles and rules defining forbidden and permissible mobility shaping are as yet undefined. In Section 11.3.2 we consider the extent to which the rules used to distinguish between forbidden and permissible information traffic shaping in the net neutrality debate may help us better understand the importance of regulation in protecting individual mobility rights.
11.3.2 Forms of Inclusion, Exclusion, and Discrimination
Today’s turn-by-turn navigation systems, by design, shape mobility by nudging the human user (i.e. driver, cyclist, etc.) to take a certain path to their destination. Generally speaking, today’s systems are designed to minimize the time it would take to reach a chosen destination – but mobility can be algorithmically shaped according to any number of values and preferences other than minimizing time to destination. Turn-by-turn systems also currently shape mobility by nudging people toward certain destinations rather than others, for example by presenting curated lists of options when drivers search for nearby restaurants, gas stations, or other potential destinations. In this sense, mobility shaping functions as a powerful choice architecture, designed to privilege certain values and preferences over others. Mobility shaping can obscure whole categories of routing options or destinations, keeping vehicles on roads designed for high volumes of traffic or out of neighbourhoods where children tend to play in the streets or wealthy homeowners want privacy. Mobility shaping algorithms can also be designed to maximize returns on investment for those corporations heavily invested in the technology, leveraging vast quantities of individual data and preferencing other interests (e.g. corporate) over the needs of people in ways that remain largely opaque to the public. As we march down the road of automating mobility and delegate more decision making to navigation algorithms (which will eventually hold broad power over more automated forms of mobility), nudges will morph into pushes. At some point the idea of people acting as agents entitled to move through space making fine-grained mobility decisions for their own purposes – turning left here instead of right because it’s prettier, switching into this lane versus that one to get a better look at a friend’s new garden, taking Main St. instead of Fifth to avoid passing an ex-partner – fades into the background of the algorithmically controlled mobility system.
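As a stylized illustration of this choice architecture (the candidate routes, attributes, and weightings below are entirely invented), the route a system recommends as “best” depends on whose values are encoded in its scoring function; today that function is, in effect, weighted almost entirely toward time:

```python
# Hypothetical route scoring: "best" depends entirely on whose weights are used.
CANDIDATE_ROUTES = [
    {"name": "freeway",     "minutes": 18, "scenic": 0.1, "residential_km": 0.0},
    {"name": "riverside",   "minutes": 26, "scenic": 0.9, "residential_km": 0.5},
    {"name": "school_zone", "minutes": 21, "scenic": 0.3, "residential_km": 3.2},
]

def best_route(weights: dict) -> str:
    """Pick the route minimising a weighted mix of time, scenery, and neighbourhood intrusion."""
    score = lambda r: (weights["time"] * r["minutes"]
                       - weights["scenery"] * 10 * r["scenic"]
                       + weights["residential"] * r["residential_km"])
    return min(CANDIDATE_ROUTES, key=score)["name"]

print(best_route({"time": 1.0, "scenery": 0.0, "residential": 0.0}))  # today's default: freeway
print(best_route({"time": 0.3, "scenery": 1.0, "residential": 2.0}))  # a different value mix: riverside
```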
Borrowing from Wu’s (2002) net neutrality framework, some of these mobility shaping strategies may be based on permissible discrimination and some may not. Many of them could mimic the discriminatory practices discussed in the net neutrality context, particularly “blocking,” “zero-rating,” and “throttling,” each of which will now be discussed in turn.
11.3.2.1 Blocking
Blocking, in the net neutrality context, is the simple blacklisting of certain Internet destinations (websites). The 2018 Fairplay Canada proposal, for example, sought to compel ISPs to block access to any site deemed to contain copyright-infringing content (CRTC 2018; O’Rourke 2018). Blocking is also sometimes called “filtering” (generally by its proponents) and is used to prevent access to content that is considered illegitimate.
In the mobility shaping context, blocking strategies are easily imagined. Mobility shaping algorithms could blacklist physical destinations, or origins, with varying degrees of interpretability and transparency. At the clearly permissible end of the spectrum, trying to access a restricted destination (such as a military base) could result in a refusal to navigate to that location. Less straightforwardly permissible are situations in which those hailing rides are refused pickups or drop-offs from or to locations that are, for any number of questionable reasons, blacklisted. More subtly, certain destinations might simply be left off the map or excluded from the system’s search function, as is the case with the famous Hollywood sign in Los Angeles (Walker 2014). Mobility shapers might use such strategies to artificially restrict access to politicized locations (for example, the meeting points for political protests or abortion clinics) that do not align with corporate or state interests. Combined with DPI, blocking could be targeted at individual mobility users, who find themselves excluded from certain destinations, mobility services, routing options, or mobility service levels, and could prove very difficult to detect.
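In algorithmic terms, such blocking could be as simple as the hypothetical sketch below (the destination labels and behaviours are invented), in which some places are refused outright while others are silently delisted:

```python
# Hypothetical blocking rules: which destinations a navigation service will route to at all.
RESTRICTED = {"military_base"}            # arguably permissible to refuse navigation
DELISTED = {"protest_meeting_point"}      # quietly omitted from search results

def plan_trip(origin: str, destination: str) -> str:
    if destination in RESTRICTED:
        return "refused: destination is access-restricted"
    if destination in DELISTED:
        return "no results found"          # the place exists, but the map pretends otherwise
    return f"route computed from {origin} to {destination}"

print(plan_trip("home", "protest_meeting_point"))  # no results found
```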
11.3.2.2 Zero-rating
The practice of zero-rating is, in some ways, the inverse of blocking. In the net neutrality context, zero-rating involves exempting certain websites or web resources from bandwidth caps. This practice thus encourages an ISP’s customers to consume the “free” resources instead of content that will increase their data consumption levels. Zero-rating is thus an artificial intervention in a secondary market (that is, in content), and one that often benefits the ISP – particularly when the ISP also provides the content.
“Zero-rating”-like strategies are possible in the mobility shaping context. This is not necessarily a bad thing: as in the net neutrality context, there may be reasonable and valid grounds for prioritizing some routes, services, or locations. For instance, cities might choose to subsidize ride-hailing fares to and from hospitals in order to help people get to the hospital more easily and to save on maintaining expensive parking facilities. There could be benefits to zero-rating airports or central transit hubs or to nudging drivers onto major highways rather than along side streets.
However, there may be times when zero-rating could be less permissible – for instance, if a dominant mobility service provider used zero-rating to dissuade people from accessing services or locations associated with a small competitor’s mobility ecosystem. As an example, Google Maps could offer cheaper fares to users hailing Lyft or Uber rides so long as Google Maps powered them both, thus discriminating against users who choose to hail a ride using a non-Google-powered service. As on the Internet, zero-rating in the mobility context could impermissibly affect a secondary market – in this example, ride-hailing providers – in ways that constrain human agency and fair access to services.
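A hypothetical fare function makes the pattern visible (the destination list, partner label, and discount are invented): some exemptions may be publicly justified, while others quietly tilt a secondary market toward the platform’s own ecosystem:

```python
# Hypothetical zero-rating of rides: some trips are subsidised, others pay full fare.
ZERO_RATED_DESTINATIONS = {"general_hospital", "central_station"}   # e.g. a city subsidy
PARTNER_SERVICES = {"partner_rides"}                                # the platform's own ecosystem

def fare(destination: str, service: str, base_fare: float) -> float:
    if destination in ZERO_RATED_DESTINATIONS:
        return 0.0                            # publicly justified exemption
    if service in PARTNER_SERVICES:
        return round(base_fare * 0.8, 2)      # discount that disadvantages rival services
    return base_fare

print(fare("airport", "partner_rides", 30.0))      # 24.0
print(fare("airport", "independent_rides", 30.0))  # 30.0
```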
11.3.2.3 Throttling
Throttling, on the Internet, is the practice of selectively and deliberately either improving or degrading the level of service (i.e. the speed of information transfer) between two Internet addresses. Throttling can be similar to blocking, but rather than barring a website or web resource outright, an ISP can merely make that resource very slow or difficult to access. In the Comcast Corporation (2008) complaint discussed in Section 11.2.3, one of Comcast’s arguments was that it was not truly “blocking” peer-to-peer transfers but simply “delaying” them. In that case, however, the FCC (2008) determined that Comcast was essentially engaged in blocking because the “delays” were effectively infinite. Yet even shorter delays can have a significant effect: a Google study from 2016 showed that 53 per cent of mobile device users will abandon a website that takes more than 3 seconds to load (An 2018). More recent data suggests that the “bounce rate” (the proportion of visitors who leave a site after viewing only one page) increases dramatically with loading times – users are 90 per cent more likely to leave a site that takes 5 seconds to load than a site that takes only 1 second (An 2018). Worse, if loading the site takes 10 seconds, the user is 123 per cent more likely to bounce than if it only takes 1 second (An 2018). Clearly, website throttling need not cause enormous “real-world” delays to have the same effects as outright blocking, with the same consequences for human agency and choice.
In the mobility context, throttling can be thought of as the deliberate manipulation of the time it takes to travel between two locations; it is the algorithmic creation of fast lanes and traffic jams. Throttling on the Internet is intended to persuade or dissuade customers from accessing a particular resource by making access to the resource feel either seamless and smooth or frustratingly slow. Mobility shaping by throttling could be as simple as nudging certain drivers into “slow lanes” on a multi-lane freeway with common “stay in the right-hand lane” messages, while nudging privileged individuals into less occupied “fast lanes.” More drastic versions could take certain drivers along completely different routes in order to keep “fast lanes” relatively unoccupied. In an automated driving context where drivers are only there to take over in emergencies, or not at all, systems would simply force vehicles into virtually negotiated fast and slow lanes. As with blocking and zero-rating, certain forms of discrimination by throttling could be deemed permissible, perhaps to support democratically accountable initiatives or other essential services like first responders. Others could be more difficult to justify: offering fast lanes as a means of rewarding people who purchase particular vehicle brands, and slow lanes, either through access queuing or slower transit times, for people living in low-income neighbourhoods, could be deemed impermissible.
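Again, a deliberately simple and hypothetical sketch (the traveller attributes and thresholds are invented) shows how easily such a policy could be encoded once lane assignment becomes algorithmic:

```python
# Hypothetical throttling: the system assigns a service level to each "people packet".
def assign_lane(traveller: dict, congestion: float) -> str:
    if traveller.get("emergency_vehicle"):
        return "priority lane"                 # plausibly permissible discrimination
    if congestion > 0.8 and not traveller.get("premium_brand_owner"):
        return "slow lane (expect delays)"     # degraded service for the unsubscribed
    return "fast lane"

print(assign_lane({"premium_brand_owner": False}, congestion=0.9))  # slow lane (expect delays)
print(assign_lane({"emergency_vehicle": True}, congestion=0.9))     # priority lane
```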
11.4 The Need for an Ethics of Mobility Shaping
Given the centralized control that algorithms will (and to an extent already do) exert over various aspects of human mobility, and the differing qualities of mobility service to which individuals might be subject in an algorithmically controlled system, mobility shaping practices threaten to exacerbate existing mobility inequalities while inventing whole new categories of harm to the people who move through space.
Responding to these new ethical challenges will require a more clearly articulated ethics of mobility shaping. In this section we suggest a few general categories of inquiry that we feel could help lay the ethical groundwork for dealing with the specific issues, several of which we have raised, that arise in the context of mobility shaping. Our goal here is to start a conversation, recognizing that much more work is required to flesh these issues out.
11.4.1 The Just Distribution of Mobility Benefits and Harms
As we have described, mobility shaping can result in the uneven and problematic distribution of mobility benefits and harms. As mobility shaping becomes more prevalent and displaces traditional individual driver-determined forms of navigation, it will be important to examine whether any scheme of altering people’s ability to move from place to place results in a permissible or impermissible distribution of those affordances. Those benefits and harms include access to mobility, accessibility of mobility, quality of mobility service, noise and air pollution, vehicle speed and congestion, and the safety of vulnerable road users (e.g. pedestrians and cyclists) (Millar 2017). Like other distribution problems we face in society, mobility distribution problems, many of which will be created or exacerbated by mobility shaping, should be decided by careful attention to contextual details to avoid problematic constraints on human agency.
11.4.2 Preserving Individual and Collective Mobility Decision Making
Current mobility shaping algorithms are ruthlessly focused on minimizing the time to destination, while ignoring other individual and social values that are likely worth preserving in the mobility context. At times, for example, drivers might prefer a slower, more scenic, or less busy route along a rural road to increase their well-being, rather than travelling through a busy industrial corridor. They might prefer to avoid quiet neighbourhoods where children often play in the streets, in order to improve safety. Yet most turn-by-turn navigation systems do not allow individual drivers to easily adjust their route to accommodate such values-based considerations. As these systems evolve, it might be important to build them in ways that help preserve and amplify the role of human agency in mobility decision making.
This focus on time-efficiency can also disrupt democratic values, especially given the important role that the public space plays in democratic governance. Local citizens have a democratic interest in traffic planning that apps like Waze undermine. The town of Leonia, New Jersey, for instance, is bordered by Interstate 95 and has always struggled with vehicles cutting through town. But with the arrival of Waze and other efficiency-seeking navigation systems, Leonia saw a massive uptick in rush hour traffic, so extreme that many residents could not leave their driveways. In response, Leonia decided to close nearly all of its streets to non-local traffic during rush hour periods, 7 days a week (Foderaro 2017). Though this might seem like a happy ending for the people of Leonia, it underscores the immediate impact that corporate mobility shaping can have on people’s experience of space and the systems of democratic accountability in which traffic planning decisions are typically made. At the same time, it points to the incredible potential for more democratic forms of algorithmic mobility shaping.
These concerns reflect a significant difference between the Internet and mobility networks. While the Internet began as many disjunct, semi-private networks, albeit ones often constructed with public funding, most roads began as inherently public. Homer’s (1898) Iliad, for instance, refers to moving “along the public way,” and though private roads were known in the Roman Empire, the “main roads … were built, maintained and owned by the State” (Jacobson 1940, 103). Toll roads and private roads are still relatively uncommon. As a result, decision-making about roads has been a critical aspect of public discourse for hundreds, if not thousands of years.
In this emerging era of private digital navigational and mapping data and increasingly automated mobility – the metaphorical routers and packets of algorithmically controlled mobility systems – we are confronting the unanticipated privatization of the roads themselves. Though the roads may remain public, that designation could morph into something quite alien relative to our current understanding of mobility, as private interests drape an invisible yet powerful web of algorithmic control over our physical space. Decisions about mobility are being removed from the democratic sphere, and a fundamental restructuring is occurring with little oversight, debate, or explanation. These forms of digital enclosure – creating a digital fence around ostensibly public roads and structuring people’s mobility within the network – deserve attention, so that we preserve those forms of individual and collective mobility that are deemed worth preserving, and balance individual, collective, and private interests in mobility more transparently and democratically.
11.5 Conclusion
The age of digital connectivity and mobile computing has brought massive changes to human movement. Human drivers increasingly delegate navigational decision making to apps, thus automating significant aspects of driving and enabling early forms of mobility shaping. The similarities between traffic shaping on the Internet and mobility shaping on physical roadways provide a starting point for examining the ethical and legal challenges that turn-by-turn navigation systems are raising in the public sphere. Yet, although compelling, the parallels between communication networks and mobility networks are not the whole story. As we move towards ever greater algorithmic shaping of our mobility, we must recognize that our ability to move freely in the physical world engages some of our most fundamental democratic freedoms, and that access to mobility reflects societal values in ways that distinguish it from access to information, demanding a more rigorous investigation of an ethics of mobility that can account for mobility shaping. This chapter hopes to spark those investigations and ensuing debates – now is the time to evaluate the permissibility of different forms of mobility shaping, and to lay the normative foundation for tomorrow’s algorithmically controlled mobility systems.