As in the asymmetric case (cf. Chapter 9), one can consider the generalization of the Symmetric TSP in which the start and end of the tour are not necessarily identical. Christofides’ algorithm can be generalized to this problem but only yields a 5/3-approximation here.
This chapter contains basic results about this problem and about a further generalization called T-tours; these results will be used in subsequent chapters, where we present better approximation algorithms. One important observation is that the "narrow cuts" of an LP solution have a nice structure.
For unweighted graphs, a 3/2-approximation algorithm can be obtained with the techniques of Chapter 13, or with a simple LP-based approach that we will present in this chapter.
This paper questions how the drive toward introducing artificial intelligence (AI) into all facets of life might endanger certain African ethical values. It argues that two primary values prized in nearly all versions of sub-Saharan African ethics available in the literature may sit in direct opposition to the fundamental motivation behind corporate adoption of AI: Afro-communitarianism grounded in relationality, and human dignity grounded in a normative conception of personhood. The paper offers a unique perspective on AI ethics from the African place, as there is little to no material in the literature that discusses the implications of AI for African ethical values. The paper is divided into two broad sections, focused on (i) describing the values at risk from AI and (ii) showing how the current use of AI undermines these values. In conclusion, I suggest how to prioritize these values in working toward the establishment of an African AI ethics framework.
As in the symmetric case, there are two versions of the Asymmetric TSP and two corresponding LP relaxations. They are related to circulations in digraphs. Using the splitting-off technique again, we show that the two versions are equivalent, and we will present a third equivalent version.
We will also study the integrality ratio of the Asymmetric TSP LPs and show that it is at least 2, even for unweighted graph instances.
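For orientation, one common way to write such a relaxation (a standard formulation from the literature rather than a quotation from this chapter) asks for a nonnegative circulation that crosses every cut at least once:
\[
\min \sum_{e \in E} c(e)\,x_e \quad \text{subject to} \quad x(\delta^+(v)) = x(\delta^-(v)) \ \ \forall v \in V, \qquad x(\delta^+(S)) \ge 1 \ \ \forall \emptyset \neq S \subsetneq V, \qquad x \ge 0.
\]
An integrality ratio of at least 2 then means that the ratio of the cost of an optimum tour to the optimum LP value can come arbitrarily close to 2.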
For NP-hard problems, it is often useful to study relaxations that are easier to solve. In the previous chapter, we already saw two approximation algorithms that started by solving a relaxation: finding a minimum-cost connected spanning subgraph in Christofides’ algorithm and finding a minimum-cost cycle cover in the cycle cover algorithm.
Another kind of relaxation arises by formulating the problem as an integer linear program and dropping the integrality constraints. In this chapter, we will study such linear programming relaxations for Symmetric TSP with Triangle Inequality and Symmetric TSP. These two equivalent versions of the problem give rise to two linear programming relaxations, which turn out to be equivalent as well (by the splitting-off technique). We also study polyhedral descriptions of connectors and T-joins and the integrality ratio of the subtour LP.
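For concreteness, the degree-constrained form of the subtour LP for Symmetric TSP with Triangle Inequality can be written as follows (a standard formulation; the relaxation for the other version is commonly written with only the cut constraints $x(\delta(S)) \ge 2$ and $x \ge 0$):
\[
\min \sum_{e \in E} c(e)\,x_e \quad \text{subject to} \quad x(\delta(v)) = 2 \ \ \forall v \in V, \qquad x(\delta(S)) \ge 2 \ \ \forall \emptyset \neq S \subsetneq V, \qquad 0 \le x_e \le 1 \ \ \forall e \in E.
\]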
In this chapter, we will present an algorithm for the subtour cover problem, which we defined in Chapter 7. This will complete the constant-factor approximation algorithm for the Asymmetric TSP.
The subtour cover problem was introduced (in a slightly different form) by Svensson, Tarnawski, and Végh, who gave a (4,2,1)-algorithm for subtour cover. Traub and Vygen strengthened this to a (3,2,1)-algorithm. Here, we further improve this to a (2,2,1)-algorithm. Our subtour cover algorithm builds on the algorithm for the graph subtour cover problem that we presented in Section 6.2.
As a final result, we obtain a (17+ε)-approximation for the Asymmetric TSP for any fixed ε>0.
Product architecture decisions are made early in the product development process and have far-reaching effects. Unless anticipated through experience or intuition, many of these effects may not be apparent until much later in the development process, making changes to the architecture costly in time, effort, and resources. Many researchers through the years have studied various elements of product architecture and their effects. Using a repeatable process, statements on the effects of architecture strategies are aggregated from a selection of the literature on the topic and stored in a systematic database; this information can then be recalled and presented in the form of a Product Architecture Strategy and Effect (PASE) matrix. PASE matrices allow for the identification, comparison, evaluation, and selection of the most desirable product architecture strategies before expending resources along a specific development path. This paper introduces the PASE Database and matrix and describes their construction and use in guiding design decisions. It also provides metrics for understanding the robustness of the database.
While many exact and approximation algorithms work with a linear programming formulation (often a relaxation), the dual LP often plays a key role in the algorithms and their analysis. In this chapter, we analyze the structure of optimum dual solutions for the classical LP relaxations of the TSP as well as for T-joins, and deduce properties such as laminarity.
Using an efficient uncrossing algorithm and analyzing extreme point solutions, we obtain optimum primal and dual solutions with linear-size support. Since the primal constraints and dual variables correspond to cuts, enumerating all cuts of small value is a useful tool in several algorithms.
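The core of the uncrossing argument is the standard submodularity-type inequality for cut values: for any $x \ge 0$ and any $A, B \subseteq V$,
\[
x(\delta(A)) + x(\delta(B)) \;\ge\; x(\delta(A \cap B)) + x(\delta(A \cup B)).
\]
Hence, if two crossing sets are both tight for constraints of the form $x(\delta(S)) \ge 2$, then their intersection and union are tight as well, so crossing tight sets can be replaced by nested ones; iterating this yields a laminar family of tight sets. (This is the textbook argument, stated here as background rather than as the chapter's exact formulation.)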
An, Kleinberg, and Shmoys were the first to beat Christofides’ algorithm for Path TSP. Their algorithm, which they called Best-of-Many Christofides, is very natural: since an LP solution can be written as a convex combination of spanning trees, we can do parity correction on each of these trees and output the best of the resulting tours. It turns out that this gives a better guarantee than the 5/3 obtained by Christofides’ algorithm.
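In formulas (a standard way to present the scheme, following the literature rather than quoting this chapter): let $x^*$ be an optimum solution to the path version of the subtour LP with endpoints $s$ and $t$. Then $x^*$ lies in the spanning tree polytope and can be decomposed as
\[
x^* \;=\; \sum_i \lambda_i\, \chi^{T_i}, \qquad \lambda_i \ge 0, \quad \sum_i \lambda_i = 1,
\]
where each $T_i$ is a spanning tree. For every $T_i$, one adds a minimum-cost $D_i$-join, where $D_i = \{v : \deg_{T_i}(v) \text{ is odd}\} \,\triangle\, \{s,t\}$ is the set of vertices of wrong parity, and outputs the cheapest of the resulting $s$-$t$-tours.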
In this chapter, we analyze this algorithm and study various follow-up works that have yielded better and better approximation ratios; some of them also apply to general T-tours. This includes a structured decomposition into spanning trees (by Gottschalk and Vygen), Best-of-Many Christofides with lonely edge deletion (by Sebő and van Zuylen), and Traub’s T-tour algorithm.
Society’s most well-intended efforts to solve sustainability challenges have not yet achieved the expected gains due to rebound effects (i.e., negative consequences of interventions arising from induced changes in system behaviour). Rebound effects offset about 40% of potential sustainability gains, yet the potential of design as a key leverage point for preventing rebound effects remains untapped. In this position paper, three fundamental scientific gaps hampering the prevention of rebound effects are discussed: (1) limited knowledge about the rebound effects triggered by efficiency–effectiveness–sufficiency strategies; (2) the influence of the counterintuitive behaviour of complex socio-technical systems in giving rise to rebound effects is not yet understood; and (3) the bounded rationality within design limits the understanding of rebound effects at a broader systemic level. To address these gaps, novel methodologies, simulation models, and strategies are required to enable the design of reboundless interventions (i.e., products, product/service-systems and socio-technical systems that are resilient to rebound effects). Building on the strong foundation of systems and design theory, this position paper argues for the need to bridge the interdisciplinary gap in the interplay of design and rebound effects, qualitative and quantitative models, engineering and social sciences, and theory and practice.
Nigeria has a significant gender gap in financial inclusion, with women disproportionately represented among the financially excluded. Artificial intelligence (AI)-powered financial technologies (fintech) present distinctive advantages for enhancing women’s inclusion, including efficiency gains, reduced transaction costs, and personalized services tailored to women’s needs. Nonetheless, AI harbours a paradox: while it promises to advance financial inclusion, it can also inadvertently perpetuate and amplify gender bias. The critical question is thus: how can AI effectively address the challenges of women’s financial exclusion in Nigeria? Using publicly available data, this research undertakes a qualitative analysis of AI-powered fintech services in Nigeria. Its objective is to understand how innovations in financial services correspond to the needs of potential users such as unbanked or underserved women. The research finds that introducing innovative financial services and technology is insufficient to ensure inclusion. Financial inclusion requires the availability, accessibility, affordability, appropriateness, sustainability, and alignment of services with the needs of potential users, as well as policy-driven strategies that aid inclusion.
After the O(log n)-approximation algorithms for the Asymmetric TSP, the first algorithm to beat the cycle cover algorithm by more than a constant factor was found in 2009 by Asadpour, Goemans, Mądry, Oveis Gharan, and Saberi. Their approach is based on finding a "thin" (oriented) spanning tree and then adding edges to obtain a tour. A major open question is how thin a spanning tree can always be guaranteed to exist.
The O(log n/loglog n)-approximation algorithm by Asadpour et al. samples a random spanning tree from the maximum entropy distribution. To show how this works, we discuss interesting connections between random spanning trees and electrical networks. Some results of this chapter will be used again in Chapters 10 and 11.
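For reference, the key definitions can be stated as follows (standard formulations from the literature, not quotations from this chapter). A spanning tree $T$ is $\alpha$-thin with respect to an LP solution $x$ if
\[
|T \cap \delta(S)| \;\le\; \alpha \cdot x(\delta(S)) \qquad \text{for all } \emptyset \neq S \subsetneq V,
\]
and a $\lambda$-uniform (maximum entropy) distribution samples a spanning tree $T$ with probability proportional to $\prod_{e \in T} \lambda_e$ for suitable edge weights $\lambda_e > 0$.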
This chapter is about the proof of the main payment theorem for hierarchies by Karlin, Klein, and Oveis Gharan, a key piece of their better-than-3/2-approximation algorithm for Symmetric TSP. Because the proof is very long and technical, we will not give a complete proof here but rather focus on explaining the key combinatorial ideas.
This chapter is structured as follows. First, we describe the general proof strategy and prove the theorem in an idealized setting. Then we discuss a few crucial properties of λ-uniform distributions. The following sections focus on the main ideas needed to address the hurdles we ignored in the idealized setting described initially.
Finally, we show how the Karlin–Klein–Oveis Gharan algorithm can be derandomized.
In this chapter and Chapter 8, we describe a constant-factor approximation algorithm for the Asymmetric TSP. Such an algorithm was first devised by Svensson, Tarnawski, and Végh. We present the improved version by Traub and Vygen, with an additional improvement that has not been published before.
The overall algorithm consists of four main components, three of which we will present in this chapter. First, we show that we can restrict attention to instances whose cost function is given by a solution to the dual LP with laminar support and an additional strong connectivity property. Second, we reduce such instances to so-called vertebrate pairs. Third, we will adapt Svensson’s algorithm from Chapter 6 to deal with vertebrate pairs. The remaining piece, an algorithm for subtour cover, will be presented in Chapter 8.
By combining the removable pairing technique presented in Chapter 12 with a new approach based on ear-decompositions and matroid intersection, Sebő and Vygen improved the approximation ratio for Graph TSP from 13/9 to 7/5. We will present this algorithm, which is still the best-known approximation algorithm for Graph TSP, in this chapter.
An interesting feature of this algorithm is that it is purely combinatorial, does not need to solve a linear program, and runs in O(n³) time. To describe the algorithm, we review some matching theory, including a theorem of Frank that links ear-decompositions to T-joins. A slight variant of the Graph TSP algorithm is a 4/3-approximation algorithm for finding a smallest 2-edge-connected spanning subgraph, which was the best known for many years. The proofs will also imply corresponding upper bounds on the integrality ratios.
Improved health data governance is urgently needed due to the increasing use of digital technologies that facilitate the collection of health data and growing demand to use that data in artificial intelligence (AI) models that contribute to improving health outcomes. While most of the discussion around health data governance is focused on policy and regulation, we present a practical perspective. We focus on the context of low-resource government health systems, using first-hand experience of the Zanzibar health system as a specific case study, and examine three aspects of data governance: informed consent, data access and security, and data quality. We discuss the barriers to obtaining meaningful informed consent, highlighting the need for more research to determine how to effectively communicate about data and AI and to design effective consent processes. We then report on the process of introducing data access management and information security guidelines into the Zanzibar health system, demonstrating the gaps in capacity and resources that must be addressed during the implementation of a health data governance policy in a low-resource government system. Finally, we discuss the quality of service delivery data in low-resource health systems such as Zanzibar’s, highlighting that a large quantity of data does not necessarily ensure its suitability for AI development. Poor data quality can be addressed to some extent through improved data governance, but the problem is inextricably linked to the weaknesses of the health system, and therefore data of sufficient quality for AI cannot be obtained through technological or data governance measures alone.
In the literature, there are polarized views regarding the capability of technology to embed societal values. One side of the debate contends that technical artifacts are value-neutral, since values do not attach to inanimate objects. Scholars on the other side argue that technologies tend to be value-laden. With the call to embed ethical values in technology, this article explores how AI and other adjacent technologies can be designed and developed to foster social justice. Drawing insights from prior studies, the paper identifies seven African moral values considered central to actualizing social justice; of these, two stand out: respect for diversity and ethnic neutrality. By introducing use case analysis along with the Discovery, Translation, and Verification (DTV) framework and validating it via focus group discussion, this study revealed novel findings: first, ethical value analysis is best carried out alongside software system analysis; second, embedding ethics in technology requires interdisciplinary expertise; third, the DTV approach combined with software engineering methodology provides a promising way to embed moral values in technology. Against this backdrop, the two highlighted ethical values, respect for diversity and ethnic neutrality, help ground the pursuit of social justice.
This article constructs the moduli stack of torsion-free $G$-jet-structures in homotopy type theory with one monadic modality. This yields a construction of the moduli stack for any $\infty$-topos equipped with stable factorization systems.
In the intended applications of this theory, the factorization systems are given by the de Rham stack construction. Homotopy type theory allows a formulation of this abstract theory with surprisingly low complexity, as witnessed by the accompanying formalization of large parts of this work.
The EUMigraTool (EMT) provides short-term and mid-term predictions of asylum seekers arriving in the European Union, drawing on multiple sources of public information and with a focus on human rights. After 3 years of development, it has been tested in real environments by 17 NGOs working with migrants in Spain, Italy, and Greece.
This paper will first describe the functionalities, models, and features of the EMT. It will then analyze the main challenges and limitations of developing such a tool for non-profit organizations, focusing on issues such as (1) the validation process and accuracy and (2) the main ethical concerns, including the challenging exploitation plan when the main target group consists of NGOs.
The overall purpose of this paper is to share the results and lessons learned from the creation of the EMT, and to reflect on the main elements that need to be considered when developing a predictive tool for assisting NGOs in the field of migration.
In the mid to late 19th century, much of Africa was under colonial rule, with the colonisers exercising power over the labour and territory of Africa. Although Africa has largely gained independence from traditional colonial rule, another form of colonial rule still dominates the African landscape. The similarity between these forms of colonialism lies in the power dominance exhibited by Western technological corporations, much like that of the traditional colonialists. In this digital age, digital colonialism manifests in Africa through the control and ownership of critical digital infrastructure by foreign entities, leading to unequal data flows and asymmetrical power dynamics. This usually occurs under the guise of foreign corporations providing technological assistance to the continent.
Drawing on examples from the African continent, this article examines the manifestations of digital colonialism and the factors that aid its occurrence on the continent. It further explores the manifestations of digital colonialism in technologies such as Artificial Intelligence (AI), while analysing the occurrence of data exploitation on the continent and the need for African ownership in cultivating the continent's digital future. The paper also recognises the benefits linked to the use of AI and advocates a cautious approach toward the deployment of AI tools in Africa. It concludes by recommending the implementation of laws, regulations, and policies that guarantee the inclusiveness, transparency, and ethical values of new technologies, with strategies toward achieving a decolonised digital future on the African continent.
Precise pose estimation is crucial for various robots. In this paper, we present a localization method using the correlative scan matching (CSM) technique for indoor mobile robots equipped with 2D-LiDAR, providing precise and fast pose estimation based on a common occupancy map. Our method comprises a pose tracking module and a global localization module. On the one hand, the pose tracking module corrects accumulated odometry errors by CSM within the classical Bayesian filtering framework. A low-pass filter that combines the pose predicted from odometry with the pose corrected by CSM is applied to improve the precision and smoothness of pose tracking. On the other hand, our localization method can autonomously detect localization failures using several designed trigger criteria. Once a localization failure occurs, the global localization module can quickly recover the correct robot pose by leveraging a branch-and-bound method that minimizes the number of CSM-evaluated candidates. Our localization method has been validated extensively in simulated, public dataset-based, and real environments. The experimental results show that the proposed method achieves high-precision, real-time pose estimation and quick pose recovery, and outperforms the other compared methods.
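To illustrate the low-pass fusion step in the pose tracking module, here is a minimal Python sketch. The pose representation, the function names, and the blending weight alpha are assumptions made for illustration; they are not taken from the paper.

import math

def wrap_angle(a):
    # Wrap an angle to (-pi, pi].
    return math.atan2(math.sin(a), math.cos(a))

def low_pass_fuse(odom_pose, csm_pose, alpha=0.8):
    # Blend the odometry-predicted pose with the CSM-corrected pose.
    # Poses are (x, y, theta) tuples; alpha weights the CSM correction
    # (a hypothetical tuning parameter, not the paper's value).
    x = alpha * csm_pose[0] + (1.0 - alpha) * odom_pose[0]
    y = alpha * csm_pose[1] + (1.0 - alpha) * odom_pose[1]
    # Blend orientations via the wrapped angular difference to avoid
    # discontinuities at +/- pi.
    dtheta = wrap_angle(csm_pose[2] - odom_pose[2])
    theta = wrap_angle(odom_pose[2] + alpha * dtheta)
    return (x, y, theta)

# Example: odometry predicts (1.02, 0.48, 0.31); CSM corrects to (1.00, 0.50, 0.30).
print(low_pass_fuse((1.02, 0.48, 0.31), (1.00, 0.50, 0.30)))

Weighting the CSM correction heavily keeps the estimate anchored to the map, while the odometry term smooths out jitter between consecutive scan matches.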