European Union (EU) public opinion research is a rich field of study. However, as citizens often have little knowledge of the EU, the question remains to what extent their attitudes are grounded in coherent, ideologically informed belief systems. As survey research is not well equipped to study this question, this paper explores the value of cognitive mapping (CM) for public opinion research by studying the cognitive maps of 504 Dutch citizens regarding the Eurozone crisis. The paper shows that respondents perceive the Eurozone crisis predominantly as a governmental debt crisis. Moreover, the concept of bureaucracy unexpectedly plays a key role in their belief systems, exerting an ambiguous but overall negative effect on the Eurozone and on trust in the EU. Contrary to expectation, the attitudes of the respondents are more solidly grounded in (ordoliberal) ideology than those of the Dutch elite. Finally, the paper introduces new ways to measure ambivalence, prompting a reevaluation of the significance of different forms of ambivalence and their impact on political behavior. Overall, the results of this study suggest that CM forms a promising addition to the toolbox of public opinion research.
Improving public policies, creating the next generation of AI systems, reducing crime, making hospitals more efficient, addressing climate change, controlling pandemics, and reducing disruption in supply chains are all problems where big picture ideas from analytics science have had large-scale impact. What are those ideas? Who came up with them? Will insights from analytics science help solve even more daunting societal challenges? This book takes readers on an engaging tour of the evolution of analytics science and how it brought together ideas and tools from many different fields – AI, machine learning, data science, OR, optimization, statistics, economics, and more – to make the world a better place. Using these ideas and tools, big picture insights emerge from simplified settings that get at the essence of a problem, leading to superior approaches to complex societal issues. A fascinating read for anyone interested in how problems can be solved by leveraging analytics.
This focused textbook demonstrates cutting-edge concepts at the intersection of machine learning (ML) and wireless communications, providing students with a deep and insightful understanding of this emerging field. It introduces students to a broad array of ML tools for effective wireless system design, and supports them in exploring how future wireless networks can be designed to deploy federated and distributed learning techniques more effectively in support of AI systems. Requiring no previous knowledge of ML, this accessible introduction includes over 20 worked examples demonstrating the use of theoretical principles to address real-world challenges, and over 100 end-of-chapter exercises to cement student understanding, including hands-on computational exercises using Python. Accompanied by code supplements and solutions for instructors, this is the ideal textbook for a single-semester senior undergraduate or graduate course for students in electrical engineering, and an invaluable reference for academic researchers and professional engineers in wireless communications.
Design Neurocognition, a field bridging Design Research and Cognitive Neuroscience, offers new insights into the cognitive processes underlying creative ideation. This study adopts a micro-perspective on design ideation by examining convergent and divergent thinking as its core components. Using 32-channel EEG recordings, it investigates how educational background (Industrial Design Engineering vs. Engineering Design) influences designers’ neural activity (alpha, beta, and gamma frequency bands), behavioral responses, and perceived stress during ideation tasks. Data from forty participants reveal a consistent and meaningful interaction between brain activity, behavior, and self-reported stress, highlighting that educational background significantly modulates cognitive and neural patterns during ideation. Importantly, perceived stress shows strong negative correlations with neural power across all frequency bands, suggesting a close alignment between subjective experience and physiological measures. By integrating neural, behavioral, and psychological data, this study advances the understanding of the neurocognitive mechanisms driving design ideation and establishes a methodological foundation for bridging Design and Cognitive Neuroscience. These findings contribute to building a unified evidence base for future human-centred and neuro-informed design research.
This article argues that the environmental contexts of memory are vulnerable to Artificial Intelligence (AI)-generated distortions. By addressing the broader ecological implications for AI’s integration into society, this article looks beyond a sociotechnical dimension to explore the potential for AI to complicate environmental memory and its role in shaping human–environment relations. First, I address how the manipulation and falsification of memory risks undermining intergenerational transmission of environmental knowledge. Second, I examine how AI-generated blurring of boundaries between real and unreal can lead to collective inaction on environmental challenges. By identifying memory’s central role in addressing environmental crisis, this article places emerging debates on memory in the AI era in direct conversation with environmental discourse and scholarship.
Since 2017, Digital Twins (DTs) have gained prominence in academic research, with researchers actively conceptualising, prototyping, and implementing DT applications across disciplines. The transformative potential of DTs has also attracted significant private sector investment, leading to substantial advancements in their development. However, their adoption in politics and public administration remains limited. While governments fund extensive DT research, their application in governance is often seen as a long-term prospect rather than an immediate priority, hindering their integration into decision-making and policy implementation. This study bridges the gap between theoretical discussions and practical adoption of DTs in governance. Using the Technology Readiness Level (TRL) and Technology Acceptance Model (TAM) frameworks, we analyse key barriers to adoption, including technological immaturity, limited institutional readiness, and scepticism regarding practical utility. Our research combines a systematic literature review of DT use cases with a case study of Germany, a country characterised by its federal governance structure, strict data privacy regulations, and strong digital innovation agenda. Our findings show that while DTs are widely conceptualised and prototyped in research, their use in governance remains scarce, particularly within federal ministries. Institutional inertia, data privacy concerns, and fragmented governance structures further constrain adoption. We conclude by emphasising the need for targeted pilot projects, clearer governance frameworks, and improved knowledge transfer to integrate DTs into policy planning, crisis management, and data-driven decision-making.
Important concepts from the diverse fields of physics, mathematics, engineering and computer science coalesce in this foundational text on the cutting-edge field of quantum information. Designed for undergraduate and graduate students with any STEM background, and written by a highly experienced author team, this textbook draws on quantum mechanics, number theory, computer science technologies, and more, to delve deeply into learning about qubits, the building blocks of quantum information, and how they are used in quantum computing and quantum algorithms. The pedagogical structure of the chapters features exercises after each section as well as focus boxes, giving students the benefit of additional background and applications without losing sight of the big picture. Recommended further reading and answers to select exercises further support learning. Written in approachable and conversational prose, this text offers a comprehensive treatment of the exciting field of quantum information while remaining accessible to students and researchers within all STEM disciplines.
An improved identification algorithm is adopted to calibrate the kinematic parameters of the serial-parallel robot, thereby improving the motion accuracy of the end-effector. Firstly, a kinematic model of the serial-parallel robot is constructed based on the closed-loop vector method. Secondly, a kinematic error model is established by combining geometric error analysis with the vector differential method. Then, with the effective separation of compensable and non-compensable error sources, an identification model of the kinematic parameters is constructed. Finally, an improved pivot element weighted iterative algorithm is used to identify the geometric error parameters. Using actual pose measurements, the calibration process is simulated in MATLAB. The simulation and experimental results show that after kinematic calibration, compared with the traditional least squares method, the improved identification algorithm significantly reduces the end-effector pose error of the serial-parallel robot, thus effectively improving the motion accuracy of the end-effector.
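The abstract does not specify the pivot element weighted iterative algorithm in detail, but the general idea of weighted iterative identification can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes a hypothetical linear error model `pose_error = J @ delta_p` and uses iteratively reweighted least squares, which down-weights outlying measurements that plain least squares would fit.

```python
import numpy as np

# Hypothetical linear error model: pose_error = J @ delta_p + noise,
# with J the identification Jacobian and delta_p the geometric error
# parameters to recover. All names and shapes here are illustrative.
rng = np.random.default_rng(0)
n_poses, n_params = 40, 6
J = rng.normal(size=(n_poses, n_params))
delta_p_true = np.array([0.5, -0.2, 0.1, 0.05, -0.3, 0.2])
pose_error = J @ delta_p_true + 0.01 * rng.normal(size=n_poses)
pose_error[3] += 2.0  # one corrupted pose measurement (outlier)

def identify_irls(J, e, n_iter=20, eps=1e-6):
    """Iteratively reweighted least squares: repeatedly solve a weighted
    least-squares problem, shrinking the weight of large residuals."""
    w = np.ones(len(e))
    p = np.zeros(J.shape[1])
    for _ in range(n_iter):
        W = np.diag(w)
        p, *_ = np.linalg.lstsq(W @ J, W @ e, rcond=None)
        r = np.abs(e - J @ p)
        w = 1.0 / np.maximum(r, eps)  # smaller weight for larger residuals
    return p

p_ls, *_ = np.linalg.lstsq(J, pose_error, rcond=None)  # plain least squares
p_irls = identify_irls(J, pose_error)
print(np.linalg.norm(p_ls - delta_p_true), np.linalg.norm(p_irls - delta_p_true))
```

Because the reweighting approximates an L1 fit, the single corrupted measurement biases the plain least-squares estimate but is effectively ignored by the iterative scheme.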
The escalating complexity of global migration patterns makes evident the limitations of traditional reactive governance approaches and the urgent need for anticipatory, forward-thinking strategies. This Special Collection, “Anticipatory Methods in Migration Policy: Forecasting, Foresight, and Other Forward-Looking Methods in Migration Policymaking,” brings together scholarly works and practitioners’ contributions on the state of the art of anticipatory approaches. It showcases significant methodological evolutions, highlighting innovations ranging from advanced quantitative forecasting using Machine Learning to predict displacement, irregular border crossings, and asylum trends, to rich, in-depth insights generated through qualitative foresight, participatory scenario building, and hybrid methodologies that integrate diverse knowledge forms. The contributions collectively emphasize the power of methodological pluralism, address a spectrum of migration drivers, including conflict and climate change, and critically examine the opportunities, ethical imperatives, and governance challenges associated with novel data sources, such as mobile phone data. By focusing on translating predictive insights and foresight into actionable policies and humanitarian action, this collection aims both to advance academic discourse and to provide tangible guidance for policymakers and practitioners. It underscores the importance of navigating inherent uncertainties and strengthening ethical frameworks to ensure that innovations in anticipatory migration policy enhance preparedness and resource allocation and uphold human dignity in an era of increasing global migration.
This study presents an innovative system for upper limb rehabilitation, combining a variable stiffness device, the ReHArm prototype, with a dynamic and engaging user interface, known as Arms Rehabilitation Management System. The proposed system offers a highly customisable approach to rehabilitation, ensuring real-time adaptability to patients’ specific needs while maintaining compactness and ease of use. Key features include a modular design allowing precise stiffness adjustments, a robust control architecture, and interactive rehabilitation phases designed to enhance user engagement. Extensive multidisciplinary analyses, including kinematic, dynamic, and structural evaluations, demonstrate the system’s ability to improve therapy effectiveness through tailored interaction and feedback. Validation tests demonstrated the prototype’s reliability and robustness, and initial usability assessments suggest its potential to improve rehabilitation outcomes. Further clinical studies involving patients will be necessary to fully evaluate its therapeutic effectiveness.
Product configuration is a successful application of answer set programming (ASP). However, challenges are still open for interactive systems to effectively guide users through the configuration process. The aim of our work is to provide an ASP-based solver for interactive configuration that can deal with large-scale industrial configuration problems and that supports intuitive user interfaces (UIs) via an application programming interface (API). In this paper, we focus on improving the performance of automatically completing a partial configuration. Our main contribution enhances the classical incremental approach for multi-shot solving by four different smart expansion functions. The core idea is to determine and add specific objects or associations to the partial configuration by exploiting cautious and brave consequences before checking for the existence of a complete configuration with the current objects in each iteration. This approach limits the number of costly unsatisfiability checks and reduces the search space, thereby improving solving performance. In addition, we present a UI that uses our API and is implemented in ASP.
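The role of cautious and brave consequences in such a configurator can be illustrated with a toy sketch (not the authors' solver or API). Given the answer sets of a program, the cautious consequences are the atoms appearing in every answer set, and the brave consequences are those appearing in at least one; an interactive system can safely auto-add cautious atoms to a partial configuration and prune options outside the brave consequences. The atom names below are invented for illustration.

```python
# Hypothetical answer sets of a small configuration program.
answer_sets = [
    {"frame", "wheel_a", "motor_x"},
    {"frame", "wheel_a", "motor_y"},
    {"frame", "wheel_b", "motor_y"},
]

cautious = set.intersection(*answer_sets)  # atoms forced in every configuration
brave = set.union(*answer_sets)            # atoms still possible in some configuration

print(sorted(cautious))           # safe to add to any partial configuration
print(sorted(brave - cautious))   # the genuinely open choices
```

Adding cautious atoms before running a full satisfiability check is what lets the smart expansion functions avoid many costly unsatisfiability checks.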
Answer Set Programming (ASP) is a successful method for solving a range of real-world applications. Despite the availability of fast ASP solvers, computing answer sets demands significant computational resources, since the problem tackled is on the second level of the polynomial hierarchy. Answer set computation can be accelerated if the program is split into two disjoint parts, bottom and top. The bottom part is then evaluated independently of the top part, and the results of the bottom evaluation are used to simplify the top part. Lifschitz and Turner introduced the concept of a splitting set, that is, a set of atoms that defines such a split.
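As a toy sketch (not from the paper), splitting can be illustrated on a definite, negation-free program, where evaluation reduces to least-model computation. The program and splitting set below are invented for illustration: the bottom part is evaluated first, and its least model is used to simplify the bodies of the top rules.

```python
# Rules are (head, body) pairs of a definite program.
program = [
    ("a", []),            # fact: a.
    ("b", ["a"]),         # b :- a.
    ("d", []),            # fact: d.
    ("c", ["b", "d"]),    # c :- b, d.   (top part)
]
U = {"a", "b", "d"}       # a splitting set: no bottom rule mentions c

bottom = [(h, b) for h, b in program if h in U]
top = [(h, b) for h, b in program if h not in U]

def least_model(rules):
    """Fixpoint computation of the least model of a definite program."""
    model, changed = set(), True
    while changed:
        changed = False
        for h, body in rules:
            if h not in model and all(a in model for a in body):
                model.add(h)
                changed = True
    return model

bottom_model = least_model(bottom)
# Simplify the top part: drop satisfied bottom atoms from rule bodies;
# a rule whose body contains an unsatisfied bottom atom is deleted.
simplified_top = [
    (h, [a for a in body if a not in bottom_model])
    for h, body in top
    if all(a in bottom_model or a not in U for a in body)
]
answer_set = bottom_model | least_model(simplified_top)
print(sorted(answer_set))
```

For programs with negation the simplification step is more involved, but the division of labor between bottom and top is the same.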
In a previous paper, the notion of g-splitting sets, which generalizes the concept of splitting sets for disjunctive logic programs, was introduced. In this paper, we further investigate the topic of splitting sets and g-splitting sets. We show that the set inclusion problem for splitting sets can be reduced to a classic search problem and solved in polynomial time. We also show that the task of computing g-splitting sets with desirable properties is straightforward. Finally, we show that stable models can be decomposed into models of rules inspired by g-splitting sets and models of the rest of the program. This property can assist in the incremental computation of stable models.
With increasing age, many elderly individuals lose the ability to stand up normally. To address this problem, a knee exoskeleton is designed. The knee joint is designed as a variable stiffness structure. It can adjust its stiffness according to the body’s movement state, ensuring precise assistance while also enhancing human comfort. The variable stiffness mechanism consists of an elastic output actuator and a stiffness-adjusting actuator. The elastic output actuator is mainly responsible for the output of the joint torque. The stiffness-adjusting actuator is mainly responsible for adjusting the joint stiffness. These two mechanisms are analysed separately. Based on their relationship with the whole mechanism, a stiffness model of the entire knee joint is established. Experiments are subsequently conducted to evaluate the variable stiffness joint. The stiffness identification experiment indicates that the actual stiffness of the whole knee joint is essentially consistent with the theoretical value. The trajectory tracking experiment demonstrates that the joint exhibits excellent trajectory tracking capability, although joint stiffness has some effect on tracking performance. The exoskeleton assistive effect experiment demonstrates the ability of the exoskeleton to assist in standing. Additionally, the experiment on subjects wearing exoskeletons of different stiffnesses determines the impact of stiffness on human comfort.
We prove that determining the weak saturation number of a host graph $F$ with respect to a pattern graph $H$ is computationally hard, even when $H$ is the triangle. Our main tool establishes a connection between weak saturation and the shellability of simplicial complexes.
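For readers outside the area, the standard definition that the abstract relies on (commonly attributed to Bollobás) can be stated as follows; the abstract itself does not restate it.

```latex
% A spanning subgraph $G \subseteq F$ is weakly $H$-saturated in $F$ if the
% edges of $E(F) \setminus E(G)$ can be ordered $e_1, \dots, e_k$ so that
% adding them to $G$ one at a time always creates a new copy of $H$.
\[
  \mathrm{wsat}(F, H)
  \;=\;
  \min\bigl\{\, |E(G)| \;:\; G \text{ is weakly } H\text{-saturated in } F \,\bigr\}.
\]
```

The hardness result thus concerns computing this minimum for a given host graph $F$, even in the classical case $H = K_3$.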
We define the category of polynomial functors by introducing its morphisms, called dependent lenses or lenses for short, and we show how they model interaction protocols. We introduce several methods for working with these lenses, including visual tools such as corolla forests and polybox pictures. We explain how these lenses represent bidirectional communication between polynomials and describe how they compose. By the end of the chapter, readers will have a comprehensive understanding of how polynomial functors and their morphisms can be used to model complex interactive behaviors.
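The shape of a dependent lens and its composition rule can be sketched in a few lines of Python. This is a minimal illustration under our usual presentation, not the book's own notation: a polynomial is presented by a set of positions, each with a set of directions, and a lens maps positions forward and directions backward.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class Poly:
    directions: Dict[Any, frozenset]  # position -> directions at that position

@dataclass
class Lens:
    """A dependent lens p -> q: positions map forward, directions map back."""
    src: Poly
    tgt: Poly
    on_pos: Callable[[Any], Any]        # p-position -> q-position
    on_dir: Callable[[Any, Any], Any]   # (p-position, q-direction) -> p-direction

def compose(f: Lens, g: Lens) -> Lens:
    """The composite lens (f then g) : f.src -> g.tgt."""
    assert f.tgt is g.src
    return Lens(
        src=f.src,
        tgt=g.tgt,
        on_pos=lambda i: g.on_pos(f.on_pos(i)),
        # a direction of g.tgt flows back through g, then back through f
        on_dir=lambda i, d: f.on_dir(i, g.on_dir(f.on_pos(i), d)),
    )
```

Reading positions as outputs (questions posed) and directions as inputs (responses received) is what makes lenses a model of bidirectional interaction protocols.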
We review concepts and properties of the category of sets, and of endofunctors on it, that are relevant to our work. We discuss representable functors on the category of sets, introducing our exponential notation for them, and we state and prove the Yoneda lemma for these with the help of an exercise. We then examine sums (or coproducts) and products of sets and functions through the language of indexed families of sets. In particular, we characterize products of sets in terms of dependent functions, generalizing functions by allowing their codomains to vary depending on their inputs. We study nested sums and products of sets, explaining how distributivity allows us to expand products of sums of sets. By lifting all of this material to endofunctors on the category of sets, and using the fact that its limits and colimits are computed pointwise, we set ourselves up to introduce polynomial functors as sums of representable functors in the next chapter. Throughout the chapter, we emphasize key categorical principles and provide detailed explanations to ensure solid comprehension of these fundamental ideas.
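The characterization of products in terms of dependent functions can be made concrete for finite sets. As a small sketch (the indexed family below is invented for illustration): an element of the product of a family $(X_i)_{i \in I}$ is a choice function assigning to each index $i$ some element of $X_i$, so the product of a family with fibers of sizes 2 and 3 has exactly $2 \times 3 = 6$ elements.

```python
from itertools import product

# An indexed family of sets: index set I = {"i", "j"}, with X_i = {0, 1}
# and X_j = {"a", "b", "c"}.
family = {"i": {0, 1}, "j": {"a", "b", "c"}}

def dependent_functions(family):
    """All choice functions f with f(k) in family[k] for each index k."""
    keys = sorted(family)
    return [dict(zip(keys, values))
            for values in product(*(sorted(family[k]) for k in keys))]

funcs = dependent_functions(family)
print(len(funcs))  # 2 * 3 = 6
```

Ordinary (non-dependent) functions are the special case where every fiber is the same set, which is also how the exponential notation for representable functors is read.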