Many mission-critical systems today have stringent timing requirements. For cyber-physical systems (CPS) that interact directly with real-world entities in particular, violating timing constraints may cause accidents and damage, or endanger life, property, or the environment. To ensure the timely execution of time-sensitive software, a suitable system architecture is essential. This paper proposes a novel conceptual system architecture based on well-established technologies, including transition systems, process algebras, Petri nets, and time-triggered communications (TTC). The architecture for time-sensitive software execution is described as a conceptual model, backed by an extensive list of references, and opens up several additional research topics. The paper focuses on the conceptual level and defers implementation issues to further research and subsequent publications.
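The paper deliberately defers implementation, so the following is only a minimal, hedged sketch of the general time-triggered idea it builds on: tasks are dispatched at statically assigned offsets within a repeating cycle rather than in response to runtime events. All task names, periods, and offsets below are hypothetical.

```python
import time

# Illustrative time-triggered dispatch table (not the paper's architecture).
# Each entry is (offset_ms, task): tasks run at fixed offsets within a
# repeating major cycle, so timing is decided offline, not by runtime events.

MAJOR_CYCLE_MS = 100  # hypothetical cycle length

def read_sensors():    print("read sensors")
def compute_control(): print("compute control")
def send_actuators():  print("send actuator commands")

SCHEDULE = [(0, read_sensors), (20, compute_control), (40, send_actuators)]

def run(cycles=3):
    start = time.monotonic()
    for c in range(cycles):
        cycle_start = start + c * MAJOR_CYCLE_MS / 1000.0
        for offset_ms, task in SCHEDULE:
            # Wait for the statically assigned slot, then dispatch the task.
            while time.monotonic() < cycle_start + offset_ms / 1000.0:
                time.sleep(0.001)
            task()

if __name__ == "__main__":
    run()
```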
Thermal integrity profiling (TIP) is a nondestructive testing technique that takes advantage of the concrete heat of hydration (HoH) to detect inclusions during the casting process. The method is becoming more popular due to its ease of application, as it can be used to predict defects in most concrete foundation structures while requiring only the monitoring of temperatures. Despite its advantages, challenges remain in data interpretation and analysis, as temperature is only known at discrete points within a given cross-section. This study introduces a novel method for the interpretation of TIP readings using neural networks. Training data are obtained through numerical finite element simulation spanning an extensive range of soil, concrete, and geometrical parameters. The developed algorithm first classifies concrete piles, establishing the presence or absence of defects. This is followed by a regression algorithm that predicts the defect size and its location within the cross-section. In addition, the regression model provides reliable estimates for the reinforcement cage misalignment and concrete hydration parameters. To make these predictions, the proposed methodology requires only temperature data in the standard TIP format, so it can be seamlessly incorporated into existing TIP workflows. This work demonstrates the applicability and robustness of machine learning algorithms in enhancing nondestructive TIP testing of concrete foundations, thereby improving the safety and efficiency of civil engineering projects.
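As a rough, hedged illustration of the two-stage pipeline described above (a defect classifier followed by a regressor for defect size and location), a scikit-learn sketch might look like the following. The sensor layout, array names, synthetic data, and network sizes are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

# Hypothetical training data: each row holds the temperatures recorded at the
# discrete sensor locations of one cross-section (the standard TIP format).
rng = np.random.default_rng(0)
X = rng.normal(55.0, 5.0, size=(1000, 8))             # 8 sensor temperatures (degC)
has_defect = rng.integers(0, 2, size=1000)             # 1 if a defect is present
defect_props = rng.uniform(0.0, 1.0, size=(1000, 3))   # size, radius, angle (normalised)

# Stage 1: classify cross-sections as defective or intact.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, has_defect)

# Stage 2: regress defect size and position, trained on defective samples only.
mask = has_defect == 1
reg = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
reg.fit(X[mask], defect_props[mask])

# Inference on a new cross-section: classify first, then locate the defect.
x_new = X[:1]
if clf.predict(x_new)[0] == 1:
    print("defect (size, radial position, angle):", reg.predict(x_new)[0])
else:
    print("no defect predicted")
```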
With the rapid advancements in robotics and autonomous driving, SLAM (simultaneous localization and mapping) has become a crucial technology for real-time localization and map creation, seeing widespread application across various domains. However, SLAM’s performance in dynamic environments is often compromised due to the presence of moving objects, which can introduce errors and inconsistencies in localization and mapping. To overcome these challenges, this paper presents a visual SLAM system that employs dynamic feature point rejection. The system leverages a lightweight YOLOv7 model for detecting dynamic objects and performing semantic segmentation. Additionally, it incorporates optical flow tracking and multiview geometry techniques to identify and eliminate dynamic feature points. This approach effectively mitigates the impact of dynamic objects on the SLAM process, while maintaining the integrity of static feature points, ultimately enhancing the system’s robustness and accuracy in dynamic environments. Finally, we evaluate our method on the TUM RGB-D dataset and in real-world scenarios. The experimental results demonstrate that our approach significantly reduces both the root mean square error (RMSE) and standard deviation (Std) compared to the ORB-SLAM2 algorithm.
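A hedged sketch of the kind of dynamic feature-point rejection described above follows: feature points falling inside detected dynamic-object regions are discarded, and the remainder are checked against the epipolar constraint implied by multi-view geometry. Variable names, thresholds, and the box-based masking are assumptions, not the authors' system.

```python
import numpy as np

# Illustrative dynamic feature-point rejection (not the authors' code).
# Assumptions: `keypoints` are Nx2 pixel coordinates in the current frame,
# `prev_points` their optical-flow correspondences in the previous frame,
# `dyn_boxes` are detector boxes (x1, y1, x2, y2) for dynamic classes, and
# `F` is a fundamental matrix estimated from static matches.

def in_dynamic_box(pt, boxes):
    return any(x1 <= pt[0] <= x2 and y1 <= pt[1] <= y2 for x1, y1, x2, y2 in boxes)

def epipolar_error(p_prev, p_curr, F):
    # Distance of the current point from the epipolar line of its match.
    l = F @ np.array([p_prev[0], p_prev[1], 1.0])
    return abs(l @ np.array([p_curr[0], p_curr[1], 1.0])) / np.hypot(l[0], l[1])

def filter_static(keypoints, prev_points, dyn_boxes, F, thresh_px=2.0):
    keep = []
    for p_prev, p_curr in zip(prev_points, keypoints):
        if in_dynamic_box(p_curr, dyn_boxes):
            continue                      # inside a detected dynamic object
        if epipolar_error(p_prev, p_curr, F) > thresh_px:
            continue                      # violates the multi-view geometry check
        keep.append(p_curr)
    return np.array(keep)
```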
The assessment of soil–structure interaction (SSI) under dynamic loading conditions remains a challenging task due to the complexity of modeling this system and the interplay of SSI effects, which are also subject to uncertainties across varying loading scenarios. This field of research encompasses a wide range of engineering structures, including underground tunnels. In this study, a surrogate model based on a regression ensemble has been developed for real-time assessment of underground tunnels under dynamic loads. The surrogate model utilizes synthetic data generated using Latin hypercube sampling, significantly reducing the required dataset size while maintaining accuracy. The synthetic dataset is constructed using an accurate numerical model that integrates the two-and-a-half-dimensional singular boundary method for modeling wave propagation in the soil with the finite element method for structural modeling. This hybrid approach allows for a precise representation of the dynamic interaction between tunnels and the surrounding soil. The validation and optimization algorithms are evaluated for two problems: underground railway tunnels with circular and rectangular cross-sections, both embedded in a homogeneous full-space medium. Both geometrical and material characteristics of the underground tunnel are incorporated into the optimization process. The optimization target is to minimize elastic wave propagation in the surrounding soil. The results demonstrate that the proposed optimization framework, which combines the Bayesian optimization algorithm with surrogate models, effectively explores trade-offs among multiple design parameters. This enables the design of underground railway tunnels that achieve an optimal balance between elastic wave propagation performance, material properties, and geometric constraints.
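A minimal, hedged sketch of the surrogate-modeling idea follows: Latin hypercube samples of the design parameters feed an expensive model, a regression ensemble is fitted to the results, and the cheap surrogate is then searched. The parameter names, bounds, placeholder model, and plain candidate search below are assumptions; the paper uses the 2.5D singular boundary method/FEM model and Bayesian optimization instead.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.ensemble import GradientBoostingRegressor

# Placeholder for the expensive wave-propagation response of one tunnel design
# (stands in for the 2.5D singular boundary method coupled with FEM).
def expensive_model(x):
    lining_thickness, youngs_modulus, tunnel_depth = x
    return np.sin(lining_thickness) + 0.1 * youngs_modulus - 0.05 * tunnel_depth

bounds = np.array([[0.2, 0.8], [20.0, 60.0], [10.0, 40.0]])  # hypothetical ranges

# Latin hypercube sampling keeps the synthetic dataset small but space-filling.
sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=200), bounds[:, 0], bounds[:, 1])
y = np.array([expensive_model(x) for x in X])

# Regression-ensemble surrogate fitted to the synthetic dataset.
surrogate = GradientBoostingRegressor(random_state=0).fit(X, y)

# Minimise the surrogate over a dense candidate set (stand-in for Bayesian optimisation).
cands = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(n=5000), bounds[:, 0], bounds[:, 1])
best = cands[np.argmin(surrogate.predict(cands))]
print("design minimising the surrogate response:", best)
```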
The coupling of the disruptive processes of digitalization and the green transformation in a so-called “Twin Transformation” is already being considered a strategic step within the European Union and is discussed in the academic sphere. Strategically, this coupling is necessary and meaningful to realize synergies and to avoid counterproductive effects, such as rebound effects or lock-in effects, particularly given the time constraints imposed by climate change. The European data strategy not only calls for the establishment of various data spaces, such as the data space for the European Green New Deal, but also for the opening, integration, and utilization of European data by stakeholders from administration, business, and civil society. Against this background, it is argued that administrative informatics as a discipline could be integrated as an additional analytical perspective into the political science heuristic of the policy cycle. This integration offers substantial added value for analyzing and shaping the policy processes of the European Green transformation. Moreover, this heuristic approach enables the ex-ante prediction of changes in policymaking based on the theories, models, methods, and application areas of administrative informatics. Building on this premise, this article provides insights into the application of the proposed heuristic using the example of the European Green transformation and analyzes the resulting implications for the analysis of policymaking in an increasingly digitalized public administration.
We initiate a study of large deviations for block model random graphs in the dense regime. Following [14], we establish a large deviation principle (LDP) for dense block models, viewed as random graphons. As an application of our result, we study upper tail large deviations for homomorphism densities of regular graphs. We identify the existence of a ‘symmetric’ phase, where the graph, conditioned on the rare event, looks like a block model with the same block sizes as the generating graphon. In specific examples, we also identify the existence of a ‘symmetry breaking’ regime, where the conditional structure is not a block model with compatible dimensions. This identifies a ‘reentrant phase transition’ phenomenon for this problem – analogous to the one established for Erdős–Rényi random graphs [13, 14]. Finally, extending the analysis of [34], we identify the precise boundary between the symmetry and symmetry breaking regimes for homomorphism densities of regular graphs and the operator norm on Erdős–Rényi bipartite graphs.
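For orientation, one hedged way to write the kind of upper tail event referred to above is given below; the notation, normalisation, and rate-function formulation are assumptions for illustration, not taken from the paper.

```latex
% One hedged formalisation of the upper tail event (notation assumed): G_n is a
% dense block model random graph with generating graphon W, t(H, .) the
% homomorphism density of a regular graph H, and delta > 0 a fixed excess.
\[
  \mathrm{UT}_n(\delta) \;=\; \bigl\{\, t(H, G_n) \;\ge\; (1+\delta)\, t(H, W) \,\bigr\},
  \qquad
  -\frac{1}{n^2}\log \mathbb{P}\bigl(\mathrm{UT}_n(\delta)\bigr) \;\longrightarrow\; \phi(\delta),
\]
% where phi(delta) is obtained from a graphon variational problem via the LDP;
% the 'symmetric' phase corresponds to block-structured optimisers of that problem.
```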
A graph $H$ is said to be common if the number of monochromatic labelled copies of $H$ in a red/blue edge colouring of a large complete graph is asymptotically minimised by a random colouring in which each edge is equally likely to be red or blue. We extend this notion to an off-diagonal setting. That is, we define a pair $(H_1,H_2)$ of graphs to be $(p,1-p)$-common if a particular linear combination of the density of $H_1$ in red and $H_2$ in blue is asymptotically minimised by a random colouring in which each edge is coloured red with probability $p$ and blue with probability $1-p$. Our results include off-diagonal extensions of several standard theorems on common graphs and novel results for common pairs of graphs with no natural analogue in the classical setting.
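In graphon language, one hedged reading of the definition sketched above is the following; the precise weights of the linear combination are fixed in the paper and are left abstract here.

```latex
% Hedged graphon-language reading of (p, 1-p)-commonness; the weights
% lambda_1, lambda_2 of the linear combination are specified in the paper.
\[
  \lambda_1\, t(H_1, W) \;+\; \lambda_2\, t(H_2, 1 - W)
  \;\ge\;
  \lambda_1\, p^{e(H_1)} \;+\; \lambda_2\, (1-p)^{e(H_2)}
  \qquad \text{for every graphon } W,
\]
% i.e. the combination is asymptotically minimised by the p-random colouring,
% which corresponds to the constant graphon W \equiv p.
```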
Global food security worsened during the COVID-19 pandemic. In Nigeria, food insecurity indicators rose in the first months of the pandemic and then declined slightly but never returned to their pre-pandemic levels. We assess whether savings groups provided households with coping mechanisms during COVID-19 in Nigeria by combining the in-person LSMS-ISA/GHS-2018/19 with four rounds of the Nigerian Longitudinal Phone Survey collected during the first year of the pandemic. A quasi-difference-in-differences setup leveraging the panel nature of the data indicates that savings group membership reduces the likelihood of skipping a meal but finds no statistically significant effect on the likelihood of running out of food or eating fewer kinds of food. Given theoretical priors and other literature positing a relationship, we also implement an OLS regression analysis controlling for baseline values, finding that having at least one female household member in a savings group is associated with a 5–15% reduction in the likelihood of reporting skipping meals, running out of food, and eating fewer kinds of food. This analysis is not able to establish causality, however, and may in fact overestimate the effects. Together, the results indicate that savings group membership is positively associated with food security after COVID-19, but the causal effect is statistically significant for only one of the three food security indicators. To conclude, given the interest in savings groups and expectations of continued food security shocks, we underscore the importance of collecting better gender-disaggregated longitudinal household data, combined with experimental designs and institutional data on savings groups.
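As a hedged illustration of the kind of quasi-difference-in-differences specification described above (variable names and controls are assumptions, not the authors' notation):

```latex
% Hedged sketch of a standard two-way difference-in-differences specification;
% variable names are illustrative, not the authors' notation.
\[
  y_{it} \;=\; \beta_0 \;+\; \beta_1\,\mathrm{SG}_i \;+\; \beta_2\,\mathrm{Post}_t
  \;+\; \beta_3\,(\mathrm{SG}_i \times \mathrm{Post}_t) \;+\; \gamma' X_{it} \;+\; \varepsilon_{it},
\]
% where y_{it} is a food-insecurity indicator (e.g. skipped a meal) for household i
% in survey round t, SG_i indicates baseline savings-group membership, Post_t marks
% pandemic-period rounds, X_{it} are controls, and beta_3 is the coefficient of interest.
```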
Morphological matrices (MMs) have traditionally been used to generate concepts by combining different means. However, exploring the vast design space resulting from the combinatorial explosion of large MMs is challenging. Additionally, not all alternative means are compatible with one another. At the same time, for a system to achieve long-term success, it must be flexible so that it can easily be changed. Attaining high system flexibility necessitates an elevated compatibility with alternative means of achieving system functions, which further complicates the design space exploration process. To that end, we present an approach that we refer to as multi-objective technology assortment combinatorics. It uses a shortest-path algorithm to rapidly converge on a set of promising design candidates. While this approach can take flexibility into account, it can also consider other quantifiable objectives such as the cost and performance of the system. The efficiency of this approach is demonstrated with a case study from the automotive industry.
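A minimal, hedged sketch of the general idea of searching a morphological matrix with a shortest-path style procedure follows; it treats each function as a layer of a graph and each means as a node, and is a dynamic-programming equivalent of a layered shortest-path search. The matrix, scores, incompatibility pairs, and the restriction to adjacent-row incompatibilities are illustrative simplifications, not the authors' algorithm or data.

```python
# Illustrative morphological matrix: function -> {alternative means: scalar score}.
mm = [
    {"petrol": 3.0, "electric": 5.0, "hybrid": 4.0},   # energy source
    {"manual": 1.0, "automatic": 2.0},                  # transmission
    {"steel": 2.0, "aluminium": 3.5},                    # chassis material
]
incompatible = {("electric", "manual")}                  # hypothetical constraint

def best_concept(mm, incompatible):
    # paths[means] = (total score, chosen means so far) for the current row.
    # Only incompatibilities between adjacent rows are handled in this sketch.
    paths = {m: (c, [m]) for m, c in mm[0].items()}
    for row in mm[1:]:
        nxt = {}
        for m, c in row.items():
            feasible = [
                (cost + c, sel + [m])
                for prev, (cost, sel) in paths.items()
                if (prev, m) not in incompatible and (m, prev) not in incompatible
            ]
            if feasible:
                nxt[m] = min(feasible, key=lambda t: t[0])
        paths = nxt
    return min(paths.values(), key=lambda t: t[0])

score, concept = best_concept(mm, incompatible)
print("best concept:", concept, "score:", score)
```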
During the past few decades, the gradual merger of Discrete Geometry and the newer discipline of Computational Geometry has provided enormous impetus to mathematicians and computer scientists interested in geometric problems. This 2005 volume, which contains 32 papers on a broad range of topics of interest in the field, is an outgrowth of that synergism. It includes surveys and research articles exploring geometric arrangements, polytopes, packing, covering, discrete convexity, geometric algorithms and their complexity, and the combinatorial complexity of geometric objects, particularly in low dimension. There are points of contact with many applied areas such as mathematical programming, visibility problems, kinetic data structures, and biochemistry, as well as with algebraic topology, geometric probability, real algebraic geometry, and combinatorics.
Participatory Design – an iterative, flexible design process that closely involves stakeholders, often end users – is growing in use across design disciplines. As more practitioners use Participatory Design (PD), it has become less rigidly defined, with stakeholders engaged to varying degrees through disjointed techniques. This ambiguity can be counterproductive when discussing PD processes. We performed a systematic literature review that builds shared, foundational knowledge of PD processes and techniques while also summarizing the state of PD research in the field, as a first step in supporting richer understandings of how best to equitably engage with stakeholders. We found that a majority of the PD literature examined specific case studies of PD, with the design of intangible systems representing the most common design context. Stakeholders most often participated throughout multiple stages of a design process, were recruited in a variety of ways, and engaged in several of the 14 specific participatory techniques identified. Our findings also identify leverage points for creators of PD processes and how these leverage points impact design equity, including: (1) emergent versus predetermined processes; (2) direct versus indirect participation; (3) early versus late participation; (4) one-time versus iterative participation; and (5) singular versus multiple PD techniques.
Heating, Ventilation, and Air Conditioning (HVAC) systems are major energy consumers in buildings, challenging the balance between efficiency and occupant comfort. While prior research explored generative AI for HVAC control in simulations, real-world validation remained scarce. This study addresses this gap by designing, deploying, and evaluating “Office-in-the-Loop,” a novel cyber-physical system leveraging generative AI within an operational office setting. Capitalizing on multimodal foundation models and Agentic AI, our system integrates real-time environmental sensor data (temperature, occupancy, etc.), occupants’ subjective thermal comfort feedback, and historical context as input prompts for the generative AI to dynamically predict optimal HVAC temperature setpoints. Extensive real-world experiments demonstrate significant energy savings (up to 47.92%) while simultaneously improving comfort (up to 26.36%) compared to baseline operation. Regression analysis confirmed the robustness of our approach against confounding variables like outdoor conditions and occupancy levels. Furthermore, we introduce Data-Driven Reasoning using Agentic AI, finding that prompting the AI for data-grounded rationales significantly enhances prediction stability and enables the inference of system dynamics and cost functions, bypassing the need for traditional reinforcement learning paradigms. This work bridges simulation and reality, showcasing generative AI’s potential for efficient, comfortable building environments and indicating future scalability to large systems like data centers.
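A hedged sketch of the prompt-based setpoint loop described above follows. The sensor fields, prompt wording, and the `call_llm` stub are assumptions; the "Office-in-the-Loop" system's actual model, prompts, and agentic components are not reproduced here.

```python
import json

def call_llm(prompt: str) -> str:
    # Placeholder for a multimodal foundation-model call; returns JSON text.
    return '{"setpoint_c": 25.0, "rationale": "warm outdoors, low occupancy"}'

def next_setpoint(sensors: dict, feedback: list, history: list) -> float:
    # Assemble sensor readings, comfort feedback, and history into one prompt,
    # and ask for a data-grounded rationale alongside the setpoint.
    prompt = (
        "You control an office HVAC system. Propose a temperature setpoint in Celsius.\n"
        f"Current sensors: {json.dumps(sensors)}\n"
        f"Occupant comfort feedback: {feedback}\n"
        f"Recent setpoints and outcomes: {json.dumps(history[-5:])}\n"
        "Reply as JSON with keys 'setpoint_c' and 'rationale' grounded in the data."
    )
    reply = json.loads(call_llm(prompt))
    return float(reply["setpoint_c"])

setpoint = next_setpoint(
    {"indoor_c": 26.4, "outdoor_c": 31.0, "occupancy": 6},
    ["slightly warm"],
    [{"setpoint_c": 25.5, "energy_kwh": 3.2}],
)
print("proposed setpoint:", setpoint)
```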
In this editorial, we draw insights from a special collection of peer-reviewed papers investigating how new data sources and technology can enhance peace. The collection examines local and global practices that strive towards positive peace through the responsible use of frontier technologies. In particular, some articles in the collection illustrate how advanced techniques—including machine learning, network analysis, specialised text classifiers, and large-scale predictive analytics—can deepen our understanding of conflict dynamics by revealing subtle interdependencies and patterns. Others assess innovative approaches that reinterpret peace as a relational phenomenon. Collectively, the articles assess ethical, technical, and governance challenges while advocating balanced frameworks that ensure accountability alongside innovation. The collection offers a practical roadmap for integrating technical tools into peacebuilding to foster resilient societies and non-violent conflict transformations.
Indoor positioning systems (IPS) are essential for mobile robot navigation in environments where global positioning systems (GPS) are unavailable, such as hospitals, warehouses, and intelligent infrastructure. While current surveys may limit themselves to specific technologies or fail to provide practical application-specific details, this review summarizes IPS developments directed specifically towards mobile robotics. It examines and compares a breadth of approaches spanning non-radio frequency, radio frequency, and hybrid sensor fusion systems, through the lens of performance metrics that include accuracy, delay, scalability, and cost. Distinctively, this work explores emerging innovations, including synthetic aperture radar (SAR), federated learning, and privacy-aware AI, which are reshaping the IPS landscape. The motivation stems from the increasing complexity and dynamic nature of indoor environments, where high-precision, real-time localization is essential for safety and efficiency. This literature review provides a new conceptual, cross-border pathway for research and implementation of IPS in mobile robotics, addressing both technical and application-related challenges in sectors related to healthcare, industry, and smart cities. The findings of the review enable early-career researchers, industry knowledge workers, and stakeholders to pursue the secure societal, human, and economic integration of IPS with AI and IoT through safe expansions and scale-ups.
This study presents an innovative framework to improve the accessibility and usability of collaborative robot programming. Building on previous research that evaluated the feasibility of using a domain-specific language based on behaviour-driven development, this paper addresses the limitations of earlier work by integrating additional features like a drag-and-drop Blockly web interface. The system enables end users to define and execute robot actions with minimal technical knowledge, making it more adaptable and intuitive. Additionally, a gesture-recognition module facilitates multimodal interaction, allowing users to control robots through natural gestures. The system was evaluated through a user study involving participants with varying levels of professional experience and little to no programming background. Results indicate significant improvements in user satisfaction, with the system usability scale overall score increasing from 7.50 to 8.67 out of a maximum of 10 and integration ratings rising from 4.42 to 4.58 out of 5. Participants completed tasks using a manageable number of blocks (5 to 8) and reported low frustration levels (mean: 8.75 out of 100) alongside moderate mental demand (mean: 38.33 out of 100). These findings demonstrate the tool’s effectiveness in reducing cognitive load, enhancing user engagement and supporting intuitive, efficient programming of collaborative robots for industrial applications.
The chapter examines the motivating problem of the dual weak pigeonhole principle (dWPHP) from three perspectives: logical (axiomatization and provability), computational complexity (witnessing), and proof complexity (propositional translation). It also defines strong proof systems and formulates some of their properties.
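For reference, one common textbook formulation of the principle is given below; the exact ranges (2x versus x squared) and the class of functions quantified over vary between presentations, so this should be read as an illustration rather than the chapter's exact axiomatization.

```latex
% One common formulation of the dual (surjective) weak pigeonhole principle for
% a function symbol f; ranges vary between presentations.
\[
  \mathrm{dWPHP}(f):\qquad
  \forall x > 0 \;\; \exists y < 2x \;\; \forall z < x \;\; f(z) \neq y,
\]
% i.e. no function f maps \{0, \dots, x-1\} onto \{0, \dots, 2x-1\}.
```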