Recent academic contributions explore the integration of Digital Twins (DTs) within smart Product-Service Systems (sPSS). This integration aims to innovate business propositions, hardware, and services. However, gaps persist in developing DT environments that support early-stage collaborative innovation for sPSS, and few studies explore how real-time synchronized digital replicas enhance value co-creation in this area. This paper addresses this gap by presenting a framework, and a practical example, for integrating value-driven decision support into early sPSS conceptual design. A case study on the development of the Smart Electric Vehicle (SEV), conducted with a global automotive Original Equipment Manufacturer (OEM), demonstrates the framework’s efficacy. Through qualitative analyses of data from an experimental validation at the case company, the DT proves effective in aiding decision makers in selecting value-adding configurations within specific scenarios. Furthermore, the DT serves as a visual decision-making tool, fostering collaboration across diverse teams within the automotive company. This collaboration facilitates value creation across practitioners with varied backgrounds, underscoring the DT’s role in enhancing early-stage innovation and co-creation processes in the sPSS domain.
Often in Software Engineering, a modeling formalism has to support scenarios of inconsistency, in which several requirements either reinforce or contradict each other. Paraconsistent transition systems are proposed in this paper as one such formalism: states evolve through two accessibility relations capturing weighted evidence of a transition or of its absence, respectively. Their weights come, parametrically, from a residuated lattice. This paper explores both (i) a category of these systems, with the corresponding compositional operators, and (ii) a modal logic to reason about them. Furthermore, two notions of simulation and bisimulation, crisp and graded, are introduced in order to relate paraconsistent transition systems. Finally, results of modal invariance for specific subsets of formulas are discussed.
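As a rough illustration of the formalism (not code from the paper), the sketch below represents a paraconsistent transition system with weights from the Gödel lattice on [0, 1]; the class name and the `pos + neg <= 1` consistency convention are assumptions made for the example.

```python
# A minimal sketch of a paraconsistent transition system, assuming the
# Goedel residuated lattice on [0, 1] (meet = min, join = max) as the
# weight space; names and the consistency check are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ParaconsistentTS:
    states: set
    pos: dict = field(default_factory=dict)  # (s, t) -> evidence FOR the transition
    neg: dict = field(default_factory=dict)  # (s, t) -> evidence AGAINST it

    def add(self, s, t, w_pos, w_neg):
        """Record weighted evidence of a transition and of its absence."""
        self.pos[(s, t)] = w_pos
        self.neg[(s, t)] = w_neg

    def is_consistent(self, s, t):
        """One common convention: consistent when the two weights sum to at most 1."""
        return self.pos.get((s, t), 0) + self.neg.get((s, t), 0) <= 1

# A two-state system with one contradictory transition (evidence both ways).
ts = ParaconsistentTS(states={"s0", "s1"})
ts.add("s0", "s1", w_pos=0.8, w_neg=0.6)   # conflicting requirements
print(ts.is_consistent("s0", "s1"))        # False: the evidence is contradictory
```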
In this chapter, we describe how to jointly model continuous quantities, by representing them as multiple continuous random variables within the same probability space. We define the joint cumulative distribution function and the joint probability density function and explain how to estimate the latter from data using a multivariate generalization of kernel density estimation. Next, we introduce marginal and conditional distributions of continuous variables and also discuss independence and conditional independence. Throughout, we model real-world temperature data as a running example. Then, we explain how to jointly simulate multiple random variables, in order to correctly account for the dependence between them. Finally, we define Gaussian random vectors, which are the most popular multidimensional parametric model for continuous data, and apply them to model anthropometric data.
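As a hedged sketch of two of the chapter's tools, the snippet below jointly simulates a Gaussian random vector and fits a multivariate kernel density estimate to the samples; the mean and covariance values are invented stand-ins for the temperature data used in the chapter.

```python
# Simulate a 2-D Gaussian random vector and estimate its joint pdf
# with multivariate kernel density estimation (SciPy assumed).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Jointly simulating the two coordinates preserves the dependence
# encoded in the covariance matrix.
mean = np.array([15.0, 12.0])               # e.g., temperatures at two sites
cov = np.array([[4.0, 3.0],
                [3.0, 4.0]])                # strong positive dependence
samples = rng.multivariate_normal(mean, cov, size=1000)

# Multivariate kernel density estimation of the joint pdf from the data.
kde = gaussian_kde(samples.T)               # expects shape (dim, n_samples)
print(kde([[15.0], [12.0]]))                # estimated joint density at the mean
```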
The European Resuscitation Council (ERC) establishes guidelines for cardiopulmonary resuscitation (CPR) under standard conditions and special circumstances, but it provides no specific instructions for nighttime situations with reduced visibility. The aim of this study was to evaluate the feasibility of performing CPR at night under two different conditions, in darkness with ambient light and with the additional illumination of a headlamp, and to determine the quality of the maneuver.
Methods:
A randomized crossover pilot study involving nineteen lifeguards was conducted, with each participant performing two five-minute CPR tests: one in complete darkness with a headlamp and one in the natural night environment of the beach without additional lighting. Both tests used a 30:2 ratio of chest compressions (CCs) to ventilations with the mouth-to-pocket-mask technique, with a 30-minute break between them. Outcome measures included quality of CPR, number of CCs, mean depth of CCs, mean rate of CCs, and number of effective ventilations. Results were reported as the mean or median difference (MD) between the two conditions, with 95% confidence intervals (CIs), using techniques for paired data.
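For illustration only, a paired mean-difference analysis of the kind reported here could be computed as below; the compression counts are invented, and the paired t interval is one common choice among the paired-data techniques the abstract mentions.

```python
# Mean difference between two paired conditions with a 95% CI from a
# paired t procedure. The per-rescuer compression counts are invented.
import numpy as np
from scipy import stats

with_lamp = np.array([110, 105, 118, 112, 108, 115])      # CCs per rescuer
without_lamp = np.array([102, 100, 109, 104, 101, 107])

diff = without_lamp - with_lamp          # paired differences
md = diff.mean()
se = stats.sem(diff)                     # standard error of the mean difference
ci = stats.t.interval(0.95, df=len(diff) - 1, loc=md, scale=se)
print(f"MD = {md:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```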
Results:
There were no statistically significant differences between the two lighting conditions for the outcomes of CPR quality, mean depth of CCs, or number of effective ventilations. The number of CCs was lower when performed without the headlamp (MD: -8; 95% CI, -15 to 0). In addition, the mean rate of CCs was lower when performed without the headlamp (MD: -3; 95% CI, -5 to -1).
Conclusions:
The rescuers performed CPR at night with good quality, both in darkness and with the illumination of a headlamp. The use of additional lighting with a headlamp does not appear to be essential for conducting resuscitation.
The ARLE GPS tool provides computer-aided design support for solving spatial planning and design problems for houses, using a robust design model with physical-biological and cost strategies. It enables architects to eliminate uncertainties and to make robust decisions by applying computational thinking to decision making and action implementation. This support helps the architect deal with the complexity arising from the interrelationships between the design variables and transforms the spatial planning problem, conventionally conceptualized as ill-defined, into a well-defined one. A scientific method is used, based on mathematical modeling of the action-decision field of geometric design variables, rather than a drawn method involving sketches. The tool acts as an aid mechanism, an assembler, a simulator, and an evaluator of geometric prototypes (virtual or graphical) and can be used to systematize the assembly or modeling of the FPL structure, particularly with respect to the performance required of a house. The candidate solution provided by the tool defines the spatial dimensions of the rooms in the house, the topological data of the assembly sequence, and the connections between rooms. The architect converts this virtual prototype into a graphical FPL prototype, which is then modeled, refined, and evaluated continuously and objectively with the aid of ARLE GPS until a solution is obtained that satisfies the requirements, constraints, and objectives of the problem. In this way, a solution to the problem (i.e., the project) can be captured and generated.
This chapter covers applications of quantum computing in the area of nuclear and particle physics. We cover algorithms for simulating quantum field theories, where end-to-end problems include computing fundamental physical quantities and scattering cross sections. We also discuss simulations of nuclear physics, which encompasses individual nuclei as well as dense nucleonic matter such as neutron stars.
This chapter covers applications of quantum computing in the area of quantum chemistry, where the goal is to predict the physical properties and behaviors of atoms, molecules, and materials. We discuss algorithms for simulating electrons in molecules and materials, including both static properties such as ground state energies and dynamic properties. We also discuss algorithms for simulating static and dynamic aspects of vibrations in molecules and materials.
To enhance the radiological and nuclear emergency preparedness of hospitals while responding to the refugee crisis, the Government of the Republic of Moldova implemented an innovative approach supported by the World Health Organization (WHO). This initiative featured a comprehensive package that integrated a health system assessment, an analysis of existing plans and procedures, and a novel medical training component. The training, based on relevant WHO and International Atomic Energy Agency (IAEA) guidance, combined theory with contemporary adult learning solutions, such as practical skill stations, case reviews, and clinical simulation exercises.
This method allowed participants to identify and address gaps in their emergency response capacities, enhancing their ability to ensure the medical management of radiological and nuclear events. The course is both innovative and adaptable, offering a potential model for other countries seeking to strengthen the radiological and nuclear emergency response capabilities of acute care clinical providers.
Due to the scarcity of data, the demographic regime of pre-plague England is poorly understood. In this article, we review the existing literature to estimate the mean age at first marriage for women (24) and men (27), the remaining life expectancy of men at first marriage (25 years), the mean household size (5.8), and marital fertility around 1300. Based on these values, we develop a macrosimulation that creates a consistent picture of English demography at its medieval population peak, one that reflects a Western European marriage pattern with a comparatively high share of celibates.
This chapter surveys some of the many types of models used in science and some of the many ways scientists use them. Of particular interest for our purposes are the relationships between models and other aspects of scientific inquiry, such as data, experiments, and theories. Our discussion shows important ways in which modeling can be thought of as a distinct and autonomous scientific activity, yet also how models can be crucial for making use of data and theories and for performing experiments. The growing reliance on simulation models has raised new and important questions about the kind of knowledge gained by simulations and the relationship between simulation and experimentation. Is it important to distinguish between simulation and experimentation, and if so, why?
The Monte Carlo method is a powerful approach that provides a numerical estimate of an integral using random samples drawn from a given distribution, and it is perhaps the most widely used numerical method for estimating the price of complex derivative securities. This is because any integral can be written as an expectation of a function of a random variable following a given distribution, and this expectation can be estimated by averaging independent samples drawn from the corresponding distribution, as per the law of large numbers. It follows that any integral, and likewise any expectation or probability, can be estimated arbitrarily well by averaging a sufficiently large number of random samples. Most computational software includes random number generators that draw samples from the uniform distribution. Samples from other distributions can be easily obtained by transforming those uniform samples according to the probability integral transform seen in Chapter 2. The last section briefly introduces control variates and importance sampling, two refinements of the plain Monte Carlo method that aim to provide better estimates at a fixed computational cost.
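A compact sketch of these ideas, assuming NumPy: uniform samples are turned into exponential draws by the probability integral transform, the plain Monte Carlo average estimates an expectation, and a simple control variate (using the known mean of X) reduces its variance. The integrand is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
u = rng.uniform(size=n)

# Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(1-U) ~ Exp(1).
x = -np.log(1.0 - u)

# Plain Monte Carlo estimate of E[exp(-X^2)] under X ~ Exp(1).
f = np.exp(-x**2)
plain = f.mean()

# Control variate: use X itself, whose mean E[X] = 1 is known exactly.
c = np.cov(f, x)[0, 1] / x.var()
cv = (f - c * (x - 1.0)).mean()

print(f"plain MC: {plain:.5f} +/- {f.std() / np.sqrt(n):.5f}")
print(f"with control variate: {cv:.5f}")
```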
Mental imagery can be used to simulate imminent, distant possible, or even impossible futures. Such mental simulation enables people to explore the consequences of different actions they want to perform or the consequences of being in different kinds of situations. Predictive simulation retrieves embodied knowledge but also creates new knowledge because people can compare different simulated scenarios and draw conclusions from that.
This report describes the implementation and evaluation of a unique escape room game/unfolding public health preparedness simulation in nursing education. The innovative approach was designed to teach disease investigation, epidemiological principles, and technical skills such as tuberculosis (TB) skin testing techniques.
Methods
The escape room/unfolding health preparedness simulation was implemented with 29 pre-licensure nursing students and involved game-like activities as well as a realistic disaster simulation scenario with standardized patients.
Results
The project yielded positive outcomes, with students demonstrating increased knowledge and confidence. Students also recommended the simulation for teaching disaster preparedness, highlighting its effectiveness. Evaluation data also suggested refining the simulation's treatment of the nurses' roles.
Conclusions
While implementing this teaching innovation posed challenges, the approach enhanced active learning, critical thinking, and teamwork in nursing education, preparing students for real-world health care challenges. The project underscores the importance of such simulations in training nursing students for public health emergencies. It also highlights the need for further research to assess long-term impacts on student outcomes, indicating the potential for continued improvement and development in the field.
The chapter outlines key principles in Cognitive CDA, which inherits its social theory from CDA and, from cognitive linguistics, a particular view of language and a framework for analysing language (as well as other semiotic modes). In connection with CDA, the chapter describes the dialectical relationship conceived between discourse and society. Key concepts relating to the dialogicality of discourse are also introduced, namely intertextuality and interdiscursivity. The central role of discourse in maintaining power and inequality is described with a focus on the ideological and legitimating functions of language and conceptualisation. In connection with cognitive linguistics, the chapter describes the non-autonomous nature of language, the continuity between grammar and the lexicon, and the experiential grounding of language. The key concept of construal and its implications for ideology in language and conceptualisation are discussed. A framework in which construal operations are related to discursive strategies and domain-general cognitive systems and processes is set out. The chapter closes by briefly introducing the main models and methods of Cognitive CDA.
Aim:
To test educational interventions to increase the quality of care in telemedicine.
Background:
Telemedicine (TM) has become an essential tool to practise medicine around the world. However, education to address clinical skills in TM remains an area of need globally across the health professions. We aim to evaluate the impact of a pilot online learning platform (OLP) and standardized coaching programme on the quality of medical student TM clinical skills.
Methods:
A randomized pilot study was conducted with fourth-year medical students (n = 12). All participants engaged in video-recorded standardized patient (SP) simulated encounters to assess TM clinical skills before and after the intervention. Participants were randomized to either the OLP or the OLP + Virtual Coaching Institute (VCI) intervention cohort. Quantitative and qualitative data were collected on self-reported skills, attitudes, and self-efficacy before the first SP encounter and after the second SP encounter. SP encounter recordings were scored by two blinded non-investigator raters using a standardized rubric to measure the change in TM care delivered pre- and post-intervention. Statistical analysis of quantitative data included descriptive statistics and mixed-effects ANOVA.
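Purely as a hypothetical sketch of the analysis named above, a mixed-effects model for the pre/post rubric scores might look like the following in statsmodels; the column names and the tiny inline dataset are invented for illustration.

```python
# Rubric score modeled by time (pre/post) crossed with cohort (OLP vs.
# OLP+VCI), with a random intercept per participant (statsmodels assumed).
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "time":    ["pre", "post"] * 4,
    "cohort":  ["OLP", "OLP", "OLP+VCI", "OLP+VCI"] * 2,
    "score":   [62, 74, 60, 81, 65, 76, 58, 83],
})

model = smf.mixedlm("score ~ time * cohort", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```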
Findings:
Recruitment and retention of participants exceeded expectations, pointing to significant enthusiasm for this educational opportunity. Self-reported skills and scored simulation skills demonstrated significant improvements for all participants receiving the interventions. Both the OLP and VCI interventions were well received and feasible, and both demonstrated statistically significant efficacy in improving TM clinical skills. Participants who received coaching described greater improvements in self-efficacy, confidence, and overall virtual clinical skills. This study provides evidence that virtualized clinical learning environments can positively impact the development of TM clinical skills among medical students. As TM continues to evolve, the implementation of innovative training approaches will be crucial in preparing the next generation of healthcare professionals for the demands of modern healthcare delivery.
This paper proposes an ordinal generalization of the hierarchical classes model originally proposed by De Boeck and Rosenberg (1988). Any hierarchical classes model implies a decomposition of a two-way two-mode binary array M into two component matrices, called bundle matrices, which represent the association relation and the set-theoretical relations among the elements of both modes in M. Whereas the original model restricts the bundle matrices to be binary, the ordinal hierarchical classes model assumes that the bundles are ordinal variables with a prespecified number of values. This generalization results in a classification model with classes ordered along ordinal dimensions. The ordinal hierarchical classes model is shown to subsume Coombs and Kao's (1955) model for nonmetric factor analysis. An algorithm to fit the model to a given data set is described and subsequently evaluated in an extensive simulation study. An application of the model to student housing data is discussed.
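To make the decomposition concrete, the snippet below reconstructs a data array from two small ordinal bundle matrices via a max-min rule, one natural ordinal generalization of the Boolean product used by the binary model; the matrices are invented for illustration.

```python
import numpy as np

A = np.array([[2, 0],      # row bundle matrix (ordinal values 0..2)
              [1, 2],
              [0, 1]])
B = np.array([[2, 1],      # column bundle matrix
              [0, 2]])

# Ordinal reconstruction rule: m_ij = max_k min(a_ik, b_jk).
M = np.max(np.minimum(A[:, None, :], B[None, :, :]), axis=2)
print(M)
# With 0/1 bundles the same rule reduces to the Boolean product of the
# original binary hierarchical classes model.
```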
The Vale–Maurelli (VM) approach to generating non-normal multivariate data involves the use of Fleishman polynomials applied to an underlying Gaussian random vector. This method has been used extensively in Monte Carlo studies over the last three decades to investigate the finite-sample performance of estimators under non-Gaussian conditions. The validity of conclusions drawn from these studies clearly depends on the range of distributions obtainable with the VM method. We deduce the distribution and the copula of a vector generated by a generalized VM transformation, and show that they are fundamentally linked to the underlying Gaussian distribution and copula. In the process we derive the distribution of the Fleishman polynomial in full generality. While data generated with the VM approach may appear highly non-normal, their truly multivariate properties remain close to the Gaussian case. A Monte Carlo study illustrates that generating data with a different copula from that implied by the VM approach severely degrades the performance of normal-theory-based ML estimates.
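A minimal sketch of a VM-style generation step, assuming NumPy/SciPy: a correlated Gaussian vector is drawn (which, per the paper, fixes the copula) and a Fleishman cubic is applied to each marginal. The cubic coefficients are illustrative rather than solved from Fleishman's moment equations, so the code prints the skewness and kurtosis it actually achieves.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Underlying Gaussian vector -- this is what fixes the copula.
rho = np.array([[1.0, 0.5],
                [0.5, 1.0]])
z = rng.multivariate_normal(np.zeros(2), rho, size=100_000)

# Fleishman transform a + b*Z + c*Z**2 + d*Z**3 applied to each marginal.
b, c, d = 0.90, 0.25, 0.01   # illustrative coefficients, not table values
a = -c                        # centers the transformed variable at zero
x = a + b * z + c * z**2 + d * z**3

print("skewness:", stats.skew(x, axis=0))
print("excess kurtosis:", stats.kurtosis(x, axis=0))
# The marginals are visibly non-normal, yet the dependence structure
# (the copula) remains the Gaussian one, as the paper shows.
```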
Six different algorithms to generate widely different non-normal distributions are reviewed. These algorithms are compared in terms of speed, simplicity and generality of the technique. The advantages and disadvantages of using these algorithms are briefly discussed.
The use of p-values in combining the results of independent studies often involves studies that are potentially aberrant either in quality or in actual values. A robust data analysis suggests the use of a statistic that takes these aberrations into account by trimming some of the largest and smallest p-values. We present a trimmed statistic based on an inverse cumulative normal transformation of the ordered p-values, together with a simple and convenient method for approximating the distribution and first two moments of this statistic.
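A hedged sketch of such a statistic: sort the p-values, trim g from each end, and sum the probit (inverse cumulative normal) transforms of the retained values. The trimming amount and the p-values below are illustrative.

```python
import numpy as np
from scipy.stats import norm

def trimmed_probit_statistic(pvalues, g=1):
    """Sum of probit-transformed p-values after trimming g from each tail."""
    p = np.sort(np.asarray(pvalues))
    kept = p[g:len(p) - g]                  # drop the g smallest and g largest
    return norm.ppf(kept).sum()

pvals = [0.001, 0.02, 0.04, 0.07, 0.30, 0.98]  # one suspiciously large study
print(trimmed_probit_statistic(pvals, g=1))
```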
We give an account of Classical Test Theory (CTT) in terms of the more fundamental ideas of Item Response Theory (IRT). This approach views classical test theory as a very general version of IRT, and the commonly used IRT models as detailed elaborations of CTT for special purposes. We then use this approach to derive some general results on predicting the true score of a test from an observed score on that test, as well as from an observed score on a different test. This leads us to a new view of linking tests that were not developed to be linked to each other. In addition, we propose true-score prediction analogues of the Dorans and Holland measures of the population sensitivity of test linking functions. We illustrate the accuracy of the first-order theory using simulated data from the Rasch model, and illustrate the effect of population differences using a set of real data.
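As an illustrative companion to the simulation mentioned, the sketch below generates Rasch-model responses and compares observed sum scores with model-implied true scores; the abilities, difficulties, and sample sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_persons, n_items = 500, 20

theta = rng.normal(0.0, 1.0, size=n_persons)   # person abilities
b = np.linspace(-2.0, 2.0, n_items)            # item difficulties

# Rasch model: P(X = 1) = 1 / (1 + exp(-(theta - b))).
prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
responses = rng.uniform(size=prob.shape) < prob

observed = responses.sum(axis=1)               # observed sum scores
true_score = prob.sum(axis=1)                  # model-implied true scores
print("corr(observed, true):", np.corrcoef(observed, true_score)[0, 1])
```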