We investigate a Leslie-type prey–predator system with an Allee effect to understand the dynamics of populations under stress. First, we determine stability conditions and conduct a Hopf bifurcation analysis using the Allee constant as the bifurcation parameter. At low densities, we observe that a weak Allee effect induces a supercritical Hopf bifurcation, while a strong effect leads to a subcritical one. Notably, a stability switch occurs, and the system exhibits multiple Hopf bifurcations as the Allee effect varies. Subsequently, we perform a sensitivity analysis to assess the robustness of the model to parameter variations: alongside numerical examples, the FAST (Fourier amplitude sensitivity test) approach is employed to examine the sensitivity of the prey–predator system to all parameter values. This approach identifies which input parameters most influence the output variable and evaluates the impact of single-parameter changes on the dynamics of the system. The combination of detailed bifurcation and sensitivity analyses bridges the gap between theoretical ecology and practical applications. Furthermore, the results underscore the importance of the Allee effect in maintaining the delicate balance between prey and predator populations, and emphasize the necessity of considering complex ecological interactions to accurately model and understand these systems.
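For illustration, a FAST analysis of this kind can be sketched with the SALib package; the model below is a generic Leslie-type system with a strong Allee term, and the parameter names, ranges, initial conditions, and output metric are our assumptions, not those of the paper.

```python
# Minimal FAST sketch (SALib) for a generic Leslie-type prey-predator model
# with an Allee threshold m; all parameter ranges are illustrative assumptions.
import numpy as np
from scipy.integrate import odeint
from SALib.sample import fast_sampler
from SALib.analyze import fast

def leslie_allee(state, t, r, K, m, s, h):
    x, y = state
    dx = r * x * (1 - x / K) * (x - m) - x * y       # prey with Allee threshold m
    dy = s * y * (1 - h * y / max(x, 1e-6))          # Leslie-type predator growth
    return [dx, dy]

problem = {
    "num_vars": 5,
    "names": ["r", "K", "m", "s", "h"],
    "bounds": [[0.5, 2.0], [5.0, 15.0], [0.1, 1.0], [0.1, 1.0], [0.5, 2.0]],
}

params = fast_sampler.sample(problem, 65)            # FAST sampling design
t = np.linspace(0.0, 100.0, 500)
Y = np.array([
    odeint(leslie_allee, [2.0, 1.0], t, args=tuple(p))[-1, 0]  # final prey density
    for p in params
])
Si = fast.analyze(problem, Y)                        # first-order and total indices
print(dict(zip(problem["names"], Si["S1"])))
```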
The aerodynamic performance of an ultra-high-aspect-ratio strut-braced wing design is assessed at cruise. The sensitivity of a selected airframe design from a recent CleanSky2 project to operating conditions around the design point is quantified using the adaptive-cut high-dimensional model representation (HDMR) method, which decomposes the parameter space into smaller subdomains to isolate parameter interactions and their influence on the aerodynamic forces. A comparative analysis with a cantilever wing configuration is performed to identify the role of the strut in the sensitivity of the design. Insight into the transonic performance is gained by characterisation of buffet limits and drag rise. Results show that, for the selected optimised airframe configuration, small changes in freestream parameters can lead to a significant reduction in performance due to drag divergence triggered by the shock waves generated at the strut–wing junction and at the fuselage–strut intersection. Cruise conditions can be achieved without buffet onset throughout much of the parameter space. Safety margins associated with buffeting are satisfied, but sensible limits are imposed on the flight envelope for this configuration.
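For intuition, the non-adaptive core of the method can be sketched as a first-order cut-HDMR: the response is approximated by a constant plus univariate component functions evaluated along axis-parallel cuts through an anchor point. The anchor, grids, and test function below are arbitrary assumptions, and the adaptive refinement used in the paper is omitted.

```python
# First-order cut-HDMR sketch: f(x) ~ f0 + sum_i f_i(x_i), with each component
# evaluated along a "cut" through the anchor point (adaptive refinement omitted).
import numpy as np

def cut_hdmr_first_order(f, anchor, grids):
    f0 = f(anchor)                       # zeroth-order term at the anchor
    components = []
    for i, grid in enumerate(grids):
        fi = []
        for xi in grid:
            x = anchor.copy()
            x[i] = xi                    # vary only coordinate i
            fi.append(f(x) - f0)         # first-order component f_i(x_i)
        components.append(np.array(fi))
    return f0, components

# Toy usage with an arbitrary two-parameter response surface.
f = lambda x: np.sin(x[0]) + 0.1 * x[0] * x[1] ** 2
anchor = np.array([0.5, 0.5])
grids = [np.linspace(0, 1, 11), np.linspace(0, 1, 11)]
f0, comps = cut_hdmr_first_order(f, anchor, grids)
```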
Network meta-analysis (NMA) enables simultaneous assessment of multiple treatments by combining both direct and indirect evidence. While NMAs are increasingly important in healthcare decision-making, challenges remain due to limited direct comparisons between treatments. This data sparsity complicates the accurate estimation of correlations among treatments in arm-based NMA (AB-NMA). To address these challenges, we introduce a novel sensitivity analysis tool tailored for AB-NMA. This study pioneers a tipping point analysis within a Bayesian framework, specifically targeting correlation parameters to assess their influence on the robustness of conclusions about relative treatment effects. The analysis explores changes in the conclusion based on whether the 95% credible interval includes the null value (referred to as the interval conclusion) and the magnitude of point estimates. Applying this approach to multiple NMA datasets comprising 112 treatment pairs, we identified tipping points in 13 pairs (11.6%) for an interval conclusion change and in 29 pairs (25.9%) for a magnitude change at a 15% threshold. These findings underscore potential commonality in tipping points and emphasize the importance of our proposed analysis, especially in networks with sparse direct comparisons or wide credible intervals for correlation estimates. A case study provides a visual illustration and interpretation of the tipping point analysis. We recommend integrating this tipping point analysis as a standard practice in AB-NMA.
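Schematically, the tipping point search can be thought of as refitting the model over a grid of fixed correlation values and checking where the interval conclusion flips. In the sketch below, `fit_ab_nma` is a hypothetical stand-in for the Bayesian AB-NMA refit, assumed to return posterior draws of a relative treatment effect; it is not code from the paper.

```python
# Schematic tipping point search over a fixed correlation parameter rho.
# fit_ab_nma is a hypothetical placeholder for refitting the Bayesian AB-NMA
# with the correlation held at rho, returning posterior draws of the effect.
import numpy as np

def tipping_point(fit_ab_nma, rho_grid, baseline_includes_null):
    for rho in rho_grid:
        draws = fit_ab_nma(rho)                      # posterior draws at this rho
        lo, hi = np.percentile(draws, [2.5, 97.5])   # 95% credible interval
        includes_null = lo <= 0.0 <= hi
        if includes_null != baseline_includes_null:
            return rho                               # interval conclusion flipped
    return None                                      # no tipping point in the grid
```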
Deep geological repositories are critical for the long-term storage of hazardous materials, where understanding the mechanical behavior of emplacement drifts is essential for safety assurance. This study presents a surrogate modeling approach for the mechanical response of emplacement drifts in rock salt formations, utilizing Gaussian processes (GPs). The surrogate model serves as an efficient substitute for high-fidelity mechanical simulations in many-query scenarios, including time-dependent sensitivity analyses and calibration tasks. By significantly reducing computational demands, this approach facilitates faster design iterations and enhances the interpretation of monitoring data. The findings indicate that only a few key parameters are sufficient to accurately reflect in-situ conditions in complex rock salt models. Identifying these parameters is crucial for ensuring the reliability and safety of deep geological disposal systems.
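A minimal surrogate of this kind can be sketched with scikit-learn's Gaussian process regressor; the inputs, kernel, and response below are synthetic placeholders, not the study's rock salt model.

```python
# Minimal GP surrogate sketch (scikit-learn) standing in for a high-fidelity
# mechanical simulator; inputs and outputs are synthetic placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 3))                  # three assumed material parameters
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]        # placeholder for simulated drift response

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.3, 0.3, 0.3])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = rng.uniform(size=(5, 3))
mean, std = gp.predict(X_new, return_std=True)  # cheap prediction with uncertainty
```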
In prognosis studies with time-to-event outcomes, the survival of groups with high/low biomarker expression is often estimated by the Kaplan–Meier method, and the difference between groups is measured by hazard ratios (HRs). Since high/low expression is usually determined by study-specific cutoff values, synthesizing only HRs to summarize the prognostic capacity of a biomarker introduces heterogeneity into the meta-analysis. The time-dependent summary receiver operating characteristic (SROC) curve was proposed as a cutoff-free summary of prognostic capacity, extending the SROC curve used in meta-analyses of diagnostic studies. However, estimates of the time-dependent SROC curve may be threatened by reporting bias, in that studies with significant outcomes, such as HRs, are more likely to be published and selected in meta-analyses. Under this conjecture, this paper proposes a sensitivity analysis method for quantifying and adjusting for reporting bias in the time-dependent SROC curve. We model the publication process as determined by the significance of the HRs and introduce a sensitivity analysis method based on the conditional likelihood constrained by expected proportions of published studies. Simulation studies showed that the proposed method can reduce reporting bias given a correctly specified marginal selection probability. The proposed method is illustrated on a real-world meta-analysis of Ki67 for breast cancer.
Conditioning on variables affected by treatment can induce post-treatment bias when estimating causal effects. Although this suggests that researchers should measure potential moderators before administering the treatment in an experiment, doing so may also bias causal effect estimation if the covariate measurement primes respondents to react differently to the treatment. This paper formally analyzes this trade-off between post-treatment and priming biases in three experimental designs that vary when moderators are measured: pre-treatment, post-treatment, or a randomized choice between the two. We derive nonparametric bounds for interactions between the treatment and the moderator under each design and show how to use substantive assumptions to narrow these bounds. These bounds allow researchers to assess the sensitivity of their empirical findings to priming and post-treatment bias. We then apply the proposed methodology to a survey experiment on electoral messaging.
Why do different models give different results? Which modeling assumptions matter most? These are questions of model influence. Standard regression results fail to address simple questions such as: which control variables are important for getting this result? In this chapter we lay out a framework for thinking about influence and draw on empirical examples to illustrate it. When a result is not fully robust, influence analysis provides methodological explanations for the failure of robustness. These explanations can be considered methodological scope conditions: they explain why a hypothesis can be supported in some cases but not in others. We also show how multiverse results can help inform the method of sensitivity analysis.
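A simple multiverse over control-variable choices can be sketched as follows, using statsmodels; the variable names are placeholders rather than the chapter's data.

```python
# Sketch of a control-variable multiverse: estimate the same focal coefficient
# under every subset of candidate controls, then inspect how it moves.
# Column names are placeholders, not from the chapter.
from itertools import combinations
import pandas as pd
import statsmodels.formula.api as smf

def multiverse(df, outcome, focal, controls):
    rows = []
    for k in range(len(controls) + 1):
        for subset in combinations(controls, k):
            rhs = " + ".join((focal,) + subset)
            fit = smf.ols(f"{outcome} ~ {rhs}", data=df).fit()
            rows.append({"controls": subset,
                         "beta": fit.params[focal],
                         "p": fit.pvalues[focal]})
    return pd.DataFrame(rows)

# results = multiverse(df, "y", "x", ["z1", "z2", "z3"])  # 8 specifications
```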
In restricted statistical models, the commonly adopted formulation for local influence analysis is not appropriate because the first derivatives of the likelihood displacement are often nonzero. There are, however, two kinds of model restrictions under which the first derivatives of the likelihood displacement remain zero. General formulas for assessing local influence under these restrictions are derived and applied to factor analysis, since the restrictions commonly used in factor analysis satisfy the conditions. Various influence schemes are introduced and a comparison with the influence function approach is discussed. It is also shown that local influence for factor analysis is invariant to the scale of the data and independent of the rotation of the factor loadings.
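For reference, the quantity at issue is Cook's likelihood displacement (notation assumed), where \(\hat{\theta}_\omega\) maximizes the likelihood perturbed by a vector \(\omega\):

\[
\mathrm{LD}(\omega) \;=\; 2\,\bigl[\,\ell(\hat{\theta}) - \ell(\hat{\theta}_\omega)\,\bigr].
\]

Local influence studies the curvature of \(\mathrm{LD}\) at the null perturbation \(\omega_0\); in unrestricted models the gradient \(\partial \mathrm{LD}/\partial \omega\) vanishes at \(\omega_0\), which is precisely the property that can fail under model restrictions.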
This paper focuses on analyzing data collected in situations where investigators use multiple discrete indicators as surrogates, for example, a set of questionnaires. A very flexible latent class model is used for analysis. We propose a Bayesian framework to perform the joint estimation of the number of latent classes and the model parameters. The proposed approach applies reversible jump Markov chain Monte Carlo to analyze finite mixtures of multivariate multinomial distributions. We also develop a procedure for the unique labeling of the classes. We have carried out a detailed sensitivity analysis for various hyperparameter specifications, which leads us to make standard default recommendations for the choice of priors. The usefulness of the proposed method is demonstrated through computer simulations and a study on subtypes of schizophrenia using the Positive and Negative Syndrome Scale (PANSS).
Considering that causal mechanisms unfold over time, it is important to investigate them over time, taking into account the time-varying features of treatments and mediators. However, identification of the average causal mediation effect in the presence of time-varying treatments and mediators is often complicated by time-varying confounding. This article provides a novel approach to uncovering causal mechanisms with time-varying treatments and mediators in the presence of time-varying confounding. We provide different strategies for identification and sensitivity analysis under homogeneous and heterogeneous effects: homogeneous effects are those in which each individual experiences the same effect, and heterogeneous effects are those that vary across individuals. Most importantly, we provide an alternative definition of average causal mediation effects that evaluates a partial mediation effect: the effect that is mediated by paths other than through an intermediate confounding variable. We argue that this alternative definition allows us to better assess at least part of the mediated effect and provides meaningful and unique interpretations. A case study using ECLS-K data evaluating kindergarten retention policy is offered to illustrate the proposed approach.
Influence curves of some parameters under various methods of factor analysis have been given in the literature. These influence curves depend on the influence curves for either the covariance or the correlation matrix used in the analysis. This paper derives the differences between the influence curves based on the covariance and the correlation matrices. Under scale-invariant estimation methods, simple formulas for these differences are obtained for the unique variance matrix, the factor loadings, and other parameters, even though the influence curves themselves take complex forms.
This research concerns a mediation model in which the mediator model is linear and the outcome model is also linear but includes a treatment–mediator interaction term and a residual correlated with the residual of the mediator model. Assuming the treatment is randomly assigned, parameters in this mediation model are shown to be partially identifiable. Under the normality assumption on the residuals of the mediator and the outcome, explicit full-information maximum likelihood estimates (FIMLE) of the model parameters are introduced given the correlation between the two residuals. A consistent variance matrix of these estimates is derived. Currently, the coefficients of this mediation model are estimated using the iterative feasible generalized least squares (IFGLS) method, originally developed for seemingly unrelated regressions (SURs). We argue that this mediation model is not a system of SURs. While the IFGLS estimates are consistent, their variance matrix is not. Theoretical comparisons of the FIMLE variance matrix and the IFGLS variance matrix are conducted. Our results are demonstrated by simulation studies and an empirical study. The FIMLE method has been implemented in the freely available R package iMediate.
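In symbols (our notation, inferred from the description above, with randomized treatment \(T\), mediator \(M\), and outcome \(Y\)), the model is

\[
\begin{aligned}
M &= \alpha_0 + \alpha_1 T + e_M,\\
Y &= \beta_0 + \beta_1 T + \beta_2 M + \beta_3 TM + e_Y,
\end{aligned}
\qquad
(e_M, e_Y)^\top \sim \mathcal{N}\!\left(\mathbf{0},\;
\begin{pmatrix} \sigma_M^2 & \rho\,\sigma_M \sigma_Y\\ \rho\,\sigma_M \sigma_Y & \sigma_Y^2 \end{pmatrix}\right),
\]

where the residual correlation \(\rho\) is the partially identifiable quantity, and the FIMLE solutions are derived conditional on a fixed value of \(\rho\).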
Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters’ restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.
In patients with treatment-resistant depression (TRD), the ESCAPE-TRD study showed that esketamine nasal spray was superior to quetiapine extended release.
Aims
To determine the robustness of the ESCAPE-TRD results and confirm the superiority of esketamine nasal spray over quetiapine extended release.
Method
ESCAPE-TRD was a randomised, open-label, rater-blinded, active-controlled phase IIIb trial. Patients had TRD (i.e. non-response to two or more antidepressant treatments within a major depressive episode). Patients were randomised 1:1 to flexibly dosed esketamine nasal spray or quetiapine extended release, while continuing an ongoing selective serotonin reuptake inhibitor/serotonin norepinephrine reuptake inhibitor. The primary end-point was achieving a Montgomery–Åsberg Depression Rating Scale score of ≤10 at Week 8; the key secondary end-point was remaining relapse free through Week 32 after achieving remission at Week 8. Sensitivity analyses were performed on these end-points by varying the definition of remission based on timepoint, threshold and scale.
Results
Of 676 patients, 336 were randomised to esketamine nasal spray and 340 to quetiapine extended release. All sensitivity analyses on the primary and key secondary end-point favoured esketamine nasal spray over quetiapine extended release, with relative risks ranging from 1.462 to 1.737 and from 1.417 to 1.838, respectively (all p < 0.05). Treatment with esketamine nasal spray shortened time to first and confirmed remission (hazard ratio: 1.711 [95% confidence interval 1.402, 2.087], p < 0.001; 1.658 [1.337, 2.055], p < 0.001).
Conclusion
Esketamine nasal spray consistently demonstrated significant superiority over quetiapine extended release using all pre-specified definitions for remission and relapse. Sensitivity analyses supported the conclusions of the primary ESCAPE-TRD analysis and demonstrated robustness of the results.
This Element works as a non-technical overview of Agent-Based Modelling (ABM), a methodology that can be applied to economics as well as to fields across the natural and social sciences. It presents the introductory notions and historical background of ABM, together with a general overview of the tools and characteristics of this kind of model, with particular focus on more advanced topics such as validation and sensitivity analysis. Agent-based simulation is an increasingly popular methodology that fits well with the purpose of studying problems of computational complexity in systems populated by heterogeneous interacting agents.
Carefully designing blade geometric parameters is necessary as they determine the aerodynamic performance of a rotor. However, manufacturing inaccuracies cause the blade geometric parameters to deviate randomly from the ideal design. It is therefore essential to quantify the uncertainty and analyse the sensitivity of the compressor performance to blade geometric deviations. This work considers a subsonic compressor rotor stage and examines samples with different geometry features using three-dimensional Reynolds-averaged Navier–Stokes simulations. A method combining a Halton sequence with non-intrusive polynomial chaos is adopted to perform the uncertainty quantification (UQ) analysis. The Sobol' index and the Spearman correlation coefficient are used to analyse, respectively, the sensitivity and the correlation between the compressor performance and the blade geometric deviations. The results show that the fluctuation amplitude of the compressor performance decreases at lower mass flow rates, and that the sensitivity of the compressor performance to the blade geometric parameters varies with the working conditions. The effects of the various blade geometric deviations on the compressor performance are independent and superimpose linearly, and the combined effects of different geometric deviations on the compressor performance are small.
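A Halton-sampled non-intrusive polynomial chaos expansion with Sobol' indices can be sketched with the chaospy package; the three "geometric deviation" inputs and the response function below are placeholders standing in for the CFD evaluations.

```python
# Sketch of Halton sampling + non-intrusive polynomial chaos with Sobol'
# indices (chaospy); inputs and response are placeholders for the CFD model.
import numpy as np
import chaospy

dist = chaospy.J(chaospy.Normal(0, 1), chaospy.Normal(0, 1), chaospy.Normal(0, 1))
samples = dist.sample(200, rule="halton")            # Halton design, shape (3, 200)

response = np.sin(samples[0]) + 0.5 * samples[1] * samples[2]  # stand-in for CFD

expansion = chaospy.generate_expansion(3, dist)      # order-3 PCE basis
surrogate = chaospy.fit_regression(expansion, samples, response)

s1 = chaospy.Sens_m(surrogate, dist)                 # first-order Sobol' indices
st = chaospy.Sens_t(surrogate, dist)                 # total Sobol' indices
```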
Open rotors can play a critical role in the transition to more sustainable aviation by providing a fuel-efficient alternative. This paper considers the sensitivity of an open-rotor engine to variations in three operational parameters during take-off, focusing on both aerodynamics and aeroacoustics. Via a sensitivity analysis, insights into the complex interactions of aerodynamics and aeroacoustics can be gained. Numerical methods have been implemented for both the aerodynamics and the aeroacoustics of the engine: the flowfield has been solved using unsteady Reynolds-averaged Navier–Stokes simulations, and the acoustic footprint of the engine has been quantified through the Ffowcs Williams–Hawkings equations. The analysis concluded that the aerodynamic performance of the open rotor can be decisively impacted by small variations in the operational parameters. Specifically, blade loading increased by 9.8% for a 5% decrease in inlet total temperature, with the uncertainty being amplified through the engine. In comparison, the aeroacoustic footprint of the engine showed more moderate variations, with the overall sound pressure level increasing by up to 2.4 dB for a microphone lying on the engine axis aft of the inlet. The results signify that there is considerable sensitivity in the model, which should be systematically examined during the design or optimisation process.
This chapter applies the total error framework presented in Chapter 5 to a case example of pre-election polling during the 2016 US presidential election. Here, the focus is on problems with a single poll.
The United States Congress passed the 21st Century Cures Act mandating the development of Food and Drug Administration guidance on regulatory use of real-world evidence. The Forum on the Integration of Observational and Randomized Data conducted a meeting with various stakeholder groups to build consensus around best practices for the use of real-world data (RWD) to support regulatory science. Our companion paper describes in detail the context and discussion of the meeting, which includes a recommendation to use a causal roadmap for study designs using RWD. This article discusses one step of the roadmap: the specification of a sensitivity analysis for testing robustness to violations of causal model assumptions.
Methods:
We present an example of a sensitivity analysis from an RWD study on the effectiveness of Nifurtimox in treating Chagas disease, and an overview of various methods, emphasizing practical considerations for their use for regulatory purposes.
Results:
Sensitivity analyses must be accompanied by careful design of other aspects of the causal roadmap. Their prespecification is crucial to avoid wrong conclusions due to researcher degrees of freedom. Sensitivity analysis methods require auxiliary information to produce meaningful conclusions; it is important that they have at least two properties: the validity of the conclusions does not rely on unverifiable assumptions, and the auxiliary information required by the method is learnable from the corpus of current scientific knowledge.
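As one concrete example of an assumption-lean method (our illustration, not necessarily the approach used in the Nifurtimox study), the E-value of VanderWeele and Ding quantifies the minimum strength of unmeasured confounding, on the risk-ratio scale, that would be needed to fully explain away an observed risk ratio:

```python
# E-value sketch (VanderWeele & Ding): the minimum association strength, on
# the risk-ratio scale, that an unmeasured confounder would need with both
# treatment and outcome to explain away an observed risk ratio RR.
# Offered as one example of an assumption-lean sensitivity analysis.

def e_value(rr: float) -> float:
    if rr < 1.0:
        rr = 1.0 / rr                     # symmetric for protective effects
    return rr + (rr * (rr - 1.0)) ** 0.5

print(e_value(2.0))  # -> 3.41...: a confounder this strong could nullify RR = 2
```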
Conclusions:
Prespecified and assumption-lean sensitivity analyses are a crucial tool that can strengthen the validity and trustworthiness of effectiveness conclusions for regulatory science.