In applications of maximum likelihood factor analysis, the occurrence of boundary minima instead of proper minima is by no means exceptional. In the past, the causes of such improper solutions could not be detected, because the matrices containing the parameters of the factor analysis model were constrained to be positive definite. By dropping these constraints, it becomes possible to distinguish between the different causes of improper solutions. In this paper, some of the most important causes are discussed and illustrated by means of artificial and empirical data.
Turbulent flow is an important branch of fluid mechanics with wide-ranging occurrences and applications, from the formation of tropical cyclones to the stirring of a cup of coffee. Turbulence results in increased skin friction and heat transfer across surfaces, as well as enhanced mixing. As such, it is of practical significance, and there is a need to establish predictive methods to quantify turbulent flows. Equally important is a physical understanding of turbulent flows to guide strategies to model and control turbulence-driven phenomena. We focus on the study of turbulent flows and draw on theoretical developments, experimental measurements, and results from numerical simulations. Turbulent flows are governed by the Navier-Stokes equations. The solution of these equations for turbulent flows displays chaotic and multiscale behavior. When averaged, the nonlinear terms in the Navier-Stokes equations lead to the so-called closure problem, where additional unknowns are introduced in the mean flow equations. These unknowns are typically modeled using intuition, experience, and dimensional arguments. We present the scaling and dimensional analysis necessary for model development.
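As a minimal illustration of the closure problem described above (standard Reynolds averaging; the notation here is assumed rather than taken from the text): decomposing the velocity as $u_i = \bar{u}_i + u_i'$ and averaging the incompressible momentum equation gives

```latex
\bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
  = -\frac{1}{\rho} \frac{\partial \bar{p}}{\partial x_i}
  + \nu \nabla^2 \bar{u}_i
  - \frac{\partial \overline{u_i' u_j'}}{\partial x_j} .
```

The Reynolds stresses $\overline{u_i' u_j'}$ are the additional unknowns: the averaged system has more unknowns than equations, so these terms must be modeled.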
In 2008, Tóth and Vető defined the self-repelling random walk with directed edges as a non-Markovian random walk on $\mathbb{Z}$: in this model, the probability that the walk moves from a point of $\mathbb{Z}$ to a given neighbor depends on the number of previous crossings of the directed edge from the initial point to the target, called the local time of the edge. Tóth and Vető found that this model exhibited very peculiar behavior, as the process formed by the local times of all the edges, evaluated at a stopping time of a certain type and suitably renormalized, converges to a deterministic process, instead of a random one as in similar models. In this work, we study the fluctuations of the local times process around its deterministic limit, about which nothing was previously known. We prove that these fluctuations converge in the Skorokhod $M_1$ topology, as well as in the uniform topology away from the discontinuities of the limit, but not in the most classical Skorokhod topology. We also prove the convergence of the fluctuations of the aforementioned stopping times.
Levodopa-carbidopa intestinal gel (LCIG) therapy has been shown to be a safe and effective treatment for advanced Parkinson's disease (PD). Limited data are available regarding long-term benefits and complications in Canada. The objective of the study was to review long-term experience and clinical outcomes in PD patients on LCIG therapy over 11 years in a multidisciplinary university clinic setting.
Methods:
Chart review was done on PD patients with LCIG from 2011 to 2022. Data collected: dosing, UPDRS-III motor scores, OFF times, hours with dyskinesias, MoCA, complications, discontinuation reasons, and nursing time requirements.
Results:
Thirty-three patients received LCIG therapy with a mean follow-up of 3.25±2.09 years. UPDRS-III scores showed a 15% reduction from baseline (mean 35.9) up to 4 years (mean 30.4). Daily OFF time improved from baseline (mean 7.1 ± 3.13 hours) up to 5 years (mean 3.3 ± 2.31 hours; −53.5%; p < 0.048), and dyskinesias remained stable. Nursing time averaged 22 hours per patient per year after PEG-J insertion and titration. The most common complications were PEG-J tube dislodgement and stoma site infection (zero to three events/patient/year). Serious side effects were seen in four (12%) patients, resulting in hospitalization and/or death. Nine patients (27.2%) discontinued the treatment due to lack of improved efficacy over oral therapy or development of dementia, and 10 (30%) died of causes unrelated to LCIG infusion.
Conclusion:
Patients on LCIG showed improved motor function over 5-year follow-up. Serious complications were uncommon. Dedicated time from LCIG-trained nurses in a multidisciplinary setting is required for optimum management.
Kenneth Wilson introduced the renormalization-group (RG) approach in 1971. This approach gave new life to the study of the Ising model. The implications of this breakthrough were immediately recognized by researchers in the field, and Wilson and the RG technique were awarded the Nobel Prize in Physics soon thereafter. One of the distinguishing features of RG methods is that they explicitly include the effects of fluctuations. In addition, the RG approach gives a natural understanding of the universality that is seen in critical phenomena in general, and in critical exponents in particular. In many respects, the RG approach gives a deeper understanding not only of the Ising model itself, but of all aspects of critical phenomena. The original version of the renormalization-group method was implemented in momentum space – which is a bit like studying a system with Fourier transforms. It is beyond the scope of this presentation. Following that, various investigators extended the approach to position space, which is more intuitive in many ways and is certainly much easier to visualize. We present the basics of position-space renormalization group methods in this chapter. We will also explain the origin of the terms “renormalization” and “group” in the RG part of the name.
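The simplest instance of a position-space RG step is decimation of the one-dimensional Ising chain. As a standard sketch (with dimensionless coupling $K = J/k_B T$; the notation here is not tied to this chapter's conventions), summing over every second spin gives

```latex
\sum_{s_2 = \pm 1} e^{K s_2 (s_1 + s_3)}
  = 2\cosh\!\big(K(s_1 + s_3)\big)
  \equiv A\, e^{K' s_1 s_3},
\qquad
K' = \tfrac{1}{2}\ln\cosh(2K).
```

Iterating the map $K \to K'$ drives the coupling toward the $K = 0$ fixed point, reflecting the absence of a finite-temperature phase transition in one dimension.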
Cognitive fluctuations are a core clinical feature of dementia with Lewy bodies (DLB), but their contribution to the everyday functioning difficulties evident in DLB is not well understood. The current study evaluated whether intraindividual variability across a battery of neurocognitive tests (intraindividual variability-dispersion) and daily cognitive fluctuations as measured by informant report are associated with worse daily functioning in DLB.
Methods:
The study sample included 97 participants with consensus-defined DLB from the National Alzheimer’s Coordinating Center (NACC). Intraindividual variability-dispersion was measured using the coefficient of variation, which divides the standard deviation of an individual’s performance scores across 12 normed neurocognitive indices from the NACC neuropsychological battery by that individual’s performance mean. Informants reported on daily cognitive fluctuations using the Mayo Fluctuations Scale (MFS) and on daily functioning using the functional activities questionnaire (FAQ).
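The dispersion index described above is straightforward to compute. A minimal sketch follows, assuming hypothetical T-scores for one participant (the function name and example values are illustrative, not taken from the NACC battery):

```python
import statistics

def dispersion(scores):
    """Intraindividual variability-dispersion: the coefficient of variation
    across one individual's normed test scores (standard deviation divided
    by that individual's mean performance)."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample standard deviation
    return sd / mean

# Hypothetical normed scores on 12 neurocognitive indices for one participant
scores = [52, 38, 45, 60, 41, 55, 48, 36, 50, 44, 58, 40]
cv = dispersion(scores)
```

A participant with perfectly even performance across indices would have a dispersion of zero; larger values indicate more scatter around that person's own mean.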
Results:
Logistic regression identified a large univariate association between intraindividual variability-dispersion and the presence of daily cognitive fluctuations on the MFS (odds ratio = 73.27, 95% confidence interval = 1.38–3,895.05). Multiple linear regression demonstrated that higher intraindividual variability-dispersion and the presence of daily cognitive fluctuations as assessed by the MFS were significantly and independently related to worse daily functioning (FAQ scores).
Conclusions:
Among those with DLB, informant-rated daily cognitive fluctuations and cognitive fluctuations measured in the clinic (as indexed by intraindividual variability-dispersion across a battery of tests) were independently associated with poorer everyday functioning. These data demonstrate ecological validity in measures of cognitive fluctuations in DLB.
We study approximations for the Lévy area of Brownian motion which are based on the Fourier series expansion and a polynomial expansion of the associated Brownian bridge. Comparing the asymptotic convergence rates of the Lévy area approximations, we see that the approximation resulting from the polynomial expansion of the Brownian bridge is more accurate than the Kloeden–Platen–Wright approximation, whilst still only using independent normal random vectors. We then link the asymptotic convergence rates of these approximations to the limiting fluctuations for the corresponding series expansions of the Brownian bridge. Moreover, and of interest in its own right, the analysis we use to identify the fluctuation processes for the Karhunen–Loève and Fourier series expansions of the Brownian bridge is extended to give a stand-alone derivation of the values of the Riemann zeta function at even positive integers.
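The object being approximated can be illustrated directly. The sketch below simulates the Lévy area $A = \tfrac{1}{2}\int (W^{(1)}\,dW^{(2)} - W^{(2)}\,dW^{(1)})$ of planar Brownian motion by a left-point Riemann sum on a fine grid; this is a naive illustration of the definition, not the paper's Fourier or polynomial expansions:

```python
import math
import random

def levy_area(n_steps, dt=1e-3, seed=0):
    """Lévy area of 2D Brownian motion, A = 1/2 * int(W1 dW2 - W2 dW1),
    approximated with a left-point Riemann sum over n_steps increments."""
    rng = random.Random(seed)
    w1 = w2 = 0.0
    area = 0.0
    s = math.sqrt(dt)  # standard deviation of each Brownian increment
    for _ in range(n_steps):
        dw1 = rng.gauss(0.0, s)
        dw2 = rng.gauss(0.0, s)
        area += 0.5 * (w1 * dw2 - w2 * dw1)  # left-point evaluation
        w1 += dw1
        w2 += dw2
    return area

a = levy_area(1000)
```

Series-based approximations such as those compared in the paper aim to reproduce this random quantity using only a fixed number of independent normal random vectors per step, rather than a fine sub-grid.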
Networked dynamical systems, i.e., systems of dynamical units coupled via nontrivial interaction topologies, constitute models of broad classes of complex systems, ranging from gene regulatory and metabolic circuits in our cells to pandemics spreading across continents. Most such systems are driven by irregular and distributed fluctuating input signals from the environment. Yet how networked dynamical systems collectively respond to such fluctuations depends on the location and type of driving signal, the interaction topology and several other factors, and remains largely unknown to date. As a key example, modern electric power grids are undergoing a rapid and systematic transformation towards more sustainable systems, signified by high penetrations of renewable energy sources. These in turn introduce significant fluctuations in power input and thereby pose immediate challenges to the stable operation of power grid systems. How power grid systems dynamically respond to fluctuating power feed-in as well as other temporal changes is critical for ensuring reliable operation, yet not well understood. In this work, we systematically introduce a linear response theory (LRT) for fluctuation-driven networked dynamical systems. The derivations presented not only provide approximate analytical descriptions of the dynamical responses of networks, but more importantly, also allow us to extract key qualitative features of spatio-temporally distributed response patterns. Specifically, we provide a general formulation of an LRT for perturbed networked dynamical systems, explicate how dynamic network response patterns arise from the solution of the linearised response dynamics, and emphasise the role of LRT in predicting and comprehending power grid responses on different temporal and spatial scales and to various types of disturbances.
Understanding such patterns from a general, mathematical perspective enables us to estimate network responses quickly and intuitively, and to develop guiding principles for, e.g., power grid operation, control and design.
In this paper we study a large system of N servers, each with capacity to process at most C simultaneous jobs; an incoming job is routed to the server with the lowest occupancy amongst d (out of N) randomly selected servers. A job that is routed to a server with no vacancy is assumed to be blocked and lost. Such randomized policies are referred to as JSQ(d) (Join the Shortest Queue out of d) policies. Under the assumption that jobs arrive according to a Poisson process with rate $N\lambda^{(N)}$, where $\lambda^{(N)}=\sigma-\frac{\beta}{\sqrt{N}}$, $\sigma\in\mathbb{R}_+$ and $\beta\in\mathbb{R}$, we establish functional central limit theorems for the fluctuation process in both the transient and stationary regimes when service time distributions are exponential. In particular, we show that the limit is an Ornstein–Uhlenbeck process whose mean and variance depend on the mean field of the considered model. Using this, we obtain approximations to the blocking probabilities for large N, where we can precisely estimate the accuracy of first-order approximations.
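The routing policy itself is simple to state in code. Below is a toy discrete-event sketch of JSQ(d) with blocking (the function name and all parameter values are illustrative assumptions, not taken from the paper); it returns an empirical blocking probability rather than the limit-theorem approximations derived in the paper:

```python
import heapq
import random

def simulate_jsq_d(num_servers=100, capacity=5, d=2, arrival_rate=0.7,
                   horizon=1_000.0, seed=1):
    """Toy simulation of JSQ(d) with blocking: each arrival samples d
    servers uniformly at random and joins the least-occupied one; if that
    server already holds `capacity` jobs, the job is blocked and lost.
    Poisson arrivals at total rate N*lambda, exponential unit-mean service."""
    rng = random.Random(seed)
    occupancy = [0] * num_servers
    departures = []          # min-heap of (completion_time, server_index)
    t = 0.0
    arrivals = blocked = 0
    while t < horizon:
        t += rng.expovariate(arrival_rate * num_servers)  # next arrival epoch
        while departures and departures[0][0] <= t:       # finish elapsed services
            _, srv = heapq.heappop(departures)
            occupancy[srv] -= 1
        arrivals += 1
        candidates = rng.sample(range(num_servers), d)    # d servers at random
        srv = min(candidates, key=occupancy.__getitem__)  # lowest occupancy wins
        if occupancy[srv] >= capacity:
            blocked += 1                                  # no vacancy: job lost
        else:
            occupancy[srv] += 1
            heapq.heappush(departures, (t + rng.expovariate(1.0), srv))
    return blocked / arrivals

p_block = simulate_jsq_d()
```

Even d = 2 choices typically yield far lower blocking than routing to a single random server, which is the qualitative effect the fluctuation analysis quantifies.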
Misattributing symptoms to already established conditions often represents a missed opportunity to improve management. These cases provide some examples.
In this chapter we focus on the stress-energy bitensor and its symmetrized product, with two goals: (1) to present the point-separation regularization scheme, and (2) to use it to calculate the noise kernel that is the correlation function of the stress-energy bitensor and explore its properties. In the first part we introduce the necessary properties and geometric tools for analyzing bitensors, geometric objects that have support at two separate spacetime points. The second part presents the point-separation method for regularizing the ultraviolet divergences of the stress-energy tensor for quantum fields in a general curved spacetime. In the third part we derive a formal expression for the noise kernel in terms of the higher order covariant derivatives of the Green functions taken at two separate points. One simple yet important fact we show is that for a massless conformal field the trace of the noise kernel identically vanishes. In the fourth part we calculate the noise kernel for a conformal field in de Sitter space, both in the conformal Bunch–Davies vacuum and in the static Gibbons–Hawking vacuum. These results are useful for treating the backreaction and fluctuation effects of quantum fields.
Zeta-function regularization is arguably the most elegant of the four major regularization methods used for quantum fields in curved spacetime, linked to the heat kernel and spectral theorems in mathematics. The only drawback is that it can only be applied to Riemannian spaces (also called Euclidean spaces), whose metrics have a ++++ signature, where the invariant operator is of the elliptic type, as opposed to the hyperbolic type in pseudo-Riemannian spaces (also called Lorentzian spaces) with a −+++ signature. In addition, the space needs to have sufficiently large symmetry that the spectrum of the invariant operator can be calculated explicitly in analytic form. In the first part we define the zeta function, showing how to calculate it in several representative spacetimes and how the zeta-function regularization scheme works. We relate it to the heat kernel and derive the effective Lagrangian from it via the Schwinger proper time formalism. In the second part we show how to obtain the correlation function of the stress-energy bitensor, also known as the noise kernel, from the second metric variation of the effective action. The noise kernel plays a central role in stochastic gravity, much as the expectation value of the stress-energy tensor does for semiclassical gravity.
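As a minimal sketch of the scheme (standard definitions, not specific to this chapter's conventions), the generalized zeta function of an elliptic operator $A$ with eigenvalues $\lambda_n$, its Mellin-transform link to the heat kernel, and the regularized determinant are

```latex
\zeta_A(s) = \sum_n \lambda_n^{-s}
           = \frac{1}{\Gamma(s)} \int_0^\infty dt\, t^{s-1}\,
             \operatorname{Tr} e^{-tA},
\qquad
\ln \det A = -\zeta_A'(0),
```

so the one-loop Euclidean effective action $W = \tfrac{1}{2}\ln\det A$ is rendered finite as $-\tfrac{1}{2}\zeta_A'(0)$, up to terms carrying the renormalization scale.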
Relationships research has used typologies and dimensional approaches to characterize relationships. Research has similarly sought to characterize on-off relationships using types, dimensions, and trajectories. Five types of on-off relationships are reviewed: some show constructive patterns or closure, whereas others exhibit incongruencies, and even coercion and control. On-off relationships also exhibit varying trajectories. For example, despite inherently experiencing multiple breakups and renewals, only half of on-off partners reported fluctuations in their commitment to the relationship. In addition, the different types and trajectories suggest that fluctuations in relationship dynamics might not necessarily be distressing to these partners. Overall, these various characterizations of cyclical relationships show they follow a diversity of paths that entail different dynamics.
On-again/off-again relationships challenge the standard dichotomous definition of relationship stability (i.e., whether the relationship remains intact or dissolves). This chapter reviews the various conceptualizations of stability. Although on-off partners report less relationship stability when using subjective, one-time assessments (e.g., perceived stability, sense of security, or persistence in the relationship), a process-oriented assessment of stability is advocated in which relationship dynamics are measured over time. As argued by chaos theory, fluctuations over time could indicate a stable pattern. For example, research suggests certain fluctuations are associated with greater stability in on-off relationships. A process-oriented perspective could thus provide a more nuanced assessment of relationship stability for both on-off and non-cyclical relationships. Additional considerations for future research are also offered.
We analyze the fluctuations in the X-ray flux of 20 AGN (mainly Seyfert 1 galaxies) monitored by RXTE and XMM-Newton, with sampling frequencies ranging from hours to years, using structure function (SF) analysis. We derive SFs over four orders of magnitude in the time domain (0.03–300 days). Most objects show a characteristic time scale at which the SF flattens or changes slope. For the 10 objects with published power spectral densities (PSDs), the break time scales in the SF and PSD are similar and show a good correlation. We also find a significant correlation between the SF time scale and the mass of the central black hole, determined for most objects by reverberation mapping.
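A first-order structure function of the kind used above can be sketched in a few lines. The following toy implementation (function name, binning tolerance, and the synthetic random-walk light curve are all illustrative assumptions) averages squared flux differences over pairs of epochs separated by approximately the lag $\tau$, which handles irregular sampling:

```python
import random

def structure_function(times, flux, tau, tol=0.5):
    """First-order structure function SF(tau): the mean of
    (flux[i] - flux[j])**2 over all epoch pairs whose separation
    lies within tol of the lag tau (crude binning for uneven sampling)."""
    diffs = [
        (flux[i] - flux[j]) ** 2
        for i in range(len(times))
        for j in range(i + 1, len(times))
        if abs((times[j] - times[i]) - tau) < tol
    ]
    return sum(diffs) / len(diffs) if diffs else float("nan")

# Synthetic red-noise-like light curve: a random walk on an uneven time grid
rng = random.Random(42)
t = f = 0.0
ts, fs = [], []
for _ in range(500):
    t += rng.uniform(0.5, 1.5)   # uneven sampling, roughly daily cadence
    f += rng.gauss(0.0, 1.0)     # random-walk flux variations
    ts.append(t)
    fs.append(f)

sf_short = structure_function(ts, fs, tau=2.0)
sf_long = structure_function(ts, fs, tau=50.0)
```

For a pure random walk SF(τ) grows roughly linearly with τ with no break; a flattening or slope change at some τ, as reported for most objects above, signals a characteristic time scale in the variability.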
We analyze the Lattice Boltzmann method for the simulation of fluctuating hydrodynamics of Adhikari et al. [Europhys. Lett., 71 (2005), 473–479] and find that it shows excellent agreement with theory even for small wavelengths, as long as a stationary system is considered. This is in contrast to other finite-difference and older lattice Boltzmann implementations, which show convergence only in the limit of large wavelengths. In particular, cross-correlators vanish to less than 0.5%. For larger mean velocities, however, violations of Galilean invariance manifest themselves.
We perform a numerical study of the fluctuations of the rescaled hydrodynamic transverse velocity field during the cooling state of a homogeneous granular gas. We are interested in the role of Molecular Chaos for the amplitude of the hydrodynamic noise and its relaxation in time. For this purpose we compare the results of Molecular Dynamics (MD, deterministic dynamics) with those from Direct Simulation Monte Carlo (DSMC, random process), where Molecular Chaos can be directly controlled. It is seen that the large-time decay of the fluctuation's autocorrelation is always dictated by the viscosity coefficient predicted by granular hydrodynamics, independently of the numerical scheme (MD or DSMC). On the other side, the noise amplitude in Molecular Dynamics, which is known to violate the equilibrium Fluctuation-Dissipation relation, is not always accurately reproduced in a DSMC scheme. The agreement between the two models improves if the probability of recollision (controlling Molecular Chaos) is reduced by increasing the number of virtual particles per cell in the DSMC. This result suggests that DSMC is not necessarily more efficient than MD if the real number of particles is small ($\sim 10^3$–$10^4$) and if one is interested in accurately reproducing fluctuations. An open question remains about the small-time behavior of the autocorrelation function in the DSMC, which in MD and in kinetic theory predictions is not a simple exponential.
Consider a random environment in ${\mathbb Z}^d$ given by i.i.d. conductances. In this work, we obtain tail estimates for the fluctuations about the mean for the following characteristics of the environment: the effective conductance between opposite faces of a cube, the diffusion matrices of periodized environments, and the spectral gap of the random walk in a finite cube.
The charge on plasma ions may fluctuate due to random ionization and recombination processes. The resulting motion has a time-dependent Hamiltonian, and in simple cases the ions steadily gain energy. Quantitative rates are estimated for some cases involving high-Z plasmas.