For many researchers, the ethical approval process can appear confusing, overwhelming, or irrelevant. Common sources of confusion include which types of ethics approval are required, how to obtain approval, and the language surrounding the review process. This editorial discusses the importance of ethics in creating and reporting quality research and provides a practical guide to help navigate the ethical approval process.
We consider a stochastic model, called the replicator coalescent, describing a system of blocks of k different types that undergo pairwise mergers at rates depending on the block types: with rate $C_{ij}\geq 0$ blocks of type i and j merge, resulting in a single block of type i. The replicator coalescent can be seen as a generalisation of Kingman’s coalescent death chain in a multi-type setting, although without an underpinning exchangeable partition structure. The name is derived from a remarkable connection between the instantaneous dynamics of this multi-type coalescent when issued from an arbitrarily large number of blocks, and the so-called replicator equations from evolutionary game theory. By dilating time arbitrarily close to zero, we see that initially, on coming down from infinity, the replicator coalescent behaves like the solution to a certain replicator equation. Thereafter, stochastic effects are felt and the process evolves more in the spirit of a multi-type death chain.
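The block dynamics described above lend themselves to direct stochastic simulation. The following is a minimal Gillespie-style sketch, assuming each unordered pair of blocks of types i and j merges at rate $C_{ij}$ into a single type-i block; the function name and interface are invented for illustration.

```python
import random

def simulate_replicator_coalescent(n, C, t_max=float("inf"), rng=None):
    """Gillespie simulation of the replicator coalescent (illustrative).

    n : list of initial block counts per type, n[i] = number of type-i blocks.
    C : rate matrix, C[i][j] >= 0; each (type-i, type-j) pair of blocks merges
        at rate C[i][j], producing a single type-i block (type-j count drops).
    Returns the trajectory [(time, counts), ...] until one block remains.
    """
    rng = rng or random.Random(0)
    k = len(n)
    n = list(n)
    t, path = 0.0, [(0.0, tuple(n))]
    while sum(n) > 1:
        # rate of each (i, j) merger channel: C[i][j] times the pair count
        rates = {}
        for i in range(k):
            for j in range(k):
                pairs = n[i] * (n[i] - 1) / 2 if i == j else n[i] * n[j]
                r = C[i][j] * pairs
                if r > 0:
                    rates[(i, j)] = r
        total = sum(rates.values())
        if total == 0:          # no further merger is possible
            break
        t += rng.expovariate(total)
        if t > t_max:
            break
        # choose a merger channel proportionally to its rate
        u = rng.random() * total
        for (i, j), r in rates.items():
            u -= r
            if u <= 0:
                break
        n[j] -= 1               # the type-j block is absorbed into a type-i block
        path.append((t, tuple(n)))
    return path
```

Starting the chain from a large initial count and rescaling time near zero would, per the abstract, reproduce replicator-equation behaviour; the sketch above only implements the raw merger dynamics.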
This study aimed to evaluate the general practitioner (GP) referral pathway for adult attention deficit hyperactivity disorder (ADHD) devised by the Irish Health Service Executive’s (HSE) National Clinical Programme for Adult ADHD (NCPAA). Primary objectives were to (i) quantify GP referrals to community mental health teams (CMHTs) for adult ADHD screening, (ii) measure workload on CMHTs related to screening adult ADHD referrals without comorbid mental health problems, and (iii) quantify access to adult ADHD screening through CMHTs and subsequent assessment and treatment access through specialist adult ADHD teams.
Methods:
An observational cohort design was used to retrospectively analyse ADHD-related referral data collected by clinical staff across 11 Irish CMHTs and three specialist adult ADHD teams from January to December 2023.
Results:
There was high variability in adult ADHD referrals to CMHTs, ranging from 14 to 122 over one year. There was also high variability in the number of referrals seen by CMHTs, ranging from 9 to 82. Of 304 referrals seen across 11 CMHTs, 25.3% required initial treatment for another mental health condition. Specialist adult ADHD teams received 3–4 times more referrals than they were able to assess during this timeframe.
Conclusions:
The NCPAA has provided crucial services for adults with ADHD in Ireland. However, an increase in neurodiversity awareness and demand for services suggests that a range of referral pathways depending on complexity level may be required. Alternative models are proposed, which require allocation of resources and training through primary care, secondary mental health services and specialist teams.
Scholars and policymakers bemoan an imperial presidency in the war powers context, where the unilateral use of force is frequently interpreted as evidence of an unconstrained executive. Focusing on the strong blame avoidance incentives faced by politicians in the military intervention setting, I develop a model of the war powers focused on “Loss Responsibility Costs.” It suggests that presidents only risk full-scale war when they have the political cover provided by formal authorization, which forces lawmakers to share responsibility. Smaller interventions, in contrast, are frequently undertaken unilaterally because having the president act alone is consistent with congressional preferences for blame avoidance. Novel sentiment data based on tens of thousands of congressional speeches support the claim that when presidents act unilaterally, they almost always act alongside supportive lawmakers, who favor intervention but avoid formally endorsing the endeavor. Altogether, the findings suggest that legislators’ influence over war is stronger than commonly appreciated.
A dual scaling of the second-order scalar structure function $\overline {{(\delta \theta )}^2}$, i.e. a scaling based on the Batchelor–Kolmogorov scales $\theta _B$, $\eta$ and another based on $\theta '$, $L$, representative of the large-scale motion, is examined in the context of the transport equation for $\overline {{(\delta \theta )}^2}$. Direct numerical simulation data over a relatively wide range of the Taylor microscale Reynolds number $Re_\lambda$ and a Schmidt number of order 1 in statistically stationary homogeneous isotropic turbulence with a uniform mean scalar gradient are used. It is observed that as $Re_\lambda$ increases, a dual scaling appears to emerge: the scaling based on $\theta '$, $L$ extends to increasingly smaller values of $r/L$, where $r$ is the separation associated with the increment $ {{\delta \theta }}$, while the scaling based on $\theta _B$, $\eta$ extends to increasingly larger values of $r/\eta$. This suggests that both scalings should eventually overlap over a range of scales as $Re_\lambda$ continues to increase. Further, it is shown that such a dual scaling leads to the power-law relation $\overline {{(\delta \theta )}^2} \sim r^{\zeta _2}$, where $\zeta _2=2/3$ in the overlap region. The use of an empirical model for the local slope of $\overline {{(\delta \theta )}^2}$ (i.e. $\zeta _2$) shows that a value of $Re_\lambda$ of order $10^4$ is required for the slope to first reach the value $2/3$. Clearly, values larger than $10^4$ will be required before an $r^{2/3}$ inertial range is established.
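The overlap exponent follows from a classical matching argument. A sketch, using the standard estimates $\chi \sim \theta'^2 u'/L$, $u'\sim(\varepsilon L)^{1/3}$, $\tau_\eta=(\nu/\varepsilon)^{1/2}$, $\eta=(\nu^3/\varepsilon)^{1/4}$ (not necessarily the paper's exact route):

```latex
% If both scalings hold in an overlap range with a common power law,
\overline{(\delta\theta)^2} \;=\; \theta_B^2\, f(r/\eta) \;=\; \theta'^2\, g(r/L),
\qquad f(x)\sim C x^{\zeta_2},\quad g(x)\sim C' x^{\zeta_2}
\;\Longrightarrow\;
\frac{\theta_B^2}{\theta'^2} \;\sim\; \Big(\frac{\eta}{L}\Big)^{\zeta_2}.
% The standard estimates for the scalar dissipation rate and Batchelor scale give
\theta_B^2 \;\sim\; \chi\,\tau_\eta
\;\sim\; \theta'^2\,\varepsilon^{1/3}L^{-2/3}\,(\nu/\varepsilon)^{1/2}
\;=\; \theta'^2\,\Big(\frac{\eta}{L}\Big)^{2/3},
% so matching the two ratios forces \zeta_2 = 2/3.
```

Matching the two expressions for $\theta_B^2/\theta'^2$ then yields $\zeta_2 = 2/3$ in the overlap region.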
Let E be an elliptic curve defined over ${{\mathbb{Q}}}$ which has good ordinary reduction at the prime p. Let K be a number field with at least one complex prime which we assume to be totally imaginary if $p=2$. We prove several equivalent criteria for the validity of the $\mathfrak{M}_H(G)$-property for ${{\mathbb{Z}}}_p$-extensions other than the cyclotomic extension inside a fixed ${{\mathbb{Z}}}_p^2$-extension $K_\infty/K$. The equivalent conditions involve the growth of $\mu$-invariants of the Selmer groups over intermediate shifted ${{\mathbb{Z}}}_p$-extensions in $K_\infty$, and the boundedness of $\lambda$-invariants as one runs over ${{\mathbb{Z}}}_p$-extensions of K inside of $K_\infty$.
Using these criteria we also derive several applications. For example, we can bound the number of ${{\mathbb{Z}}}_p$-extensions of K inside $K_\infty$ over which the Mordell–Weil rank of E is not bounded, thereby proving special cases of a conjecture of Mazur. Moreover, we show that the validity of the $\mathfrak{M}_H(G)$-property can sometimes be shifted to a larger base field K′.
Oscillations of a heated solid surface in an oncoming fluid flow can increase heat transfer from the solid to the fluid. Previous studies have investigated the resulting heat transfer enhancement for the case of a circular cylinder undergoing translational or rotational motions. Another common geometry, the flat plate, has not been studied as thoroughly. The flat plate sheds larger and stronger vortices that are sensitive to the plate’s direction of oscillation. To study the effect of these vortices on heat transfer enhancement, we conduct two-dimensional numerical simulations to compute the heat transfer from a flat plate with different orientations and oscillation directions in an oncoming flow with Reynolds number 100. We consider plates with fixed temperature and fixed heat flux, and find large heat transfer enhancement in both cases. We investigate the effects of the plate orientation angle and the plate oscillation direction, velocity, amplitude and frequency, and find that the plate oscillation velocity and direction have the strongest effects on global heat transfer. The other parameters mainly affect the local heat transfer distributions through shed vorticity distributions. We also discuss the input power needed for the oscillating-plate system and the resulting Pareto optimal cases.
The primary focus of this article is to capture heterogeneous treatment effects measured by the conditional average treatment effect. A model averaging estimation scheme is proposed with multiple candidate linear regression models under heteroskedastic errors, and the properties of this scheme are explored analytically. First, it is shown that our proposal is asymptotically optimal in the sense of achieving the lowest possible squared error. Second, the convergence of the weights determined by our proposal is provided when at least one of the candidate models is correctly specified. Simulation results in comparison with several related existing methods favor our proposed method. The method is applied to a dataset from a labor skills training program.
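The averaging idea, i.e. simplex-constrained weights over candidate linear models chosen to minimize squared prediction error, can be illustrated with a generic sketch. The optimizer (exponentiated gradient), the candidate designs, and all function names below are illustrative assumptions, not the authors' estimator.

```python
import numpy as np

def simplex_weights(pred, y, iters=2000, eta=0.01):
    """Weights on the probability simplex minimizing ||y - pred @ w||^2,
    found by exponentiated-gradient descent (illustrative optimizer)."""
    n, m = pred.shape
    w = np.full(m, 1.0 / m)
    for _ in range(iters):
        grad = 2.0 * pred.T @ (pred @ w - y) / n
        w = w * np.exp(-eta * grad)
        w /= w.sum()
    return w

def cate_model_average(X, y, treat, designs):
    """Average CATE estimates from several candidate linear models.

    designs : list of column-index lists, one candidate design per model.
    Fits each candidate by OLS separately on treated and control units,
    weights the fitted outcome surfaces on the simplex by squared error,
    and returns the weighted difference mu1(X) - mu0(X) as the CATE.
    """
    def fit_preds(mask):
        cols = []
        for d in designs:
            Z = X[np.ix_(mask, d)]
            beta, *_ = np.linalg.lstsq(Z, y[mask], rcond=None)
            cols.append(X[:, d] @ beta)      # predictions for all units
        return np.column_stack(cols)
    p1, p0 = fit_preds(treat), fit_preds(~treat)
    w1 = simplex_weights(p1[treat], y[treat])
    w0 = simplex_weights(p0[~treat], y[~treat])
    return p1 @ w1 - p0 @ w0
```

The paper's scheme additionally handles heteroskedastic errors and establishes asymptotic optimality of the weights; this sketch only shows the basic mechanics of weighting candidate linear models on the simplex.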
I argue that the epistemic aim of scientific theorizing (EAST) is producing theories with the highest possible number and degree of theoretical virtues (call this “TV-EAST”). I trace TV-EAST’s logical empiricist origins and discuss its close connections to Kuhn’s and Laudan’s problem-solving accounts of the aim of science. Despite TV-EAST’s antirealist roots, I argue that if one adopts the realist view that EAST is finding true theories, one should also endorse TV-EAST. I then defend TV-EAST by showing that it addresses the challenges raised against using the “aim of science” metaphor and offers significant advantages over the realist account.
A 13,500-year-old record from Langohr Wetland in the Gallatin Range of southwestern Montana offers new insights into the vegetation and fire history at middle elevations within the Greater Yellowstone Ecosystem. Pollen data suggest that following deglaciation, tundra–steppe vegetation established and persisted until conditions became warm and wet enough to support Picea parkland. The development of an open, predominantly Pinus mixed-conifer forest from ca. 9300–7000 cal yr BP suggests warming summer temperatures led to an increase in forest cover and fire activity; the increase in tree abundance supported infrequent, stand-replacing fires approximately every 600 years. Picea and Pseudotsuga increased their presence at ca. 7000 cal yr BP, and the mixed-conifer forest became denser during the Mid- and Late Holocene, suggesting summers became cooler and wetter. The additional fuel load led to increased fire activity, with stand-replacing fires occurring approximately every 350 years in the Late Holocene. The forest surrounding Langohr Wetland experienced less change in vegetation composition and structure and fewer fire episodes than other low- and high-elevation sites in the Greater Yellowstone Ecosystem. The stability of this forested ecosystem over thousands of years is likely a result of its cool mesic mid-elevation setting, limiting the frequency of intense fire episodes.
Detecting cracks in underwater dams is crucial for ensuring dam quality and safety. However, underwater dam cracks are easily obscured by aquatic plants. Traditional single-view visual inspection methods cannot effectively extract the features of occluded cracks, whereas multi-view crack images can recover occluded target features through feature fusion. At the same time, underwater turbulence causes nonuniform diffusion of suspended sediments, so image features from different viewpoints are flooded by noise to different degrees, which degrades the fusion effect. To address these issues, this paper proposes a multi-view fusion network (MVFD-Net) for crack detection in occluded underwater dams. First, we propose a feature reconstruction interaction encoder (FRI-Encoder), which exchanges the multi-scale local features extracted by a convolutional neural network with the global features extracted by a transformer encoder, and performs feature reconstruction at the end of the encoder to enhance feature extraction while suppressing interference from nonuniform scattering noise. Subsequently, a multi-scale gated adaptive fusion module is introduced between the encoder and the decoder to perform gated feature fusion, further complementing and recovering detail information flooded by noise. Additionally, this paper designs a multi-view feature fusion module that fuses multi-view image features to restore occluded crack features and achieve detection of occluded cracks. In extensive experimental evaluations, MVFD-Net achieves excellent performance compared with current mainstream algorithms.
In this article, we consider the relationship between conceptual blending, creativity and morphological change, within the framework of Diachronic Construction Morphology (DCxM; Norde & Trousdale 2023). In particular, we suggest that a refinement to models of creativity in the literature might help to account better for different types of morphological change (Norde & Trousdale 2024). This is achieved via a contrastive analysis of two different sets of changes: (a) the creation of English libfixes (Zwicky 2010; Norde & Sippach 2019), e.g. snowmaggedon and spooktacular, and (b) the development of Dutch pseudoparticiples (Norde & Trousdale 2024), e.g. bebrild ‘bespectacled’ and ontstekkerd ‘with all plugs removed’.
Semantic extensibility captures the semantic side of productivity. It is the likelihood that a given sense of a linguistic expression will support extension to new senses. Even though linguistic expressions are naturally polysemous, semantic extensibility is constrained. In previous literature, it has been argued that semantic extensions are motivated by mostly one-directional conceptual operations such as metaphor and metonymy, and that in any polysemous expression only one or a few so-called ‘sanctioning’ senses have privileged status in supporting new extensions. One factor believed to determine sanctioning status is high frequency. Drawing on three case studies from the history of English, involving change in the adjective awful, the preposition and adverb about and the multifunctional item so, this article provides diachronic evidence from semantic loss to support this view. On the one hand, it is shown that when old sanctioning senses go into decline, this also impacts the senses derived from them, underscoring the motivational relations that tie extended senses to sanctioning senses. On the other hand, what typically initiates a decline in a sanctioning sense is a frequency increase elsewhere in the polysemy network coincident with the emergence of a new sanctioning sense, underscoring the role of frequency in determining sanctioning status and the directionality of sanctioning relations.