This chapter resolves the problem posed in the previous chapter, namely, the Scotist objection to the instrument doctrine. It argues that of five different strategies, only one solves the problem of coherence, and it is a solution found not only in some of Aquinas’s mature statements on instrumental causality but also in the theology of Matthias Joseph Scheeben (1835–1888), who knew the intricate debates about the doctrine after Aquinas’s time and developed a unique response to the Scotist objection. The chapter defends Scheeben’s view, known as ‘extrinsic elevation’, as the way to preserve the coherence of the two claims that God alone is the cause of grace and that Christ’s humanity is an instrumental efficient cause of grace.
This chapter poses the most difficult objection to the instrument doctrine, in particular as Aquinas conceives of it. For Aquinas, a created cause, Christ’s humanity, produces divine effects as an instrumental cause. But the tradition has affirmed that God alone is the cause of grace in the soul, and no created cause can produce grace. John Duns Scotus puts this objection to Aquinas’s account of instrumental causality, and this chapter argues that the criticism appears to succeed. If a created cause participates in the production of grace, as Aquinas holds, then, Scotus argues, Aquinas fails to maintain the distinction of natures and powers in Christ that is basic to Chalcedonian Christology. For Christ’s humanity is taken up into God’s power and brings about the deification of the human person immediately, something only divine power can do. The ground is prepared for a response to this objection in the following chapter.
Scripture teaches that God saves humanity through God’s own actions and sufferings in Christ, thereby raising a key theological question: How can God use his own human actions and sufferings to bring about those things that he causes through divine power? To answer that question, J. David Moser here explores St. Thomas Aquinas’s teaching that Christ’s humanity is an instrument of the divinity. Offering an informed account of how Christian salvation happens through the Incarnation of Christ, he also poses a new set of questions about the Incarnation that Aquinas himself did not consider. In response to these questions, and in conversation with a wide range of theologians, including John Duns Scotus and Matthias Joseph Scheeben, Moser argues that the instrument doctrine, an underexplored and underappreciated idea, deepens our understanding of salvation that comes through the Incarnation of Jesus Christ. He also defends the instrument doctrine as a dogmatic theological topic worthy of consideration today.
From cradle to grave, human beings actively strive to abstract meaning from experience. The meaning-making capacity builds step by step, beginning in the earliest years. At each phase of life, new capacities emerge and previous limitations in meaning making can be overcome. By adolescence all of the basic tools for making meaning have been acquired. All that remains to be achieved is the wisdom that comes from accrued lived experience in the subsequent years. Increasingly, a narrative identity may be formed.
Preschoolers purposefully seek to understand the environment. Their social world expands, as they work hard to interact with peers. Accompanying this active stance are surges in memory and other aspects of understanding the self in the world. Coherence in the organization of the emerging person becomes much more apparent, in both behavior and the child’s internal world, as seen in the child’s representation. Inquisitiveness and beginning understanding of causality are major strengths of the toddler. Because this understanding is limited, toddlers may at times attribute too much credence to their own perspective and too great a role to the self in causation. Thus, they can feel bad when negative experiences such as divorce happen, believing that they are the cause.
The chapter provides a novel account of perceptual discrimination (krinein) in Aristotle. Against the widespread view that the most basic perceptual acts consist in noticing differences between two or more perceived qualities, I argue that discrimination is for Aristotle more like sifting, winnowing on a sieve: it consists in identifying – with an ultimate authority – the quality of an external object as distinct from any other quality of the given range that the object could have. The chapter further explores how the notion of discrimination is embedded by Aristotle within his causal assimilation model of perception. I argue that the central notion of a discriminative mean (mesotēs), introduced in An. 2.11, is intended to capture the role of the perceptive soul as the controlling factor of a homeostatic mechanism underlying perception. As such the notion lays the groundwork for resolving the apparent conflict between the passivity of perception and the impassivity of the soul (as analysed in Chapter 5). The prospect is further explored in Chapter 7. The present chapter concludes by arguing that Aristotle conceives perceptual discrimination as a holistic assessment of the external object acting on the perceiver, including those of its features which are not causally efficacious.
We present PCFTL (Probabilistic CounterFactual Temporal Logic), a new probabilistic temporal logic for the verification of Markov decision processes (MDPs). PCFTL introduces operators for causal inference, allowing us to express interventional and counterfactual queries. Given a path formula ϕ, an interventional property is concerned with the satisfaction probability of ϕ if we apply a particular change I to the MDP (e.g., switching to a different policy); a counterfactual formula allows us to compute, given an observed MDP path τ, what the outcome of ϕ would have been had we applied I in the past and under the same random factors that led to observing τ. Our approach represents a departure from existing probabilistic temporal logics that do not support such counterfactual reasoning. From a syntactic viewpoint, we introduce a counterfactual operator that subsumes both interventional and counterfactual probabilities as well as the traditional probabilistic operator. This makes our logic strictly more expressive than PCTL⋆. The semantics of PCFTL rely on a structural causal model translation of the MDP, which provides a representation amenable to counterfactual inference. We evaluate PCFTL in the context of safe reinforcement learning using a benchmark of grid-world models.
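The abduction-action-prediction recipe behind counterfactual queries on a structural causal model can be sketched in a few lines. This is a generic illustration, not PCFTL's actual semantics: the transition rule, the single-step setting, and all numbers are invented here.

```python
# A minimal structural-causal-model sketch of counterfactual inference:
# abduction (recover the exogenous noise), action (apply the intervention),
# prediction (re-run the model under the same noise).

def transition(state, action, bump):
    # Structural equation: the next state is determined by the current state,
    # the chosen action, and an exogenous random "bump" (0 or 1).
    return state + action + bump

def counterfactual_next_state(state, observed_action, observed_next, alt_action):
    # Abduction: recover the exogenous term consistent with what was observed.
    bump = observed_next - transition(state, observed_action, 0)
    # Action + prediction: re-run the structural equation with the alternative
    # action under the same exogenous randomness.
    return transition(state, alt_action, bump)

# Observed: from state 2, action 0 led to state 3 (so the bump fired).
# Counterfactually, action 1 under the same randomness yields state 4.
print(counterfactual_next_state(2, 0, 3, 1))  # -> 4
```

The key point mirrored from the abstract: the counterfactual holds the "random factors that led to observing τ" fixed while changing only the intervention.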
This chapter considers the multivariate case, extending the univariate concepts to the vector time series case. We consider vector autoregressions from different points of view.
This Element intends to contribute to the debate between Islam and science. It focuses on one of the most challenging issues in the modern discussion on the reconciliation of religious and scientific claims about the world, which is to think about divine causality without undermining the rigor and efficacy of the scientific method. First, the Element examines major Islamic accounts of causality. Then, it provides a brief overview of contemporary debates on the issue and identifies both scientific and theological challenges. It argues that any proposed Islamic account of causality for the task of reconciliation should be able to preserve scientific rigor without imposing a priori limits on scientific research, account for miracles without turning them into science-stoppers or metaphors, secure divine and creaturely freedom, and establish a strong sense of divine presence in the world. Following sections discuss strengths and weaknesses of each account in addressing these challenges.
The first chapter contains an overview of what is accepted as good practice. We review several general ethical guidelines. These can be used to appreciate good research and to indicate where and how research does not adhere to them. Good practice is “what we all say we (should) adhere to.” In the second part of this chapter, the focus is more on specific ethical guidelines for statistical analysis. Of course, there is overlap with the more general guidelines, but there are also a few specifically relevant to statistics: examples are misinterpreting p values and malpractice such as p-hacking and HARKing.
Chapter 10 examines correlation, the statistical procedure used to measure the degree of association or relationship between variables. It is bivariate, since we typically apply this statistical technique to measure or describe the association between two variables or groups. The correlation coefficient, which measures the degree and direction of an association, is discussed, as are some of the issues regarding the application and interpretation of correlations. The chapter also outlines the many different measures of association but focuses on Pearson’s r. It emphasizes the definitional formula and z-scores for understanding and computing the correlation coefficient.
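The definitional formula the chapter emphasizes, Pearson's r as the mean product of paired z-scores, can be written directly in code. This is a generic sketch (using population standard deviations, i.e. dividing by n), not material from the chapter itself:

```python
import math

def pearson_r(xs, ys):
    # Definitional formula: r = mean of the products of paired z-scores.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)  # population SD of x
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)  # population SD of y
    return sum(((x - mx) / sx) * ((y - my) / sy) for x, y in zip(xs, ys)) / n

# A perfectly linear positive relationship yields r = 1.
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))  # -> 1.0
```

The z-score form makes the interpretation transparent: r is large and positive when units that sit above their mean on one variable tend to sit above their mean on the other.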
This is a translation of the excerpts from Grete Hermann’s 1935 essay on the philosophy of quantum mechanics that were published in Naturwissenschaften. Her main thesis, in line with her natural-philosophical training and neo-Kantian commitments, is that quantum mechanics does not refute the principle of causality. Quantum mechanics cannot be completed by hidden variables, because it is already causally complete (albeit retroductively). In establishing this provocative thesis, she makes important use of Bohr’s principles of correspondence and complementarity and of Weizsäcker’s version of the gamma-ray microscope, arguing that the lesson of quantum mechanics is the impossibility of an absolute description of nature independent of the context of observation.
The chapter begins with the observation that global history has an ambivalent attitude towards explanation. In many cases, the mere presentation of sources and voices from many different parts of the world seems sufficient to justify a global approach. The need for explanation is ignored or even denied. In other cases, global explanation is eagerly pursued, but often at the expense of more complex explanatory models that incorporate factors at different scales. In this perspective, global explanations are claimed to be inherently superior and a privileged way of explaining historical phenomena. After a cursory survey of current positions on causality and explanation in general methodology and ‘formal’ historical theory, the chapter proposes a brief typology of explanatory strategies. It goes on to discuss the peculiarities of explanation within a framework of connections across great distances and cultural boundaries. The much-acclaimed concept of narrative explanation is found to be of limited value, as it underestimates the difficulties of producing coherent narratives on a global scale. Concepts offered in the social science literature, such as the analysis of mechanisms and temporal sequences, could be helpful in refining purely narrative approaches to explanation.
From Part I - The Philosophy and Methodology of Experimentation in Sociology
Davide Barrera, Università degli Studi di Torino, Italy; Klarita Gërxhani, Vrije Universiteit, Amsterdam; Bernhard Kittel, Universität Wien, Austria; Luis Miller, Institute of Public Goods and Policies, Spanish National Research Council; Tobias Wolbring, School of Business, Economics and Society at the Friedrich-Alexander-University Erlangen-Nürnberg
Sociology is a science concerning itself with the interpretive understanding of social action and thereby with a causal explanation of its course and consequences. Empirically, a key goal is to find relations between variables. This is often done using naturally occurring data, survey data, or in-depth interviews. With such data, the challenge is to establish whether a relation between variables is causal or merely a correlation. One approach is to address the causality issue by applying proper statistical or econometric techniques, which is possible under certain conditions for some research questions. Alternatively, one can generate new data with experimental control in a laboratory or the field. It is precisely through this control via randomization and the manipulation of the causal factors of interest that the experimental method ensures – with a high degree of confidence – tests of causal explanations. In this chapter, the canonical approach to causality in randomized experiments (the Neyman–Rubin causal model) is first introduced. This model formalizes the idea of causality using the "potential outcomes" or "counterfactual" approach. The chapter then discusses the limits of the counterfactual approach and the key role of theory in establishing causal explanations in experimental sociology.
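The potential-outcomes idea introduced above can be made concrete with a small simulation. This is a generic sketch of the Neyman–Rubin model, not the chapter's own material; all distributions and numbers are invented for illustration.

```python
import random

random.seed(0)

# Each unit has two potential outcomes, y0 (untreated) and y1 (treated),
# but only the one matching its assignment is ever observed.
n = 10_000
units = [{"y0": random.gauss(0, 1), "y1": random.gauss(2, 1)} for _ in range(n)]

# The average treatment effect (ATE) -- observable only in a simulation,
# because in reality each unit reveals just one potential outcome.
true_ate = sum(u["y1"] - u["y0"] for u in units) / n

# Randomized assignment: a fair coin flip per unit.
for u in units:
    u["treated"] = random.random() < 0.5

# Under randomization, the difference in observed group means is an
# unbiased estimator of the ATE.
obs_t = [u["y1"] for u in units if u["treated"]]
obs_c = [u["y0"] for u in units if not u["treated"]]
estimate = sum(obs_t) / len(obs_t) - sum(obs_c) / len(obs_c)

print(f"true ATE ~ {true_ate:.2f}, randomized estimate ~ {estimate:.2f}")
```

The simulation shows why randomization carries the causal weight: assignment is independent of the potential outcomes, so the treated and control groups are comparable by construction.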
From Part III - Methodological Challenges of Experimentation in Sociology
This chapter addresses the often-misunderstood concept of validity. Much of the methodological discussion around sociological experiments is framed in terms of internal and external validity. The standard view is that the more we ensure that the experimental treatment is isolated from potential confounds (internal validity), the more unlikely it is that the experimental results can be representative of phenomena of the outside world (external validity). However, other accounts describe internal validity as a prerequisite of external validity: Unless we ensure internal validity of an experiment, little can be said of the outside world. We contend in this chapter that problems of either external or internal validity do not necessarily depend on the artificiality of experimental settings or on the laboratory–field distinction between experimental designs. We discuss the internal–external distinction and propose instead a list of potential threats to the validity of experiments that includes "usual suspects" like selection, history, attrition, and experimenter demand effects and elaborate on how these threats can be productively handled in experimental work. Moreover, in light of the different types of experiments, we also discuss the strengths and weaknesses of each regarding threats to internal and external validity.
This article critically evaluates Jeffrey Koperski’s decretalism, which presents the laws of nature as divine decrees functioning as constraints rather than dynamic forces. Building on his work, we explore whether his model successfully avoids the implications of occasionalism, as he claims. By analysing his latest publications, we first reconstruct Koperski’s argument and then present three key objections. These include (1) issues related to scientific realism, (2) the principle of simplicity, and (3) the reduction of Koperski’s model to occasionalism. We argue that despite his attempts to distinguish his framework, Koperski’s model ultimately collapses into occasionalism due to the continuous divine sustenance required for natural processes. By engaging with recent developments in metaphysical and scientific debates, this article highlights the limitations of Koperski’s decretalism.
Authentic leadership studies are often criticised for the limited use of causally defined research designs. To advance scholarship in this area, this article presents a scoping review on the use of experimental designs to examine causality in authentic leadership. Eleven publications were identified, which presented 16 experiments that met the inclusion criteria. Generally, these experiments tested authentic leadership as an antecedent; were conducted online; used a one-factor design; involved large samples, typically of working adults or residents; involved a manipulation check; involved the use of written vignettes to manipulate levels of authentic leadership; included counterfactual conditions; culminated with outcomes pertaining to followers; and established the causal effects of authentic leadership on the outcome(s) of interest. These findings suggest the value of written vignettes, multi-method approaches, and online experiments. They also highlight opportunities to advance authentic leadership research through the use of sequential experiments and immersive technologies.
An important contributor to the decreased life expectancy of individuals with schizophrenia is sudden cardiac death. Arrhythmic disorders may play an important role in this, but the nature of the relationship between schizophrenia and arrhythmia is unclear.
Aims
To assess shared genetic liability and potential causal effects between schizophrenia and arrhythmic disorders and electrocardiogram (ECG) traits.
Method
We leveraged summary-level data of large-scale genome-wide association studies of schizophrenia (53 386 cases, 77 258 controls), arrhythmic disorders (atrial fibrillation, 55 114 cases, 482 295 controls; Brugada syndrome, 2820 cases, 10 001 controls) and ECG traits (heart rate (variability), PR interval, QT interval, JT interval and QRS duration, n = 46 952–293 051). We examined shared genetic liability by assessing global and local genetic correlations and conducting functional annotation. Bidirectional causal relations between schizophrenia and arrhythmic disorders and ECG traits were explored using Mendelian randomisation.
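As a rough illustration of the kind of estimator used in two-sample Mendelian randomisation with summary statistics, the per-variant Wald ratio and its inverse-variance-weighted (IVW) average can be sketched as follows. This is a generic textbook-style sketch; the effect sizes and standard errors below are invented and do not come from this study.

```python
# Each genetic variant supplies a Wald-ratio estimate of the causal effect:
# (variant effect on outcome) / (variant effect on exposure). The IVW
# estimate is a weighted average of these ratios.
variants = [
    # (effect on exposure, effect on outcome, SE of outcome effect)
    (0.10, 0.014, 0.005),
    (0.08, 0.012, 0.004),
    (0.12, 0.018, 0.006),
]

def ivw_estimate(variants):
    num = den = 0.0
    for bx, by, se in variants:
        w = (bx / se) ** 2        # inverse-variance weight on the Wald ratio
        num += w * (by / bx)      # Wald ratio for this variant
        den += w
    return num / den

print(round(ivw_estimate(variants), 3))  # -> 0.147
```

The logic mirrored from the Method above: because genotypes are fixed at conception, variant-exposure-outcome triangulation can support causal claims that observational correlations cannot.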
Results
There was no evidence for global genetic correlation, except between schizophrenia and Brugada syndrome (rg = 0.14, 95% CIs = 0.06–0.22, P = 4.0E−04). In contrast, strong positive and negative local correlations between schizophrenia and all cardiac traits were found across the genome. In the most strongly associated regions, genes related to immune and viral response mechanisms were overrepresented. Mendelian randomisation indicated that liability to schizophrenia causally increases Brugada syndrome risk (beta = 0.14, CIs = 0.03–0.25, P = 0.009) and heart rate during activity (beta = 0.25, CIs = 0.05–0.45, P = 0.015).
Conclusions
Despite little evidence for global genetic correlation, specific genomic regions and biological pathways emerged that are important for both schizophrenia and arrhythmia. The putative causal effect of liability to schizophrenia on Brugada syndrome warrants increased cardiac monitoring and early medical intervention in people with schizophrenia.
At the basis of many important research questions is causality – does X causally impact Y? For behavioural and psychiatric traits, answering such questions can be particularly challenging, as they are highly complex and multifactorial. ‘Triangulation’ refers to prospectively choosing, conducting and integrating several methods to investigate a specific causal question. If different methods, with different sources of bias, all indicate a causal effect, the finding is much less likely to be spurious. While triangulation can be a powerful approach, its interpretation differs across (sub)fields and there are no formal guidelines. Here, we aim to provide clarity and guidance around the process of triangulation for behavioural and psychiatric epidemiology, so that results of existing triangulation studies can be better interpreted, and new triangulation studies better designed.
Methods
We first introduce the concept of triangulation and how it is applied in epidemiological investigations of behavioural and psychiatric traits. Next, we put forth a systematic step-by-step guide that can be used to design a triangulation study (accompanied by a worked example). Finally, we provide important general recommendations for future studies.
Results
While the literature contains varying interpretations, triangulation generally refers to an investigation that assesses the robustness of a potential causal finding by explicitly combining different approaches. This may include multiple types of statistical methods, the same method applied in multiple samples, or multiple different measurements of the variable(s) of interest. In behavioural and psychiatric epidemiology, triangulation commonly includes prospective cohort studies, natural experiments and/or genetically informative designs (including the increasingly popular method of Mendelian randomization). The guide that we propose aids the planning and interpretation of triangulation by prompting crucial considerations. Broadly, its steps are as follows: determine your causal question, draw a directed acyclic graph, identify available resources and samples, identify suitable methodological approaches, further specify the causal question for each method, explicate the effects of potential biases, and pre-specify expected results. We illustrate the guide’s use by considering the question: ‘Does maternal tobacco smoking during pregnancy cause offspring depression?’.
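The "draw a directed acyclic graph" step can be sketched in code for a question like the worked example above. The DAG below is entirely hypothetical: the variable names and edges are invented here for illustration and are not the guide's own DAG.

```python
# A DAG as an adjacency mapping: node -> list of children (arrow targets).
# Edges are invented for illustration only.
dag = {
    "maternal_smoking": ["offspring_depression"],
    "socioeconomic_status": ["maternal_smoking", "offspring_depression"],
    "maternal_depression": ["maternal_smoking", "offspring_depression"],
    "offspring_depression": [],
}

def confounders(dag, exposure, outcome):
    # Variables with arrows into both exposure and outcome are candidate
    # confounders that each triangulated method must address in its own way.
    return sorted(v for v, children in dag.items()
                  if exposure in children and outcome in children)

print(confounders(dag, "maternal_smoking", "offspring_depression"))
# -> ['maternal_depression', 'socioeconomic_status']
```

Making the DAG explicit is what lets the later steps work: each method in the triangulation (cohort adjustment, natural experiment, Mendelian randomization) handles these confounding paths through a different mechanism, which is why their agreement is informative.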
Conclusions
In the current era of big data, and with increasing (public) availability of large-scale datasets, triangulation will become increasingly relevant in identifying robust risk factors for adverse mental health outcomes. Our hope is that this review and guide will provide clarity and direction, as well as stimulate more researchers to apply triangulation to causal questions around behavioural and psychiatric traits.
Psychiatric research applies statistical methods that can be divided into two frameworks: causal inference and prediction. Recent proposals suggest a down-prioritisation of causal inference and argue that prediction paves the road to ‘precision psychiatry’ (i.e., individualised treatment). In this perspective, we critically appraise these proposals.
Methods:
We outline strengths and weaknesses of causal inference and prediction frameworks and describe the link between clinical decision-making and counterfactual predictions (i.e., causality). We describe three key causal structures that, if not handled correctly, may cause erroneous interpretations, and three pitfalls in prediction research.
Results:
Prediction and causal inference are both needed in psychiatric research and their relative importance is context-dependent. When individualised treatment decisions are needed, causal inference is necessary.
Conclusion:
This perspective defends the importance of causal inference for precision psychiatry.