The concepts of inductive and deductive inference are introduced and contrasted. An artificial example is used to emphasize the logical structure of the problem of induction. To see how the problem of induction relates (and also does not relate) to a real episode of experimental inquiry, this chapter considers the case of Isaac Newton’s optical experiments using prisms to investigate the refraction of light. Although Newton did not concern himself with the problem of induction as philosophers now understand it, he used experimental strategies designed to address possible errors in the conclusions about light that he drew from his observations.
This chapter surveys influential ideas about scientific explanation. The idea that scientific explanation is a matter of logical deduction from scientific laws has played an important role both as the basis for positive accounts of scientific explanation and as a target of critical arguments spurring the investigation of alternative views. The chapter reviews some of the reasons in favor of holding such a covering-law view of explanation and then turns to some alternatives. The chapter also considers a pragmatically oriented account of the act of explaining. Another alternative focuses on the idea that explanations unify phenomena, showing how seemingly different things are manifestations of a single truth about nature. Several approaches emphasize the way explanations indicate what causes something to happen, whether by reference to a process, a possible manipulation, or a mechanism.
This chapter reviews an approach to 'scientific philosophy' that developed in the early decades of the twentieth century in Central Europe. Logical empiricists combined the resources of formal logic with an empiricist orientation to propose ways of distinguishing meaningful scientific discourse from what they regarded as cognitively meaningless metaphysical statements. In so doing, they articulated important and influential ideas about how to characterize the relationship between observations serving as evidence and the theories for which they are relevant. The chapter also examines their assumptions about the nature and structure of physical theories and how those assumptions shaped efforts such as Rudolf Carnap's development of a theory purporting to quantify how much a particular body of evidence confirms a particular theory.
One philosophical approach that directly responds to the problem of induction is falsificationism, first proposed by Karl Popper. This chapter examines how falsificationists propose to account for the growth of scientific knowledge without appealing to inductive reasoning. Their approach relies on attempts to falsify general hypotheses through experiments and observations. Additional logical concepts are introduced in this chapter to facilitate the logical analysis of such falsification. The concept of corroboration, central to the falsificationist view, is introduced. The apparatus of falsificationism is applied to the example of Newton’s optical experiments introduced in Chapter 1. Finally, falsificationism is discussed in relation to conventionalism, a philosophical idea that in some ways falsificationism attacks and in other ways exemplifies.
In practice, much statistical reasoning in science relies on probabilities interpreted as relative frequencies. This chapter explains how probability can be understood in terms of relative frequencies and the uses scientists and philosophers have devised for frequentist probabilities. Particularly prominent among those uses are the error probabilities associated with particular approaches to hypothesis testing. The approaches pioneered by Ronald Fisher and by Jerzy Neyman and Egon Pearson are outlined and explained through examples. The chapter then explores the error-statistical philosophy advocated by Deborah Mayo as a general framework for thinking about how we learn from empirical data. The error-statistical approach utilizes a frequentist framework for probabilities to articulate a view of severe testing of hypotheses as the means by which scientists increase experimental knowledge. Error statistics represents an important alternative to Bayesian approaches to scientific inquiry, and this chapter considers its prospects and challenges.
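As a concrete illustration of these error probabilities, here is a minimal Python sketch, not taken from the chapter itself, of a Neyman-Pearson style one-sided z-test on a normal mean; the parameter values for `mu0`, `mu1`, `sigma`, and `n` are illustrative assumptions.

```python
# A minimal sketch (illustrative parameters, not from the chapter) of the
# frequentist error probabilities discussed above: the Type I and Type II
# error rates of a one-sided z-test on a normal mean, Neyman-Pearson style.
from scipy.stats import norm

def error_probabilities(mu0, mu1, sigma, n, alpha=0.05):
    """Return (alpha, beta) for testing H0: mu = mu0 vs. H1: mu = mu1 > mu0."""
    # Reject H0 when the sample mean exceeds this critical value.
    critical = mu0 + norm.ppf(1 - alpha) * sigma / n**0.5
    # Type II error: probability the sample mean falls below the cutoff
    # even though H1 is true.
    beta = norm.cdf(critical, loc=mu1, scale=sigma / n**0.5)
    return alpha, beta

a, b = error_probabilities(mu0=0.0, mu1=0.5, sigma=1.0, n=25)
print(f"Type I error: {a:.3f}, Type II error: {b:.3f}, power: {1 - b:.3f}")
```

Lowering the significance level or shrinking the sample size raises the Type II error rate; managing that trade-off is the core of the Neyman-Pearson framework and of severe testing.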
According to the value-free ideal of science, scientists should draw their conclusions in a manner free of influence from value judgments. This ideal lends itself to a variety of interpretations and specifications. The ideal also faces numerous challenges that call into question not only whether it can be achieved but whether it really constitutes an ideal scientists ought to use to guide their actions. The chapter considers whether and in what conditions the value judgments of scientists might prevent or facilitate the achievement of scientific objectivity. From the role of value judgments in science the chapter turns to the closely related question of the appropriate role of scientists in the formulation of public policies. In many situations, the consideration of scientific evidence and scientific research bears importantly on questions of policy. The chapter then considers the complicated relationship between the reliance of policymakers on scientific expertise and the goals of democratic accountability and the public good.
Science is part of society, and scientific culture is part of a broader culture from which it gets much of its character. Sexism and patriarchy have been pervasive influences throughout the historical process that has led to our present scientific culture, with significant effects on science and scientists. Feminist thinkers have grappled with the problem of sexism in science and have developed a variety of philosophical responses to it. This chapter surveys some of those responses, with a focus on the ideas of feminist empiricism and feminist standpoint theory. Both approaches argue that incorporating feminist ideas will enable scientific communities to better achieve scientific aims of knowledge and objectivity, although they disagree on which feminist ideas are best suited to achieve this. The chapter also considers ways in which the two approaches have become more alike as they developed over the past several decades, hinting at a possible synthesis of the two approaches.
Since the publication of the first edition of this highly regarded textbook, the value of data assimilation has become widely recognized across the Earth sciences and beyond. Data assimilation methods are now being applied to many areas of prediction and forecasting, including extreme weather events, wildfires, infectious disease epidemics, and economic modeling. This second edition provides a broad introduction to applications across the Earth systems and coupled Earth–human systems, with an expanded range of topics covering the latest developments of variational, ensemble, and hybrid data assimilation methods. New toy models and intermediate-complexity atmospheric general circulation models provide hands-on engagement with key concepts in numerical weather prediction, data assimilation, and predictability. The inclusion of computational projects, exercises, lecture notes, teaching slides, and sample exams makes this textbook an indispensable and practical resource for advanced undergraduate and graduate students, researchers, and practitioners who work in weather forecasting and climate prediction.
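To give a flavour of the ensemble methods the book covers, the following Python sketch uses invented numbers rather than any example from the text: a stochastic ensemble Kalman filter update for a single scalar state observed directly.

```python
# A minimal sketch, with illustrative numbers rather than anything from the
# book, of one ensemble data assimilation idea: a stochastic ensemble
# Kalman filter analysis step for a scalar state observed directly.
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_err_std):
    """Shift each ensemble member toward a noisy observation."""
    # Forecast error variance estimated from the ensemble itself.
    var_f = np.var(ensemble, ddof=1)
    # The gain balances forecast spread against observation error.
    gain = var_f / (var_f + obs_err_std**2)
    # Perturb the observation independently per member (stochastic EnKF).
    perturbed = obs + obs_err_std * rng.standard_normal(ensemble.size)
    return ensemble + gain * (perturbed - ensemble)

forecast = rng.normal(loc=1.0, scale=0.8, size=50)   # toy forecast ensemble
analysis = enkf_update(forecast, obs=0.2, obs_err_std=0.5)
print(f"forecast mean {forecast.mean():.2f} -> analysis mean {analysis.mean():.2f}")
```

The gain pulls each member toward the observation in proportion to how uncertain the forecast ensemble is relative to the observation error, which is the intuition behind the variational, ensemble, and hybrid schemes the book develops.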
Focused on empirical methods and their applications to corporate finance, this innovative text equips students with the knowledge to analyse and critically evaluate quantitative research methods in corporate finance, and conduct computer-aided statistical analyses on various types of datasets. Chapters demonstrate the application of basic econometric models in corporate finance (as opposed to derivations or theorems), backed up by relevant research. Alongside practical examples and mini case studies, computer lab exercises enable students to apply the theories of corporate finance and make stronger connections between theory and practice, while developing their programming skills. All of the Stata code is provided (with corresponding Python and R code available online), so students of all programming abilities can focus on understanding and interpreting the analyses.
This chapter aims to prepare the reader for the models, applications, lab work, and mini case studies in the coming chapters. The focus is on sample selection, identification strategy, and hypothesis development. The chapter first covers some terminology and then discusses data types, units of analysis, data management, and different sampling methods. The sample-selection part explores the steps in a well-structured sample design. The identification-strategy part covers the causal relationship of interest, ideal experiments, and statistical inference. This part is of particular significance because, in corporate finance research, it is important that the hypothesis is closely tied to economic theory and the previous literature. Only then can we draw meaningful conclusions from the relationships studied and derive testable deductions from hypotheses. The chapter ends with a hypothesis-development section that details some decision/rejection rules. Stata code is provided for the examples.
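As a hedged illustration of such a decision/rejection rule, written in Python rather than the chapter's Stata, the sketch below tests whether a regression coefficient differs from zero at a pre-set significance level; the leverage-and-firm-value regression and all numbers are hypothetical.

```python
# A sketch (not from the chapter) of a standard rejection rule: reject
# H0: coefficient = 0 when the p-value of its t-statistic falls below a
# significance level fixed before looking at the data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
leverage = rng.normal(size=200)                      # hypothetical regressor
firm_value = 0.3 * leverage + rng.normal(size=200)   # hypothetical outcome

model = sm.OLS(firm_value, sm.add_constant(leverage)).fit()
t_stat, p_value = model.tvalues[1], model.pvalues[1]

alpha = 0.05  # significance level chosen in advance
decision = "reject H0" if p_value < alpha else "fail to reject H0"
print(f"t = {t_stat:.2f}, p = {p_value:.4f} -> {decision}")
```

Tying the tested coefficient to a prediction of economic theory, as the chapter stresses, is what makes rejecting or retaining H0 informative rather than a mechanical exercise.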
The health and well-being of families is an important consideration for federal, state, and local levels of government. Family health policies based on recent knowledge of early childhood development have evolved to emphasise the importance of providing every child with the best possible start to life. Childhood sets the foundation for future health and well-being, a principle recognised in the 1989 United Nations Convention on the Rights of the Child. To reduce health inequalities, government policies and services must address the social determinants of early child health, development and well-being.
A master's-level overview of the mathematical concepts needed to master the art of derivatives pricing, this textbook is a must-have for anyone considering a career in quantitative finance in industry or academia. Starting from the foundations of probability, the book allows students with limited technical background to build a solid knowledge base of the most important notions. It offers a unique compromise between intuition and mathematics, even when discussing abstract notions such as change of measure. Mathematical concepts are initially introduced using “toy” examples, before moving on to finance cases, in both discrete and continuous time. Throughout, numerical applications and simulations illuminate the analytical results. The end-of-chapter exercises test students’ understanding, with solved exercises at the end of each part to aid self-study. Additional resources are available online, including slides, code, and an interactive app.
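The kind of numerical simulation mentioned above can be sketched briefly: a Monte Carlo estimate of a European call price under the Black-Scholes model, checked against the closed-form formula. The parameters are illustrative assumptions, not taken from the book.

```python
# A minimal sketch, with assumed toy parameters, of Monte Carlo pricing of
# a European call under Black-Scholes, compared with the closed form.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.2, 1.0  # assumed parameters

# Simulate terminal prices under the risk-neutral measure.
rng = np.random.default_rng(42)
z = rng.standard_normal(200_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
mc_price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

# Closed-form Black-Scholes price for comparison.
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"Monte Carlo: {mc_price:.3f}, closed form: {bs_price:.3f}")
```

Discounting the simulated payoff at the risk-free rate is where the change of measure enters: the expectation is taken under the risk-neutral measure rather than the real-world one.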
This chapter explores how primary health care (PHC), health literacy and health education can empower individuals, groups and communities to improve and maintain optimum health. PHC philosophy encompasses principles of accessibility, affordability, sustainability, social justice and equity, self-determination, community participation and intersectoral collaboration, which drive health care service delivery and health care reform. Empowerment is a fundamental component of social justice, which seeks to redistribute power so that those who are disadvantaged can gain more control of the factors that influence their lives. Lack of empowerment, often associated with poorer social determinants of health, is linked to poorer health outcomes through limited control or agency; it constrains personal resources, participation, and the capacity to access services and opportunities. Health care professionals and systems need to work in ways that promote the empowerment of individuals, groups and communities to achieve better health outcomes.