Focused on empirical methods and their applications to corporate finance, this innovative text equips students with the knowledge to analyse and critically evaluate quantitative research methods in corporate finance, and conduct computer-aided statistical analyses on various types of datasets. Chapters demonstrate the application of basic econometric models in corporate finance (as opposed to derivations or theorems), backed up by relevant research. Alongside practical examples and mini case studies, computer lab exercises enable students to apply the theories of corporate finance and make stronger connections between theory and practice, while developing their programming skills. All of the Stata code is provided (with corresponding Python and R code available online), so students of all programming abilities can focus on understanding and interpreting the analyses.
Focusing on methods for data that are ordered in time, this textbook provides a comprehensive guide to analyzing time series data using modern techniques from data science. It is specifically tailored to economics and finance applications, aiming to provide students with rigorous training. Chapters cover Bayesian approaches, nonparametric smoothing methods, machine learning, and continuous time econometrics. Theoretical and empirical exercises, concise summaries, bolded key terms, and illustrative examples are included throughout to reinforce key concepts and bolster understanding. Ancillary materials include an instructor's manual with solutions and additional exercises, PowerPoint lecture slides, and datasets. With its clear and accessible style, this textbook is an essential tool for advanced undergraduate and graduate students in economics, finance, and statistics.
Applied econometrics uses the tools of theoretical econometrics and real-world data to develop predictive models and assess economic theories. Because such analysis is complex, the assumptions it rests on are often not understood by those who rely on it. The danger of this is that economic policies can be assessed favourably to suit a particular political agenda and forecasts can be generated to match the needs of a particular customer. Ethics in Econometrics argues that econometricians need to be aware of potential ethical pitfalls when carrying out their analysis and need to be encouraged to avoid them. Using a range of empirical examples and detailed discussions of real cases, this book provides a guide for research practices in econometrics, illustrating why it is imperative that econometricians act ethically in the way they conduct their analysis and treat their data.
Focusing on the physics of the catastrophe process and addressed directly to advanced students, this innovative textbook quantifies dozens of perils, both natural and man-made, and covers the latest developments in catastrophe modelling. Combining basic statistics, applied physics, natural and environmental sciences, civil engineering, and psychology, the text remains at an introductory level, focusing on fundamental concepts for a comprehensive understanding of catastrophe phenomenology and risk quantification. A broad spectrum of perils is covered, including geophysical, hydrological, meteorological, climatological, biological, extraterrestrial, technological and socio-economic, as well as events caused by domino effects and global warming. Following industry standards, the text provides the necessary tools to develop a CAT model from hazard to loss assessment. Online resources include a CAT risk model starter-kit and a CAT risk modelling 'sandbox' with a Python Jupyter tutorial. Every process, described by equations, (pseudo)codes and illustrations, is fully reproducible, allowing students to solidify knowledge through practice.
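By way of illustration (not drawn from the book or its starter-kit), a minimal frequency-severity sketch in Python of the hazard-to-loss chain that a CAT model formalizes; the Poisson event rate and lognormal severity parameters are hypothetical:

```python
# A minimal frequency-severity sketch of a CAT loss model (illustrative only).
# Assumptions: annual event counts are Poisson, per-event losses are lognormal.
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(n_years=100_000, event_rate=0.8,
                           log_mu=14.0, log_sigma=1.5):
    """Simulate total annual loss: Poisson frequency x lognormal severity."""
    counts = rng.poisson(event_rate, size=n_years)          # events per year
    return np.array([rng.lognormal(log_mu, log_sigma, n).sum()
                     for n in counts])

losses = simulate_annual_losses()
aal = losses.mean()                          # average annual loss
pml_200 = np.quantile(losses, 1 - 1 / 200)   # 1-in-200-year loss
print(f"AAL: {aal:,.0f}  1-in-200 PML: {pml_200:,.0f}")
```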
Actuaries must pass exams, but more than that: they must put knowledge into practice. This coherent book supports the Society of Actuaries' short-term actuarial mathematics syllabus while emphasizing the concepts and practical application of nonlife actuarial models. A class-tested textbook for undergraduate courses in actuarial science, it is also ideal for those approaching their professional exams. Key topics covered include loss modelling, risk and ruin theory, credibility theory and applications, and empirical implementation of loss models. Revised and updated to reflect curriculum changes, this second edition includes two brand new chapters on loss reserving and ratemaking. R replaces Excel as the computation tool used throughout – the featured R code is available on the book's webpage, as are lecture slides. Numerous examples and exercises are provided, with many questions adapted from past Society of Actuaries exams.
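As a hedged illustration of the credibility theory the book covers (the book's own code is in R; this independent sketch is Python, and all numbers are hypothetical):

```python
# A minimal sketch of the Buhlmann credibility premium: Z = n / (n + k),
# where k is the ratio of expected process variance to the variance of
# hypothetical means.
def credibility_premium(claims, collective_mean, k):
    """Blend a risk's own experience with the collective mean."""
    n = len(claims)
    z = n / (n + k)                      # credibility factor
    individual_mean = sum(claims) / n
    return z * individual_mean + (1 - z) * collective_mean

# Hypothetical numbers: 5 years of claims, collective mean 1000, k = 3.
print(credibility_premium([900, 1100, 950, 1200, 1000], 1000.0, 3.0))
```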
Financial models are an inescapable feature of modern financial markets. Yet over-reliance on these models, and the failure to test them properly, is now widely recognized as one of the main causes of the financial crisis of 2007–2011. Since this crisis, there has been an increase in the scrutiny and testing applied to such models, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. The book discusses current practices and pitfalls that model risk users need to be aware of and identifies areas where validation can be advanced in the future. The result is the first unified framework for validating risk management models.
This well-balanced introduction to enterprise risk management integrates quantitative and qualitative approaches and motivates key mathematical and statistical methods with abundant real-world cases - both successes and failures. Worked examples and end-of-chapter exercises support readers in consolidating what they learn. The mathematical level, which is suitable for graduate and senior undergraduate students in quantitative programs, is pitched to give readers a solid understanding of the concepts and principles involved, without diving too deeply into more complex theory. To reveal the connections between different topics, and their relevance to the real world, the presentation has a coherent narrative flow, from risk governance, through risk identification, risk modelling, and risk mitigation, capped off with holistic topics - regulation, behavioural biases, and crisis management - that influence the whole structure of ERM. The result is a text and reference that is ideal for graduate and senior undergraduate students, risk managers in industry, and anyone preparing for ERM actuarial exams.
This essential reference for students and scholars in the input-output research and applications community has been fully revised and updated to reflect important developments in the field. Expanded coverage includes construction and application of multiregional and interregional models, including international models and their application to global economic issues such as climate change and international trade; structural decomposition and path analysis; linkages and key sector identification and hypothetical extraction analysis; the connection of national income and product accounts to input-output accounts; supply and use tables for commodity-by-industry accounting and models; social accounting matrices; non-survey estimation techniques; and energy and environmental applications. Input-Output Analysis is an ideal introduction to the subject for advanced undergraduate and graduate students in many scholarly fields, including economics, regional science, regional economics, city, regional and urban planning, environmental planning, public policy analysis and public management.
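To make the core model concrete, here is a minimal Python sketch of the Leontief quantity model that input-output analysis builds on; the technical-coefficient matrix and final-demand vector are hypothetical:

```python
# The basic Leontief quantity model: gross output x solves x = Ax + f,
# so x = (I - A)^{-1} f, where (I - A)^{-1} is the Leontief inverse.
import numpy as np

A = np.array([[0.15, 0.25],     # inter-industry input coefficients
              [0.20, 0.05]])
f = np.array([100.0, 200.0])    # final demand by sector

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse (total requirements)
x = L @ f                          # gross output needed to meet demand
print(x)
```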
Bayesian Econometric Methods examines principles of Bayesian inference by posing a series of theoretical and applied questions and providing detailed solutions to those questions. This second edition adds extensive coverage of models popular in finance and macroeconomics, including state space and unobserved components models, stochastic volatility models, ARCH, GARCH, and vector autoregressive models. The authors have also added many new exercises related to Gibbs sampling and Markov Chain Monte Carlo (MCMC) methods. The text includes regression-based and hierarchical specifications, models based upon latent variable representations, and mixture and time series specifications. MCMC methods are discussed and illustrated in detail - from introductory applications to those at the current research frontier - and MATLAB® computer programs are provided on the website accompanying the text. Suitable for graduate study in economics, the text should also be of interest to students studying statistics, finance, marketing, and agricultural economics.
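As a small illustration of the Gibbs sampling the exercises cover (the book's accompanying programs are in MATLAB; this is an independent Python sketch with hypothetical priors):

```python
# A minimal Gibbs sampler for a normal model with unknown mean and variance.
# Priors (hypothetical): mu ~ N(0, 10^2), sigma^2 ~ Inverse-Gamma(2, 2).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(3.0, 2.0, size=200)          # simulated data
n, ybar = len(y), y.mean()
mu0, tau0_sq, a0, b0 = 0.0, 100.0, 2.0, 2.0

mu, sigma_sq = 0.0, 1.0
draws = []
for it in range(5000):
    # mu | sigma^2, y  (conjugate normal update)
    v = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    m = v * (mu0 / tau0_sq + n * ybar / sigma_sq)
    mu = rng.normal(m, np.sqrt(v))
    # sigma^2 | mu, y  (conjugate inverse-gamma update)
    a, b = a0 + n / 2.0, b0 + 0.5 * np.sum((y - mu) ** 2)
    sigma_sq = 1.0 / rng.gamma(a, 1.0 / b)
    draws.append((mu, sigma_sq))

post = np.array(draws[1000:])                # discard burn-in
print(post.mean(axis=0))                     # posterior means of (mu, sigma^2)
```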
The substantially updated third edition of the popular Actuarial Mathematics for Life Contingent Risks is suitable for advanced undergraduate and graduate students of actuarial science, for trainee actuaries preparing for professional actuarial examinations, and for life insurance practitioners who wish to increase or update their technical knowledge. The authors provide intuitive explanations alongside mathematical theory, equipping readers to understand the material in sufficient depth to apply it in real-world situations and to adapt their results in a changing insurance environment. Topics include modern actuarial paradigms, such as multiple state models, cash-flow projection methods and option theory, all of which are required for managing the increasingly complex range of contemporary long-term insurance products. Numerous exam-style questions allow readers to prepare for traditional professional actuarial exams, and extensive use of Excel ensures that readers are ready for modern, Excel-based exams and for the actuarial work environment. The Solutions Manual (ISBN 9781108747615), available for separate purchase, provides detailed solutions to the text's exercises.
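For a flavour of the expected-present-value calculations such a text develops (the book itself works in Excel; this is an independent Python sketch with hypothetical mortality assumptions):

```python
# EPV of a whole-life annuity-due of 1 per year: a-due_x = sum_k v^k * kPx.
# Survival probabilities are hypothetical and truncated at 60 years.
import numpy as np

v = 1.0 / 1.05                               # discount factor at 5% interest
p = np.linspace(0.995, 0.70, 60)             # one-year survival probabilities
kpx = np.concatenate(([1.0], np.cumprod(p))) # kPx for k = 0..60

annuity_due = np.sum(v ** np.arange(len(kpx)) * kpx)
print(f"whole-life annuity-due EPV: {annuity_due:.4f}")
```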
Doubt over the trustworthiness of published empirical results is not unwarranted and is often a result of statistical mis-specification: invalid probabilistic assumptions imposed on data. Now in its second edition, this bestselling textbook offers a comprehensive course in empirical research methods, teaching the probabilistic and statistical foundations that enable the specification and validation of statistical models, providing the basis for an informed implementation of statistical procedure to secure the trustworthiness of evidence. Each chapter has been thoroughly updated, accounting for developments in the field and the author's own research. The comprehensive scope of the textbook has been expanded by the addition of a new chapter on the Linear Regression and related statistical models. This new edition is now more accessible to students of disciplines beyond economics and includes more pedagogical features, with an increased number of examples as well as review questions and exercises at the end of each chapter.
How to Divide When There Isn't Enough develops a rigorous yet accessible presentation of the state of the art for the adjudication of conflicting claims and the theory of taxation. It covers all aspects one may wish to know about claims problems: the most important rules, the most important axioms, and how these two sets are related. More generally, it also serves as an introduction to the modern theory of economic design, which in the last twenty years has revolutionized many areas of economics, generating a wide range of applicable allocation rules that have improved people's lives in many ways. In developing the theory, the book employs a variety of techniques that will appeal to both experts and non-experts. Compiling decades of research into a single framework, William Thomson provides numerous applications that will open a large number of avenues for future research.
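As a concrete illustration (not drawn from the book's text), here is the proportional rule, one of the central rules for claims problems, in a few lines of Python:

```python
# The proportional rule divides the endowment E in proportion to claims:
# each claimant i receives (c_i / sum of claims) * E.
def proportional_rule(claims, endowment):
    total = sum(claims)
    return [endowment * c / total for c in claims]

# Hypothetical example: claims of 100, 200 and 300 against an estate of 240.
print(proportional_rule([100, 200, 300], 240))   # [40.0, 80.0, 120.0]
```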
Interest in nonparametric methodology has grown considerably over the past few decades, stemming in part from vast improvements in computer hardware and the availability of new software that allows practitioners to take full advantage of these numerically intensive methods. This book is written for advanced undergraduate students, intermediate graduate students, and faculty, and provides a complete teaching and learning course at a more accessible level of theoretical rigor than Racine's earlier book co-authored with Qi Li, Nonparametric Econometrics: Theory and Practice (2007). The open source R platform for statistical computing and graphics is used throughout in conjunction with the R package np. Recent developments in reproducible research are emphasized throughout, with appendices devoted to helping the reader get up to speed with R, R Markdown, TeX and Git.
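Since the book's toolchain is R with the np package, the following is only a language-neutral Python sketch of the Nadaraya-Watson kernel regression idea at the heart of such methods, with a Gaussian kernel and a hypothetical fixed bandwidth:

```python
# Nadaraya-Watson kernel regression: a locally weighted average,
# m(x0) = sum K((x0-xi)/h) * yi / sum K((x0-xi)/h).
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    est = []
    for x0 in x_grid:
        w = np.exp(-0.5 * ((x0 - x) / h) ** 2)   # Gaussian kernel weights
        est.append(np.sum(w * y) / np.sum(w))
    return np.array(est)

rng = np.random.default_rng(1)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + rng.normal(0, 0.3, 300)
grid = np.linspace(0, 2 * np.pi, 50)
print(nadaraya_watson(grid, x, y, h=0.3)[:5])    # fitted curve near sin(x)
```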
This book offers up-to-date, comprehensive coverage of stochastic dominance and its related concepts in a unified framework. A method for ordering probability distributions, stochastic dominance has grown in importance recently as a way to make comparisons in welfare economics, inequality studies, health economics, insurance, wages, and trade patterns. Whang pays particular attention to inferential methods and applications, citing and summarizing various empirical studies in order to relate the econometric methods to real applications and using computer codes to enable the practical implementation of these methods. Intuitive explanations throughout the book ensure that readers understand the basic technical tools of stochastic dominance.
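As an illustrative sketch (not the book's own code), a basic first-order stochastic dominance check on empirical CDFs in Python:

```python
# Sample A first-order stochastically dominates sample B when A's empirical
# CDF lies (weakly) below B's at every point of the pooled support.
import numpy as np

def fsd(a, b):
    """True if the ECDF of a <= the ECDF of b everywhere on the pooled sample."""
    grid = np.sort(np.concatenate([a, b]))
    F_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    F_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return bool(np.all(F_a <= F_b))

rng = np.random.default_rng(2)
high = rng.normal(1.0, 1.0, 1000)   # distribution shifted up by one unit
low = rng.normal(0.0, 1.0, 1000)
print(fsd(high, low))               # likely True: 'high' dominates 'low'
```

Formal inference, the book's focus, requires testing such inequalities with sampling uncertainty taken into account rather than checking them pointwise.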
Econometrics can at first appear a highly technical subject, but it can also equip the practitioner with a useful skillset of smart ways to formulate research questions and collect data. Enjoyable Econometrics applies econometric methods to a variety of unusual and engaging research questions, often beyond the realm of economics, demonstrating the great potential of using such methods to understand a wide range of phenomena. Unlike the typical textbook approach, Enjoyable Econometrics follows in the footsteps of Freakonomics by posing interesting questions first before introducing the methodology to find the answers. Therefore, rather than equation-heavy sections based around complex methodologies, the reader is presented with chapters on 'Money' and 'Fashion, Art and Music'. Franses writes in a way that will enthuse and motivate the economics student embarking upon the essential study of econometrics. Indeed, the book shows that econometric methods can be applied to almost anything.
With a new author team contributing decades of practical experience, this fully updated and thoroughly classroom-tested second edition textbook prepares students and practitioners to create effective forecasting models and master the techniques of time series analysis. Taking a practical and example-driven approach, this textbook summarises the most critical decisions, techniques and steps involved in creating forecasting models for business and economics. Students are led through the process with an entirely new set of carefully developed theoretical and practical exercises. Chapters examine the key features of economic time series, univariate time series analysis, trends, seasonality, aberrant observations, conditional heteroskedasticity and ARCH models, non-linearity and multivariate time series, making this a complete practical guide. Downloadable datasets are available online.
Actuaries have access to a wealth of individual data in pension and insurance portfolios, but rarely use it to its full potential. This book will pave the way, from methods using aggregate counts to modern developments in survival analysis. Based on the fundamental concept of the hazard rate, Part I shows how and why to build statistical models based on data at the level of the individual persons in a pension scheme or life insurance portfolio. Extensive use is made of the R statistics package. Smooth models, including regression and spline models in one and two dimensions, are covered in depth in Part II. Finally, Part III uses multiple-state models to extend survival models beyond the simple life/death setting, and includes a brief introduction to the modern counting process approach. Practising actuaries will find this book indispensable, and students will find it helpful when preparing for their professional examinations.
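To fix ideas (the book itself works in R; this Python sketch and its counts are hypothetical), the crude hazard-rate estimate that such an approach starts from is simply deaths divided by central exposure:

```python
# Crude estimate of the hazard rate (force of mortality) at each age:
# mu_hat_x = D_x / E_x, deaths over person-years of central exposure.
deaths = {70: 52, 71: 60, 72: 71}                 # observed deaths D_x
exposure = {70: 4800.0, 71: 4650.0, 72: 4480.0}   # person-years E_x

hazard = {age: deaths[age] / exposure[age] for age in deaths}
for age, mu in hazard.items():
    print(f"age {age}: mu_hat = {mu:.5f}")
```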
Random set theory is a fascinating branch of mathematics that amalgamates techniques from topology, convex geometry, and probability theory. Social scientists routinely conduct empirical work with data and modelling assumptions that reveal a set to which the parameter of interest belongs, but not its exact value. Random set theory provides a coherent mathematical framework to conduct identification analysis and statistical inference in this setting and has become a fundamental tool in econometrics and finance. This is the first book dedicated to the use of the theory in econometrics, written to be accessible for readers without a background in pure mathematics. Molchanov and Molinari define the basics of the theory and illustrate the mathematical concepts by their application in the analysis of econometric models. The book includes sets of exercises to accompany each chapter as well as examples to help readers apply the theory effectively.
'A Guide to Trade Credit Insurance' is a reference book on trade credit insurance, written from an international perspective. It is a compilation of contributions from various authors and reviewers drawn from ICISA member companies. The book provides an overview of the whole process regarding trade credit insurance, including the history of trade credit insurance, trade credit insurance providers, the underwriting process, premium calculation, claims handling, case studies and a glossary of terminology.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers on the most appropriate modeling choices and on methods for estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
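As an independent illustration of the workflow such books formalize, here is a minimal Python sketch that estimates a reduced-form VAR(1) by OLS and applies a recursive (Cholesky) identification; the data are simulated, and real applications require careful lag and identification choices:

```python
# Estimate a reduced-form VAR(1) by OLS, then identify structural shocks
# with a Cholesky factorization of the residual covariance matrix.
import numpy as np

rng = np.random.default_rng(3)
T, k = 500, 2
A_true = np.array([[0.5, 0.1], [0.2, 0.4]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0, 1, k)

Y, X = y[1:], y[:-1]                             # regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T   # OLS slope estimates
U = Y - X @ A_hat.T                              # reduced-form residuals
Sigma = U.T @ U / (T - 1)                        # residual covariance
B0_inv = np.linalg.cholesky(Sigma)               # recursive identification
print(A_hat)                                     # close to A_true
print(B0_inv)                                    # impact matrix of shocks
```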