Rolf Färe, Southern Illinois University, Carbondale; Shawna Grosskopf, Southern Illinois University, Carbondale; C. A. Knox Lovell, University of North Carolina, Chapel Hill
In each of the earlier chapters of this monograph we have focused on a single general topic. In contrast, this chapter contains several topics, all of which allow for customization of the models already introduced. As we shall see, the programming framework readily accommodates such customization: additional constraints may be added in a straightforward way, and behavioral objectives may also be modified, allowing us to identify the frontier in nontraditional or restricted cases and to measure deviations from those frontiers easily. These topics are intended to be suggestive rather than exhaustive; they are an invitation to the reader to extend the models introduced here to suit their own area of interest.
In Section 10.1 we discuss subvector efficiency. By subvector efficiency we mean that only a subset of the inputs or outputs is subject to scaling while the remainder are held fixed. This allows us to consider, for example, short-run efficiency.
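As a concrete illustration of the idea, a subvector (short-run) input efficiency measure can be computed as a small linear program in which only the variable inputs are scaled. The following is a minimal sketch, assuming constant returns to scale; the data layout, function name, and use of scipy's linear-programming routine are illustrative assumptions rather than the formulation used later in the chapter.

```python
# A minimal sketch of a subvector (short-run) input efficiency measure in a
# DEA-style programming framework, assuming constant returns to scale.
# The data arrays, variable/fixed input split, and use of scipy are
# illustrative assumptions, not the monograph's own code.
import numpy as np
from scipy.optimize import linprog

def subvector_input_efficiency(X, Y, k, variable_inputs):
    """Scale only the inputs in `variable_inputs` for observation k,
    holding the remaining (fixed) inputs and all outputs at observed levels.

    X : (J, N) input quantities, Y : (J, M) output quantities.
    Returns lambda* in (0, 1]; values below 1 indicate feasible contraction."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    J, N = X.shape
    M = Y.shape[1]
    fixed_inputs = [n for n in range(N) if n not in variable_inputs]

    # Decision variables: [lambda, z_1, ..., z_J]
    c = np.zeros(1 + J)
    c[0] = 1.0                      # minimize lambda

    A_ub, b_ub = [], []
    for m in range(M):              # outputs: sum_j z_j y_jm >= y_km
        A_ub.append(np.concatenate(([0.0], -Y[:, m])))
        b_ub.append(-Y[k, m])
    for n in variable_inputs:       # scaled inputs: sum_j z_j x_jn <= lambda * x_kn
        A_ub.append(np.concatenate(([-X[k, n]], X[:, n])))
        b_ub.append(0.0)
    for n in fixed_inputs:          # fixed inputs: sum_j z_j x_jn <= x_kn
        A_ub.append(np.concatenate(([0.0], X[:, n])))
        b_ub.append(X[k, n])

    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + J))
    return res.x[0]
```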
Constrained profit maximization is the topic of Section 10.2. In this section we demonstrate how to formulate profit maximization in a programming framework, and how to modify the basic problem to account for fixed inputs as well as for constraints which limit expenditure on a subvector of inputs. An example of this type of constraint is a credit constraint faced by a producer purchasing variable inputs.
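As a sketch of one such formulation (the notation here is illustrative and may differ from that used in Section 10.2), the short-run, expenditure-constrained profit maximization problem for an observation with fixed inputs \(\bar{x}_k\) can be written as

\[
\begin{aligned}
\max_{y,\,x^{v},\,z}\quad & \sum_{m=1}^{M} p_m y_m \;-\; \sum_{n \in V} w_n x^{v}_{n} \\
\text{s.t.}\quad & y_m \le \sum_{j=1}^{J} z_j y_{jm}, \qquad m = 1,\dots,M, \\
& \sum_{j=1}^{J} z_j x_{jn} \le x^{v}_{n}, \qquad n \in V \ \text{(variable inputs)}, \\
& \sum_{j=1}^{J} z_j x_{jn} \le \bar{x}_{kn}, \qquad n \notin V \ \text{(fixed inputs)}, \\
& \sum_{n \in V} w_n x^{v}_{n} \le E, \qquad z,\, y,\, x^{v} \ge 0,
\end{aligned}
\]

where \(p\) and \(w\) denote output and variable-input prices and \(E\) is the expenditure (credit) limit on the variable-input subvector; dropping the last constraint recovers the unconstrained short-run problem.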
Measures of plant capacity and its utilization are introduced in Section 10.3. Here we investigate capacity in terms of inputs and outputs rather than in terms of cost, in order to provide an operational measure of the notion of capacity defined by Johansen (1968): the maximum output achievable with the existing fixed inputs when the variable inputs may be applied without restriction.
In Chapter 3 we modeled technology in terms of the input correspondence and measured efficiency relative to the input set, i.e., output quantities were taken as given and inefficiency identified by feasible reductions in input quantities or cost. In this chapter we measure efficiency relative to the output set P(x), i.e., we take input quantities as given and judge performance by the ability to increase output quantities or revenue. As such, the topic of this chapter is very much in the spirit of the neoclassical production function, defined as maximum achievable output given input quantities and technology, although we generalize here to the case of multiple rather than scalar outputs.
In Section 4.1 output-based measures which are independent of prices, i.e., technical in nature, are introduced, and we show how overall technical efficiency can be decomposed into three component measures – scale, congestion, and purely technical efficiency. All of these measures of technical efficiency take input quantities as given and measure efficiency as feasible proportional expansion of all outputs.
In Section 4.2 we turn to output price-dependent measures of efficiency, the goal being to maximize revenue rather than to proportionally increase outputs. Here it becomes relevant to alter the output mix in light of existing output prices. In particular, we show how to decompose overall revenue efficiency (defined as the ratio of maximum to observed revenue) into technical and allocative components.
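A minimal computational sketch of this decomposition follows, assuming constant returns to scale and a simple activity-analysis representation of the output set; the data layout and the use of scipy's linear-programming routine are illustrative assumptions, not the chapter's own formulation.

```python
# Illustrative sketch: Farrell output measure, maximum revenue, and the
# decomposition of overall revenue efficiency into technical and allocative
# components, under constant returns to scale.
import numpy as np
from scipy.optimize import linprog

def revenue_efficiency(X, Y, p, k):
    """X : (J, N) inputs, Y : (J, M) outputs, p : (M,) output prices.
    Returns (overall, technical, allocative) with overall = technical * allocative."""
    X, Y, p = np.asarray(X, float), np.asarray(Y, float), np.asarray(p, float)
    J, N = X.shape
    M = Y.shape[1]

    # (1) Farrell output measure: max theta s.t. sum_j z_j y_jm >= theta * y_km,
    #     sum_j z_j x_jn <= x_kn.  Variables: [theta, z_1, ..., z_J].
    c = np.concatenate(([-1.0], np.zeros(J)))          # linprog minimizes, so negate theta
    A_ub, b_ub = [], []
    for m in range(M):
        A_ub.append(np.concatenate(([Y[k, m]], -Y[:, m])))
        b_ub.append(0.0)
    for n in range(N):
        A_ub.append(np.concatenate(([0.0], X[:, n])))
        b_ub.append(X[k, n])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + J))
    technical = res.x[0]                               # >= 1: feasible proportional expansion

    # (2) Maximum revenue: max p'y s.t. y_m <= sum_j z_j y_jm, sum_j z_j x_jn <= x_kn.
    #     Variables: [y_1, ..., y_M, z_1, ..., z_J].
    c = np.concatenate((-p, np.zeros(J)))
    A_ub, b_ub = [], []
    for m in range(M):
        row = np.zeros(M + J)
        row[m] = 1.0
        row[M:] = -Y[:, m]
        A_ub.append(row)
        b_ub.append(0.0)
    for n in range(N):
        row = np.zeros(M + J)
        row[M:] = X[:, n]
        A_ub.append(row)
        b_ub.append(X[k, n])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (M + J))
    max_revenue = -res.fun

    overall = max_revenue / float(p @ Y[k])            # ratio of maximum to observed revenue
    return overall, technical, overall / technical
```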
Sections 4.1 and 4.2 focus on what we call radial measures of output efficiency. One of the drawbacks of these radial measures is that they project an observation onto the isoquant of the output set, and not necessarily onto the efficient subset of the output set.
Among world equity markets, the largest is the Tokyo Stock Exchange, the second is the New York Stock Exchange, the third is the US NASDAQ market, and the fourth will be the JASDAQ market, which Japan plans to launch in 1992 and which is exploring the possibility of a computer link with NASDAQ. In recent years Japanese investment in the US and American investment in Japan have thus become important activities in rapidly integrating world capital markets. Professor William T. Ziemba presented a timely and significant study, entitled "Currency Hedging Strategies," at the Conference on Financial Optimization held at The Wharton School of the University of Pennsylvania on 10 November 1989. His principal conclusion is as follows:
For the American investing in Japan, the hedge provides a substantial bonus: an essentially risk free gain of about 3-5% per year due to the difference in interest rates between the two countries. … The situation is much more difficult and complicated for Japanese investment in the US. The forward/futures hedge will eliminate the currency risk but at a cost of about 3-5% per year.
In his chapter, he explored mainly the latter case, upon which I will comment.
In this chapter we address the use of optimization models in financial engineering, the term we give to the process of creating new packages of old risk attributes. By repackaging and stripping risk attributes from existing instruments, the financial engineer improves the marketability of the products and meets the needs of individual investors. Financial engineering also plays a service role in financial operations. For example, in order to obtain an AAA rating on an issue, certain properties must hold under both best- and worst-case scenarios, and optimization models can automate the process of analyzing these scenarios.
This chapter is organized as follows: section 2 discusses three models from financial engineering, and section 3 provides a brief overview of existing solution methodologies. Emphasis is placed there on the availability and capabilities of software for solving the optimization models presented earlier; the aim is to give financial analysts a first exposure to these techniques, and the discussion may have little to offer to an operations research expert. An appendix provides an optimization model for the estimation of the term structure of interest rates and explains Monte Carlo simulation techniques for generating interest-rate scenarios. Models like the one described there are often used to generate key input data for several of the optimization models discussed in both chapter 1 and this one.
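The appendix itself is not reproduced here; purely as an illustration of what Monte Carlo generation of interest-rate scenarios involves, the sketch below simulates short-rate paths under a simple mean-reverting process with made-up parameters. It is not the model described in the appendix.

```python
# A generic sketch of Monte Carlo scenario generation for the short-term
# interest rate, assuming a simple mean-reverting (Vasicek-type) process.
# The process and parameter values are illustrative only.
import numpy as np

def simulate_short_rate_scenarios(r0=0.08, kappa=0.25, theta=0.07, sigma=0.015,
                                  horizon_years=10, steps_per_year=12,
                                  n_scenarios=1000, seed=0):
    """Return an (n_scenarios, n_steps + 1) array of simulated short-rate paths."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / steps_per_year
    n_steps = horizon_years * steps_per_year
    rates = np.empty((n_scenarios, n_steps + 1))
    rates[:, 0] = r0
    for t in range(n_steps):
        shock = rng.standard_normal(n_scenarios)
        # Euler step: dr = kappa*(theta - r)*dt + sigma*sqrt(dt)*Z
        rates[:, t + 1] = (rates[:, t]
                           + kappa * (theta - rates[:, t]) * dt
                           + sigma * np.sqrt(dt) * shock)
    return rates

# Example: average simulated short rate at the ten-year horizon
paths = simulate_short_rate_scenarios()
print(paths[:, -1].mean())
```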
The service sector in the United States constitutes approximately 60% of GNP and over 70% of employment; the situation in Japan and Western Europe is similar. In addition, this sector consumes over 75% of all information technology investments. However, this investment in the largest component of the economy has not produced large, measurable improvements in productivity (causing some authors to dub this investment the “service sector sinkhole”). While productivity and quality in manufacturing remain a major concern for academia, industry, and government, the sheer size of and major productivity issues in the service sector demand attention.
The purpose of the Fishman-Davidson Center for the Study of the Service Sector at The Wharton School is to attack the above mentioned issues through a truly interdisciplinary approach. Founded in 1984, the Center is the only organization of its kind which is devoted to the full range of issues facing the service sector. The conference from which the current volume arose is an excellent example of the type of interdisciplinary approach which must be pursued if the “sinkhole” is to be avoided. As financial institutions move into a new era defined by both technological and regulatory advances, the best and brightest from finance, management science, computer science, and other fields are needed to improve the productivity of this sector as well as the ultimate value which is provided to the consumer.
In November 1989 a conference on financial optimization took place at The Wharton School, University of Pennsylvania. Several distinguished speakers from academia and industry presented the state of the art in modeling and solving a variety of problems that are central to the financial services industry.
The objectives of the conference are best described by introducing the two keynote speakers: Professor George B. Dantzig from Stanford University and Professor Harry M. Markowitz from Baruch College of the City University of New York. They are the founding fathers of two distinct disciplines. Dantzig is credited with the origins of management science through his pioneering work with linear programming in the 1940s, a discipline that has touched many endeavors of our society: transportation, manufacturing, the military, economics, and financial services. Markowitz is credited with the foundations of modern portfolio theory for his seminal 1952 contribution to risk management through mean-variance models. His contribution has touched, over the years, most practitioner and academic studies of portfolio management and provided the basis for the derivation of equilibrium models for the financial markets; it was recognized with a Nobel prize in economics in 1990.
The aim of the conference was to bring together these two disciplines, and survey the current state of interaction between them.
Asay, Bouyoucos, and Marciano (ABM) have applied the same technology that has been successful in the financial valuation of callable bonds and mortgage-backed securities (MBS) to the valuation of single premium deferred annuities (SPDAs). The application of this technology to insurance product valuations is natural, and long overdue, as it engenders an understanding of the economic importance of policy options that traditional models have heretofore not captured well.
My remarks concerning the ABM study cover three areas: possible extensions to the ABM approach, practical considerations with regard to the interest rates and associated cashflows used in setting up the binomial tree, and potential misapplications of the ABM approach in portfolio structuring.
People familiar with the MBS valuation models used by Wall Street firms will recognize certain buzz words used by ABM and understand at once the particular version of the model that was used. However, it may prove helpful to offer a clarifying comment for the reader less familiar with the extant models.
The 1980s have been a trying decade for life insurers. Volatile interest rates, new competition from other sectors of the financial services industry, and tightening investment spreads are some of the major new forces that have challenged the industry. The response has been to offer new products (the so-called “interest-sensitive” products), to promise higher rates, and to assume investment risks associated with higher book yields. The combination of these three actions has placed the industry in a precarious position as it faces the new decade.
Figure 5.1 shows interest rates during the 1980s for both six-month Treasury bills and thirty-year Treasury bonds. To help recap the developments that have shaped the industry, we have included three milestones (labeled A, B, and C). The first milestone (A) occurred during the early part of the decade when short-term rates shot to record heights. This caused disintermediation as policyholders fled to higher yielding alternatives offered in the capital markets (e.g., money market mutual funds). Associated with this flight to yield was the need for insurers to liquidate assets at a loss to meet the outflow. Policyholders' exercise of the options to surrender their policies, or to take out policy loans at substantially below-market interest rates, caused insurers both economic and accounting losses.
Since the early seventies the domain of financial operations has witnessed a significant transformation. The breakdown of the Bretton Woods Agreement, coupled with the liberalization of the financial markets and the inflation and oil crises of the period, led to increased volatility of interest rates. The market for fixed-income securities, to which private and corporate investors, insurance companies, and pension fund managers would turn for secure investments, became more volatile than the stock market. The price fluctuation of bonds increased sharply after October 1979, when the Federal Reserve Bank adopted a policy allowing wider moves in short-term interest rates. According to the volatility indexes compiled by Shearson Lehman Economics, bonds were more volatile than stocks by a factor of seven in the early eighties.
Uncertainty breeds creativity, but so does a dynamic market where intelligent answers to complex problems are rewarded immediately. As a result we have seen an increased use of advanced analytic techniques, in the form of optimization models, for many diverse aspects of financial operations. Several theoretical developments provided the building blocks on which an analyst could base a comprehensive planning model: models for the estimation of the term structure of interest rates and the celebrated Black-Scholes formula for valuing options and other complex instruments were added to the long list of contributions since Markowitz's seminal 1952 work on mean-variance analysis of stock returns.
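For reference, the Black-Scholes value of a European call on a non-dividend-paying stock can be computed directly; the short sketch below is a standalone textbook implementation, not code drawn from this volume.

```python
# The Black-Scholes value of a European call option on a non-dividend-paying
# stock; a standalone reference implementation.
from math import exp, log, sqrt, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """S: spot price, K: strike, r: riskless rate, sigma: volatility, T: years."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Example: at-the-money one-year call
print(black_scholes_call(S=100.0, K=100.0, r=0.05, sigma=0.20, T=1.0))
```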
The continuously increasing competitiveness and complexity of the fixed-income markets have been a driving force in the use of sophisticated quantitative techniques in financial planning applications.
Not long ago, investors purchased securities or portfolios of securities with the implicit assumption that the securities would be held to maturity and that interest rates would remain stable. In this environment, investment decisions were based primarily on credit quality and yield-to-maturity, and interest-rate risk was largely disregarded. As a result, when rates rose dramatically in the late 1970s and early 1980s, investors were faced with immense interest-rate risk exposure. In response, immunization techniques such as cashflow matching, duration matching, and portfolio insurance were developed and, for a time, proved sufficient.
However, the dramatic growth of the mortgage sector, the introduction of sophisticated new securities and hedging instruments, and increasing interest-rate volatility have forced portfolio managers to develop more complex mathematical techniques that are capable of addressing long-term investment concerns under a variety of interest-rate environments.
Traditional portfolio-management techniques do not provide an efficient solution to the problems associated with actively managing a portfolio in the current market environment. Portfolios that are simply duration-matched cannot achieve the desired tradeoff between risk and reward in volatile environments. Conventional investment parameters, such as yield-to-maturity, duration, and convexity, are insufficient to evaluate a portfolio adequately.
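For readers unfamiliar with these conventional parameters, the sketch below computes price, duration, and convexity for a plain annual-coupon bond; it is an illustrative textbook calculation, not part of the techniques discussed in this chapter.

```python
# Conventional bond parameters for an annual-coupon bond; illustrative only.
def bond_parameters(face, coupon_rate, ytm, years):
    """Return (price, macaulay_duration, modified_duration, convexity)
    for a bond with annual coupons, given an annual yield-to-maturity."""
    cashflows = [face * coupon_rate] * (years - 1) + [face * (1 + coupon_rate)]
    price = sum(cf / (1 + ytm) ** t for t, cf in enumerate(cashflows, start=1))
    macaulay = sum(t * cf / (1 + ytm) ** t
                   for t, cf in enumerate(cashflows, start=1)) / price
    modified = macaulay / (1 + ytm)
    convexity = sum(t * (t + 1) * cf / (1 + ytm) ** (t + 2)
                    for t, cf in enumerate(cashflows, start=1)) / price
    return price, macaulay, modified, convexity

# Example: a 10-year, 8% coupon bond priced at a 9% yield
print(bond_parameters(face=100.0, coupon_rate=0.08, ytm=0.09, years=10))
```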
Investment to and from Japan can be very profitable. For example, a dollar invested in the Nikkei Stock Average in 1949 was worth over $500 at the end of 1989. In recent years there has been much Japanese investment in the US, particularly in bonds, stocks, and real estate. The drop in the yen/dollar rate from the 260 range in the fall of 1985 to the 120 range at the end of 1987, its sharp rise back to 160 in mid-1989, and its fall back under 125 in early 1992 show the extreme risk involved in these investments. This chapter investigates currency hedging strategies for Japanese investors making investments in US assets and for Americans investing in Japan. The traditional approach is to eliminate the currency risk fully, using forward or futures contracts to offset the long exposure to the foreign currency. For the American investing in Japanese stocks the hedge often provides a bonus: an essentially risk-free gain of 1-4% per year due to the difference in interest rates between the two countries. Although improvements that add risk are conceivably possible, this approach is a very satisfactory resolution of the problem. The situation has been much more difficult and complicated for Japanese investment in the US.
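A back-of-the-envelope illustration of where the hedging bonus comes from: under covered interest parity the forward rate embodies the interest-rate differential, so selling yen forward against a Japanese-asset position locks in roughly the US-minus-Japan rate spread. The rates and spot level below are illustrative, not figures from the chapter.

```python
# Back-of-the-envelope illustration of the currency-hedging "bonus" from
# covered interest parity. Rates and spot level are illustrative examples.
def forward_rate(spot_yen_per_usd, r_us, r_jp, horizon_years=1.0):
    """Covered interest parity: yen/USD forward = spot * (1 + r_jp)^T / (1 + r_us)^T."""
    return spot_yen_per_usd * (1 + r_jp) ** horizon_years / (1 + r_us) ** horizon_years

# An American holding yen assets sells yen forward (buys dollars forward).
spot = 130.0         # yen per dollar
r_us, r_jp = 0.08, 0.05
fwd = forward_rate(spot, r_us, r_jp)
# With US rates above Japanese rates the yen trades at a forward premium, so
# the hedge locks in roughly the interest-rate differential as an annual gain.
hedge_gain = spot / fwd - 1.0
print(f"forward: {fwd:.2f} yen/USD, hedged bonus ~ {hedge_gain:.2%} per year")
```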
The market for fixed-income securities has changed dramatically over the last decade. Two of the most significant changes have been the unprecedented volatility of interest rates and the introduction of mortgage-backed securities. The mortgage market has grown in the last decade from virtually nothing to become a substantial segment of the fixed-income market. There are nearly $1 trillion of these securities outstanding, or about half the total of US Treasury and Agency securities. See Person (1989), for example.
These developments have led to substantial innovations in the techniques used to analyze the values of fixed-income securities. Specifically, a collection of techniques known as option-pricing methods has come to be widely applied. This approach was developed originally to evaluate stock options, hence the name, but in fact it can be applied to any security whose value is affected by fluctuating economic variables.
The starting point is a stochastic-process model of the evolution of these variables over time. In fixed-income analysis the key variable is the short-term interest rate, though many models include other variables as well. Based on this model and the characteristics of the particular security being analyzed, one then derives a valuation equation. The solution of this equation provides an estimate of the value of the security under various conditions. A full exposition of this approach can be found in Ingersoll (1987), for example.
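As a toy illustration of these steps, the sketch below builds a recombining binomial lattice for the short-term rate and values a zero-coupon bond by discounted backward induction under risk-neutral probabilities of one half; the rate dynamics and parameters are illustrative assumptions, not a calibrated model from the literature cited above.

```python
# A toy illustration of the option-pricing approach: a recombining binomial
# lattice for the short rate, with a zero-coupon bond valued by discounted
# backward induction. Rate dynamics and parameters are illustrative only.
import numpy as np

def zero_coupon_value(r0=0.08, up=1.1, down=0.9, dt=1.0, n_periods=10, face=100.0):
    """Short rate multiplies by `up` or `down` each period with probability 1/2;
    returns the time-0 value of a bond paying `face` after n_periods."""
    # Lattice of short rates: node (t, i) after i up-moves out of t steps.
    rates = [[r0 * up ** i * down ** (t - i) for i in range(t + 1)]
             for t in range(n_periods)]

    values = np.full(n_periods + 1, face)          # terminal payoff
    for t in range(n_periods - 1, -1, -1):         # backward induction
        values = np.array([
            0.5 * (values[i] + values[i + 1]) * np.exp(-rates[t][i] * dt)
            for i in range(t + 1)
        ])
    return float(values[0])

print(zero_coupon_value())
```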