From an applications viewpoint, the main reason to study the subject of this book is to help deal with the complexity of describing random, time-varying functions. A random variable can be interpreted as the result of a single measurement. The distribution of a single random variable is fairly simple to describe: it is completely specified by the cumulative distribution function F(x), a function of one variable, and it is relatively easy to represent a cumulative distribution function approximately on a computer. The joint distribution of several random variables is much more complex, for in general it is described by a joint cumulative distribution function, F(x1, x2, …, xn), which carries far more information than n separate functions of one variable. A random process, such as a model of time-varying fading in a communication channel, involves many random variables, possibly infinitely many: one for each time instant t within an observation interval. Woe the complexity!
This book helps prepare the reader to understand and use the following methods for dealing with the complexity of random processes:
• Work with moments, such as means and covariances.
• Make extensive use of processes with special properties. Most notably, Gaussian processes are characterized entirely by their means and covariances; Markov processes are characterized by their initial distributions together with one-step transition probabilities or transition rates; and independent increment processes are characterized by the distributions of single increments. (The first sketch following this list illustrates the Gaussian case.)
• Appeal to models or approximations based on limit theorems for reduced-complexity descriptions, especially in connection with averages of independent, identically distributed random variables. The law of large numbers tells us, in a certain sense, that a probability distribution can be characterized by its mean alone. The central limit theorem similarly tells us that a probability distribution can be characterized by its mean and variance. (The second sketch following this list illustrates both.) These limit theorems are analogous to, and in fact examples of, perhaps the most powerful tool ever discovered for dealing with the complexity of functions: Taylor's theorem, by which a function on a small interval can be approximated using its value and a small number of derivatives at a single point.
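To make the first of these points concrete, here is a minimal simulation sketch in Python (using numpy; the zero mean function and the exponential covariance function C(s, t) = exp(-|s - t|) are illustrative assumptions, not choices made in this book). Restricted to a finite time grid, a Gaussian process is just a multivariate Gaussian vector, so a mean vector and a covariance matrix are all that is needed to generate sample paths:

    import numpy as np

    # Time grid: a Gaussian process restricted to finitely many times
    # is simply a multivariate Gaussian random vector.
    t = np.linspace(0.0, 1.0, 100)

    # Illustrative (assumed) choices: mean zero, covariance
    # C(s, t) = exp(-|s - t|).
    mean = np.zeros(len(t))
    cov = np.exp(-np.abs(t[:, None] - t[None, :]))

    # The mean vector and covariance matrix completely determine the
    # distribution, so sampling requires nothing else.
    rng = np.random.default_rng(0)
    paths = rng.multivariate_normal(mean, cov, size=5)  # five sample paths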
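Similarly, the two limit theorems can be checked numerically. The sketch below, again only an illustration (the Uniform(0, 1) distribution and the sample sizes are arbitrary choices), averages independent, identically distributed random variables: the sample means concentrate near the true mean (law of large numbers), and the standardized sums are approximately standard Gaussian (central limit theorem):

    import numpy as np

    rng = np.random.default_rng(1)
    n, trials = 10_000, 2_000

    # iid Uniform(0, 1) samples: mean 1/2, variance 1/12.
    x = rng.random((trials, n))
    sample_means = x.mean(axis=1)

    # Law of large numbers: sample means concentrate near 1/2.
    print(sample_means.mean(), sample_means.std())  # approx 0.5, sqrt(1/(12n))

    # Central limit theorem: standardized sums are approximately N(0, 1).
    z = (sample_means - 0.5) * np.sqrt(n / (1.0 / 12.0))
    print(z.mean(), z.std())  # approx 0, 1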
[…]