A fundamental problem in engineering is estimation (prediction) of the outcome of an unobserved random variable based on outcomes of a set of observed random variables. In the context of random processes, we wish to estimate values of a random function Y(s) based on observation of a random function X(t). From a filtering perspective, we desire a system that, given an input X(t), produces an output Ŷ(s) that best estimates Y(s), where the goodness of the estimator is measured by a probabilistic error measure between the estimator and the random variable it estimates.
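In the mean-square-error setting the next paragraph adopts, the error measure can be written as follows (a standard formulation, stated here for concreteness):

$$
\operatorname{MSE}(\psi) \;=\; E\!\left[\bigl(Y - \psi(X)\bigr)^{2}\right],
$$

where the minimum is sought over estimation rules $\psi$, and it is a classical result that, over all (measurable) functions of $X$, the minimum is attained by the conditional expectation $\psi(X) = E[Y \mid X]$.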
If such an estimation rule ψ can be found, then ψ(X) is called an optimal mean-square-error estimator of Y in terms of X. To make the estimation problem mathematically or computationally tractable, or to obtain an estimation rule with desirable properties, we often restrict the class of estimation rules over which the MSE minimum is to be achieved. The trade-off imposed by the constraint is a higher MSE in return for a tractable design or desirable filter properties. In this section we examine the theoretically best solution over all functions of X. Subsequently, we focus on linear estimation in order to discover systems that provide optimal linear filtering. The theory, both linear and nonlinear, applies to complex random variables; however, our exposition assumes that all random variables are real valued.
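The trade-off between the unconstrained optimum and the best linear estimator can be illustrated numerically. The sketch below (an illustrative Monte Carlo example, not from the text; the model Y = X² + noise is a hypothetical choice) compares the MSE of the conditional-expectation estimator E[Y|X] with that of the best linear estimator aX + b, whose coefficients follow from the standard normal equations a = Cov(X, Y)/Var(X), b = E[Y] − aE[X]:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical nonlinear model: Y = X^2 + small noise, X standard normal.
x = rng.standard_normal(n)
y = x**2 + 0.1 * rng.standard_normal(n)

# Best estimator over ALL functions of X: here E[Y|X] = X^2,
# so the residual MSE is just the noise variance (0.01).
mse_optimal = np.mean((y - x**2) ** 2)

# Best LINEAR estimator aX + b: a = Cov(X,Y)/Var(X), b = E[Y] - a E[X].
a = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b = y.mean() - a * x.mean()
mse_linear = np.mean((y - (a * x + b)) ** 2)

print(f"optimal MSE: {mse_optimal:.4f}")
print(f"linear  MSE: {mse_linear:.4f}")
```

Because Cov(X, X²) = 0 for a symmetric X, the best linear estimator is essentially the constant E[Y], and its MSE is roughly Var(X²) + 0.01 ≈ 2.01, far above the unconstrained optimum of about 0.01. This is the price of restricting the class of estimation rules that the text describes.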