Covariance estimation is a key step in many target detection algorithms. Distinguishing target from background
requires that the background be well characterized. This applies to targets ranging from the precisely known
chemical signatures of gaseous plumes to the wholly unspecified signals that are sought by anomaly detectors.
When the background is modelled by a (global or local) Gaussian or other elliptically contoured distribution
(such as Laplacian or multivariate-t), a covariance matrix must be estimated. The standard sample covariance
overfits the data, and when the training sample size is small, the target detection performance suffers.
Shrinkage addresses the problem of overfitting that inevitably arises when a high-dimensional model is fit from
a small dataset. In place of the (overfit) sample covariance matrix, a linear combination of that covariance with a
fixed matrix is employed. The fixed matrix might be the identity, the diagonal elements of the sample covariance,
or some other underfit estimator. The idea is that the combination of an overfit with an underfit estimator
can lead to a well-fit estimator. The coefficient that does this combining, called the shrinkage parameter, is
generally estimated by some kind of cross-validation approach, but direct cross-validation can be computationally expensive.
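The linear combination described above can be sketched in a few lines of numpy. This is a generic illustration, not the paper's estimator: the function name is ours, and the target here is the identity scaled by the average variance, which is one of the common choices (the diagonal of the sample covariance is another).

```python
import numpy as np

def shrinkage_covariance(X, alpha):
    """Blend the sample covariance with a scaled-identity target.

    X     : (n, p) data matrix, one sample per row.
    alpha : shrinkage parameter in [0, 1]; alpha = 0 gives the pure
            (overfit) sample covariance, alpha = 1 the (underfit) target.
    """
    S = np.cov(X, rowvar=False)                # sample covariance (overfit)
    p = S.shape[0]
    T = (np.trace(S) / p) * np.eye(p)          # identity scaled by average variance
    return (1.0 - alpha) * S + alpha * T       # the shrinkage combination
```

For any alpha > 0 the result is strictly positive definite even when n < p leaves the sample covariance singular, which is precisely why shrinkage helps in the small-sample regime.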
This paper extends an approach suggested by Hoffbeck and Landgrebe, and presents efficient approximations
of the leave-one-out cross-validation (LOOC) estimate of the shrinkage parameter used in estimating the
covariance matrix from a limited sample of data.
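For orientation, the direct (expensive) computation that the paper's approximations are designed to avoid can be written as a brute-force grid search: for each candidate shrinkage value, hold out each sample in turn, fit the shrinkage estimator to the rest, and score the held-out sample under a Gaussian model. The sketch below assumes a scaled-identity target and an illustrative candidate grid; both choices, and the function names, are ours.

```python
import numpy as np

def loo_log_likelihood(X, alpha):
    """Average leave-one-out Gaussian log-likelihood at shrinkage alpha."""
    n, p = X.shape
    total = 0.0
    for i in range(n):
        Xi = np.delete(X, i, axis=0)           # training set without sample i
        mu = Xi.mean(axis=0)
        S = np.cov(Xi, rowvar=False)
        T = (np.trace(S) / p) * np.eye(p)      # scaled-identity target (assumed)
        R = (1.0 - alpha) * S + alpha * T      # shrinkage covariance estimate
        d = X[i] - mu
        _, logdet = np.linalg.slogdet(R)
        total += -0.5 * (logdet + d @ np.linalg.solve(R, d))
    return total / n

def looc_shrinkage(X, alphas=np.linspace(0.01, 0.99, 25)):
    """Pick the candidate alpha that maximizes the leave-one-out likelihood."""
    scores = [loo_log_likelihood(X, a) for a in alphas]
    return alphas[int(np.argmax(scores))]
```

Each candidate costs n covariance fits and n linear solves, i.e. O(n * p^3) per grid point; the efficient approximations the paper develops exist precisely to avoid repeating this work for every held-out sample.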