We develop a subspace-model-based algorithm for detecting anomalies in a hyperspectral image. The detector computes the Mahalanobis distance of a residual obtained from each pixel, whose spectral components are partitioned nonuniformly according to their groupings. The main background is removed from the pixel by predicting linear combinations of each subset of the partitioned pixel from linear combinations of the main background; the residual is defined as the difference between these two sets of linear combinations. The detector is designed for anomalies that are best detected in this residual. Experimental results on two real hyperspectral images and a simulated dataset show that the proposed detector outperforms conventional anomaly detectors.
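To make the overall detection principle concrete, the following minimal sketch illustrates the general idea of background removal followed by a Mahalanobis-distance score on the residual. It is not the paper's algorithm: it omits the nonuniform partitioning of spectral components into groups and assumes, purely for illustration, that the main background is spanned by the top principal components of the background pixels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N background pixels, each with B spectral bands.
N, B = 500, 50
X = rng.normal(size=(N, B))          # background pixels (rows)

# Assumed main-background subspace: top-k principal components (PCA basis).
k = 5
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
U = Vt[:k].T                         # (B, k) orthonormal background basis

def residual(pixel):
    """Remove the main background by least-squares projection onto U."""
    coeffs, *_ = np.linalg.lstsq(U, pixel, rcond=None)
    return pixel - U @ coeffs

# Mahalanobis statistics estimated from the background residuals.
R = np.apply_along_axis(residual, 1, X)
mu = R.mean(axis=0)
# Small diagonal loading: the residual covariance is rank-deficient
# (residuals lie in the orthogonal complement of the background subspace).
cov_inv = np.linalg.inv(np.cov(R, rowvar=False) + 1e-6 * np.eye(B))

def anomaly_score(pixel):
    """Mahalanobis distance of the pixel's residual."""
    d = residual(pixel) - mu
    return float(d @ cov_inv @ d)

# A pixel with an injected signature should score higher than background.
target = rng.normal(size=B) * 3.0
print(anomaly_score(X[0]), anomaly_score(X[0] + target))
```

In this sketch, any component of a pixel lying outside the assumed background subspace survives in the residual and inflates the Mahalanobis score, which is why anomalies that are poorly represented by the background are separable in the residual domain.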