Adaptive clutter suppression and detection filters provide an IRST surveillance system with increased target-to-background discrimination capability by dynamically tracking and suppressing the background clutter environment. Simulation results with real IRST data are presented for both single-pixel and 3×3-pixel target point spread functions (PSFs), for adaptive temporal/spatial/spectral filters of various sizes and orders of application, and for suboptimal Markov, sparse-covariance, and LMS spatial filters. The filters are then traded off in terms of performance and implementation complexity. Topics covered include the normalized cross-correlation technique employed for frame-to-frame registration prior to temporal and multispectral filtering, and a method for counteracting the often-encountered ill-conditioning of the clutter covariance matrix. Additionally, for the case of correlated background clutter, an expression is presented for the normalizer constant-false-alarm-rate (CFAR) loss as a function of the desired false alarm rate and the expected mean and variance of the adaptive threshold estimator. In particular, when the clutter is distributed as a first-order Markov process, the normalizer loss can be evaluated as a function of the specified false alarm rate, the background pixel-to-pixel azimuth and elevation correlation coefficients, and the normalizer shape and size. Techniques for reducing the normalizer CFAR loss are presented, including decorrelating the data in the normalizer window, applying better detection filters, and changing the normalizer shape and size. These techniques are analyzed in terms of performance and implementation complexity.
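The normalized cross-correlation registration mentioned above can be illustrated with a minimal integer-pixel sketch. The function names and the exhaustive shift search below are illustrative assumptions, not the paper's implementation, which may use subpixel interpolation and a frequency-domain correlator:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return float(np.sum(a * b) / denom) if denom > 0 else 0.0

def register(ref, frame, max_shift=3):
    """Exhaustive integer-pixel search for the (row, col) shift of `frame`
    relative to `ref` that maximizes the normalized cross-correlation."""
    m = max_shift
    core = ref[m:-m, m:-m]          # interior window of the reference frame
    h, w = core.shape
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            patch = frame[m + dy:m + dy + h, m + dx:m + dx + w]
            score = ncc(core, patch)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift
```

In practice the recovered shift would be applied to resample each incoming frame onto the reference grid before temporal or multispectral differencing, so that residual line-of-sight jitter does not masquerade as clutter leakage.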
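The abstract does not specify the paper's remedy for the ill-conditioned clutter covariance matrix; one standard remedy, shown here purely as an assumed illustration, is diagonal loading, which adds a small multiple of the average diagonal power before inversion:

```python
import numpy as np

def load_covariance(R, eps=1e-3):
    """Diagonal loading: add eps times the average diagonal power to a
    covariance estimate so its inverse is numerically stable."""
    n = R.shape[0]
    return R + (eps * np.trace(R) / n) * np.eye(n)

# A sample covariance built from fewer samples than dimensions is
# rank-deficient, so filter weights computed from its inverse blow up;
# loading bounds the smallest eigenvalue away from zero.
rng = np.random.default_rng(1)
X = rng.standard_normal((16, 4))      # 16 channels, only 4 clutter samples
R = X @ X.T / 4                       # rank-4, hence singular, estimate
R_loaded = load_covariance(R)
```

The cost is a small bias in the adapted weights, traded against a well-conditioned inversion; the loading level `eps` is a tuning assumption here.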
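The first-order Markov clutter model, with separate azimuth and elevation pixel-to-pixel correlation coefficients, can be simulated as a separable AR(1) field. This is a sketch of the model stated in the abstract, not the paper's simulation code:

```python
import numpy as np

def markov_clutter(rows, cols, rho_az, rho_el, rng):
    """Separable first-order Markov clutter field: lag-1 correlation rho_el
    between adjacent rows (elevation) and rho_az between adjacent columns
    (azimuth), with unit variance per pixel."""
    x = rng.standard_normal((rows, cols))
    # AR(1) recursion down the rows (elevation direction)
    for i in range(1, rows):
        x[i] = rho_el * x[i - 1] + np.sqrt(1 - rho_el**2) * x[i]
    # AR(1) recursion across the columns (azimuth direction)
    for j in range(1, cols):
        x[:, j] = rho_az * x[:, j - 1] + np.sqrt(1 - rho_az**2) * x[:, j]
    return x
```

Such a field makes the normalizer CFAR loss concrete: because adjacent pixels in the normalizer window are correlated, the window contains fewer effective independent samples, inflating the variance of the adaptive threshold estimate relative to the white-clutter case.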