We have studied the following problem: an image of a ground scene drifts over a mosaic sensor at a constant drift velocity; given the sensor output at one time, the task is to predict the sensor output at a nearby time. Such an algorithm allows us to compensate for drift of the background scene between frames of data from the sensor. The mathematical basis of our algorithm is an expansion of the continuous physical radiation intensity on the sensor in terms of the digitized sensor output; the process can be viewed as a generalized interpolation procedure that gives rise to a time-dependent spatial filter with one adjustable parameter. The usefulness of drift compensation in background suppression is evident; for example, conventional first- or second-difference plots formed from drift-compensated frames should yield very small residuals. If a fairly rapidly moving target is present in the scene, it is not suppressed to the same extent as the background, so the ratio of target to background is increased in a difference plot. We have simulated a series of experiments on background suppression using our drift-compensation algorithm, studying performance for various values of the drift velocity, target velocity, sensor noise, and the adjustable parameter of the algorithm. Our conclusion is that background suppression based on first and second differences is much greater with drift compensation than without; depending on the target velocity, drift velocity, and sensor noise that one must deal with, it appears possible to achieve roughly one to two orders of magnitude of improvement.
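The idea can be illustrated with a minimal sketch. The paper's generalized interpolation filter is not reproduced here; instead we stand in a simple bilinear interpolation for the subpixel shift, and a smooth synthetic background for the ground scene (the drift velocity, scene model, and image size below are all illustrative assumptions, not values from the text). The sketch predicts the next frame from the current one and compares the first-difference residual with and without drift compensation:

```python
import numpy as np

def shift_bilinear(img, dy, dx):
    """Shift an image by (dy, dx) pixels via bilinear interpolation --
    a simple stand-in for the paper's time-dependent spatial filter."""
    H, W = img.shape
    yy, xx = np.mgrid[0:H, 0:W].astype(float)
    ys, xs = yy - dy, xx - dx                      # sample points in the source frame
    y0 = np.clip(np.floor(ys).astype(int), 0, H - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, W - 2)
    fy = np.clip(ys - y0, 0.0, 1.0)                # fractional offsets
    fx = np.clip(xs - x0, 0.0, 1.0)
    return ((1 - fy) * (1 - fx) * img[y0, x0]
            + (1 - fy) * fx * img[y0, x0 + 1]
            + fy * (1 - fx) * img[y0 + 1, x0]
            + fy * fx * img[y0 + 1, x0 + 1])

# Smooth synthetic background drifting at a constant (subpixel) velocity.
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
VY, VX = 0.4, 0.7  # assumed drift velocity in pixels/frame

def background(t):
    return np.sin(0.2 * (yy - VY * t)) * np.cos(0.15 * (xx - VX * t))

f0, f1 = background(0), background(1)
pred = shift_bilinear(f0, VY, VX)  # drift-compensated prediction of frame 1

# First-difference residuals, measured away from the interpolation boundary.
interior = (slice(4, -4), slice(4, -4))
raw_residual  = np.abs((f1 - f0)[interior]).mean()    # no compensation
comp_residual = np.abs((f1 - pred)[interior]).mean()  # with compensation
print(f"suppression gain: {raw_residual / comp_residual:.1f}x")
```

With a smooth, uniformly drifting background the compensated first difference is far smaller than the raw one, which is the mechanism behind the improved target-to-background ratio: a target moving at a different velocity is not cancelled by the shift and survives in the difference plot.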