Grain noise is one of the most common distortions in cinematographic film sequences and is caused by the crystal structure of the chemical coating of the film material. The color-sensitive crystals can be regarded as three separate populations, so the noise is uncorrelated across the three color channels and, likewise, uncorrelated between frames. Conversely, the signal (i.e., the projected view volume) is highly correlated both between channels and over time. We explore methods that exploit this constraint to reduce noise within an adaptive-filter framework based on the popular Widrow–Hoff least-mean-square (LMS) algorithm. Because a film sequence typically contains many moving elements, such as actors against a moving background, motion estimation is used to minimize the effect of motion-induced gray-level variations on the adaptive filter: an optical-flow technique extracts pixel motions prior to applying the noise reduction.
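The temporal filtering idea can be sketched as follows: each pixel of the noisy current frame is predicted from a small patch of the motion-compensated previous frame, and the prediction weights are adapted with the Widrow–Hoff LMS rule. Since the grain noise is uncorrelated between frames while the signal is correlated, the prediction retains the signal and suppresses the noise. This is a minimal illustrative sketch, not the authors' implementation: the function name, patch size, and the normalized step (added here for numerical stability; the plain Widrow–Hoff update omits the normalization) are all assumptions.

```python
import numpy as np

def lms_temporal_denoise(curr, prev_mc, mu=0.5, taps=3):
    """Sketch of temporal LMS denoising (illustrative, not the paper's code).

    curr    : noisy current frame (2-D array), used as the desired signal d
    prev_mc : motion-compensated previous frame, supplying the input patch x
    mu      : adaptation step size (0 < mu < 2 for the normalized update)
    taps    : side length of the square reference patch
    """
    h, w_img = curr.shape
    r = taps // 2
    w = np.zeros(taps * taps)               # adaptive filter weights
    out = np.zeros_like(curr, dtype=float)  # noise-reduced estimate
    for i in range(r, h - r):
        for j in range(r, w_img - r):
            # Input vector: patch from the motion-compensated previous frame.
            x = prev_mc[i - r:i + r + 1, j - r:j + r + 1].ravel()
            d = curr[i, j]      # desired response: noisy current pixel
            y = w @ x           # filter output (signal estimate)
            e = d - y           # prediction error
            # Widrow-Hoff update, normalized by the input power for stability.
            w += mu * e * x / (1e-8 + x @ x)
            out[i, j] = y
    return out
```

Because the weights adapt as the frame is scanned, the estimate is poor at the first pixels and improves as the filter converges; a practical scheme would run the adaptation over several frames before using the output.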