There is increasing interest in image motion analysis, driven by application fields ranging from dynamic scene analysis to image encoding for transmission. In the context of scene analysis, image motion information is of crucial help for segmentation and qualitative interpretation. In the case of unstructured, inhomogeneous images, motion information is carried by the spatio-temporal variations of the light intensity function. The apparent velocity distribution inferred from this information is a dense vector field, called optical flow. Assuming spatial continuity of the field to be estimated, a local determination of optical flow is possible. The main difficulty lies in the handling of motion discontinuities. Usually, the localization of motion frontiers is treated as a binary problem, which leads to instabilities during the estimation process. We propose an incremental process that evaluates the optical flow from a sequence of images using temporal Kalman filtering. Our approach is based on a continuous handling of motion frontiers. The evolution model acts as a temporal low-pass filter on the estimated field. To cope with motion discontinuities, the filter is continuously adapted to local motion homogeneity via the covariance of the model noise. The result is a progressive cancellation of temporal regularization in the neighborhood of motion frontiers, allowing better convergence of the filter at such discontinuities. This continuous handling of motion frontiers makes the estimation process considerably more robust.
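The core mechanism described above can be illustrated with a minimal sketch: a per-pixel scalar Kalman filter with a random-walk evolution model, whose process-noise variance is inflated wherever the local flow is inhomogeneous, so that temporal regularization is progressively switched off near motion frontiers. This is not the paper's implementation; the function names, the variance-based homogeneity measure, and all parameter values (`q_min`, `q_max`, `beta`) are hypothetical choices for illustration.

```python
import numpy as np

def kalman_update(x_prev, P_prev, z, R, Q):
    """One scalar Kalman step with identity dynamics (x_k = x_{k-1} + w_k).

    x_prev, P_prev : previous flow estimate and its variance
    z, R           : new flow measurement and its noise variance
    Q              : process-noise variance (the adaptation knob)
    """
    # Predict: a random-walk evolution model keeps the previous
    # estimate and inflates its uncertainty by Q.
    x_pred = x_prev
    P_pred = P_prev + Q
    # Update: standard Kalman gain and correction.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

def adaptive_q(local_flow, q_min=1e-4, q_max=1.0, beta=5.0):
    """Map local motion inhomogeneity to a process-noise variance.

    Hypothetical weighting: the variance of flow values in a small
    neighborhood serves as the homogeneity measure. Homogeneous motion
    gives Q near q_min (strong temporal smoothing); a likely motion
    frontier gives Q near q_max, cancelling temporal regularization.
    """
    inhomogeneity = np.var(np.asarray(local_flow, dtype=float))
    w = 1.0 - np.exp(-beta * inhomogeneity)
    return q_min + w * (q_max - q_min)
```

With a homogeneous neighborhood, `adaptive_q` returns a variance near `q_min` and the filter averages measurements over time; across a frontier, the large Q lets each new measurement dominate, so the estimate can follow the discontinuity instead of blurring it.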