Estimating the motion of moving targets from a moving platform is an extremely challenging problem in unmanned systems research. One common and often successful approach is to use optical flow to account for the ego-motion of the platform and then track the motion of surrounding objects. However, in the presence of video degradations such as noise, compression artifacts, and reduced frame rates, the performance of state-of-the-art optical flow algorithms diminishes greatly. We consider the effects of video degradation on two well-known optical flow datasets as well as on real-world video data. To highlight the need for optical flow algorithms that are robust to real-world conditions, we present both qualitative and quantitative results on these degraded data.
Josh Harguess, Chris Barngrover, and Amin Rahimi, "An analysis of optical flow on real and simulated data with degradations," Proc. SPIE 10199, Geospatial Informatics, Fusion, and Motion Video Analytics VII, 1019905 (Presented at SPIE Defense + Security: April 12, 2017; Published: 1 May 2017); https://doi.org/10.1117/12.2265850.
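The two-stage approach the abstract describes — first compensating the platform's ego-motion, then flagging the residual motion of surrounding objects — can be sketched as follows. This is a minimal illustration, not the paper's method: phase correlation stands in for a full dense optical flow algorithm, and ego-motion is assumed to be a pure global translation; the function names are hypothetical.

```python
import numpy as np

def estimate_ego_shift(f0, f1):
    """Estimate the global translation between two frames via phase
    correlation -- a cheap stand-in here for dense optical flow."""
    F0, F1 = np.fft.fft2(f0), np.fft.fft2(f1)
    cross = np.conj(F0) * F1
    cross /= np.abs(cross) + 1e-12           # keep only the phase difference
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = f0.shape
    if dy > h // 2:                          # map wrapped peaks to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def moving_target_mask(f0, f1, thresh=0.5):
    """Compensate the estimated ego-motion, then flag residual motion
    as candidate moving targets."""
    dy, dx = estimate_ego_shift(f0, f1)
    compensated = np.roll(f0, (dy, dx), axis=(0, 1))
    return np.abs(f1 - compensated) > thresh
```

In this sketch, degradations such as additive noise or frame-rate reduction would corrupt the correlation peak and the residual mask, which is the failure mode the paper quantifies for real optical flow algorithms.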