Paper
Surveillance in long-distance turbulence-degraded videos
15 October 2013
Abstract
Surveillance in long-distance turbulence-degraded video is a difficult task because atmospheric turbulence causes blur and random shifts in the image, and these degradation effects become more significant as the imaging distance increases. This paper presents a method for surveillance in long-distance turbulence-degraded videos, based on new criteria for discriminating true from false object detections. We employ an adaptive thresholding procedure for background subtraction and implement new criteria, based on the temporal consistency of both shape and motion properties, for distinguishing true moving objects from false ones. Results show successful detection and tracking of moving objects in challenging video sequences that are significantly distorted by atmospheric turbulence, although more false alarms may occur as the imaging distance increases. The method is relatively efficient and has low complexity.
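The abstract describes two building blocks: adaptive thresholding for background subtraction and a temporal-consistency test for rejecting false detections. The sketch below is only an illustration of that general idea, not the authors' published algorithm; the function name and the parameters `alpha`, `k`, and `persistence` are assumptions chosen for the example. A per-pixel running mean and variance model the background, the detection threshold adapts to the local variance, and a pixel is kept as a true moving object only if it stays foreground for several consecutive frames.

```python
import numpy as np

def detect_moving_objects(frames, alpha=0.05, k=2.5, persistence=3):
    """Illustrative sketch (not the paper's algorithm): background
    subtraction with an adaptive per-pixel threshold (k * std), plus a
    temporal-consistency filter that keeps a detection only if the pixel
    is foreground in `persistence` consecutive frames."""
    bg_mean = frames[0].astype(float)           # running background mean
    bg_var = np.ones_like(bg_mean)              # running background variance
    streak = np.zeros(frames[0].shape, dtype=int)  # consecutive-foreground count
    detections = []
    for frame in frames[1:]:
        f = frame.astype(float)
        diff = np.abs(f - bg_mean)
        thresh = k * np.sqrt(bg_var)            # adaptive per-pixel threshold
        fg = diff > thresh
        # Update the background model only where no foreground was flagged.
        bg_mean = np.where(fg, bg_mean, (1 - alpha) * bg_mean + alpha * f)
        bg_var = np.where(fg, bg_var,
                          (1 - alpha) * bg_var + alpha * (f - bg_mean) ** 2)
        # Temporal consistency: require a persistent foreground streak.
        streak = np.where(fg, streak + 1, 0)
        detections.append(streak >= persistence)
    return detections
```

A turbulence-induced random shift flags a pixel for only a frame or two, so it never reaches the persistence threshold, while a genuinely moving object stays foreground long enough to be reported.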
© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yitzhak Yitzhaky, Eli Chen, and Oren Haik "Surveillance in long-distance turbulence-degraded videos", Proc. SPIE 8897, Electro-Optical Remote Sensing, Photonic Technologies, and Applications VII; and Military Applications in Hyperspectral Imaging and High Spatial Resolution Sensing, 889704 (15 October 2013); https://doi.org/10.1117/12.2028853
KEYWORDS: Video, Video surveillance, Surveillance, Atmospheric turbulence, Image restoration, Atmospheric sensing, Detection and tracking algorithms