This paper continues the work of Becker et al.,¹ who analyzed the robustness of various background subtraction algorithms on fused video streams originating from visible and infrared cameras. To cover a broader range of background subtraction applications, we show the effects of camera vibration on a large set of background subtraction algorithms operating on fused infrared-visible video streams. The effectiveness is quantitatively analyzed on recorded data of a typical outdoor sequence with fine-grained, accurate image annotations. In this way, we identify approaches that can benefit from fused sensor signals under camera jitter. Finally, we draw conclusions on which fusion strategies should be preferred under such conditions.
Stefan Becker, Norbert Scherer-Negenborn, Pooja Thakkar, Wolfgang Hübner, and Michael Arens, "The effects of camera jitter for background subtraction algorithms on fused infrared-visible video streams," Proc. SPIE 9995, Optics and Photonics for Counterterrorism, Crime Fighting, and Defence XII, 99950I (presented at SPIE Security + Defence, 27 September 2016; published 24 October 2016); https://doi.org/10.1117/12.2239884.