15 November 2017 Coarse-to-fine deep neural network for fast pedestrian detection
Proceedings Volume 10605, LIDAR Imaging Detection and Target Recognition 2017; 106052H (2017) https://doi.org/10.1117/12.2293435
Event: LIDAR Imaging Detection and Target Recognition 2017, 2017, Changchun, China
Abstract
Pedestrian detection, a category of object detection, is a key issue in video surveillance and autonomous driving. Although recent object detection methods such as Fast/Faster R-CNN achieve excellent accuracy, they struggle to meet real-time requirements, which limits their application in real scenarios. This paper proposes a coarse-to-fine deep neural network for fast pedestrian detection: a two-stage approach that realizes a fine trade-off between accuracy and speed. In the coarse stage, a fast deep convolutional neural network is trained to generate most pedestrian candidates at the cost of a number of false positives; this detector covers the majority of pedestrian scales, sizes, and occlusions. A classification network then refines the candidates generated in the previous stage, easily excluding most false detections and producing the final pedestrian predictions, each with a bounding box and confidence score. Competitive accuracy is achieved on the INRIA dataset, and the method runs in real time, faster than previous leading methods. The results verify the effectiveness of the coarse-to-fine approach and show improved accuracy and stability.
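The two-stage control flow described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the coarse detector and the refining classifier are stand-in functions (assumptions), and only the pipeline structure — a high-recall proposal stage followed by a classification stage that filters false positives — follows the abstract.

```python
# Hypothetical sketch of the coarse-to-fine pedestrian detection pipeline.
# The network internals below are placeholders, not the paper's models;
# only the two-stage structure mirrors the described approach.

def coarse_detect(image):
    """Coarse stage: a fast detector emits many candidates,
    deliberately tolerating false positives for high recall."""
    # Stand-in output: (bounding box (x, y, w, h), coarse confidence) pairs.
    return [((10, 20, 50, 120), 0.9),    # likely pedestrian
            ((200, 30, 40, 100), 0.6),   # borderline candidate
            ((5, 5, 30, 30), 0.3)]       # probable false positive

def refine_score(image, box):
    """Fine stage: a classification network re-scores one candidate.
    Stand-in heuristic: tall, narrow boxes look more pedestrian-like."""
    x, y, w, h = box
    return 0.95 if h > 2 * w else 0.1

def detect_pedestrians(image, threshold=0.5):
    """Run both stages; keep only candidates the classifier accepts,
    returning (bounding box, confidence score) predictions."""
    detections = []
    for box, _coarse_conf in coarse_detect(image):
        score = refine_score(image, box)
        if score >= threshold:
            detections.append((box, score))
    return detections
```

In this sketch the coarse stage returns three candidates and the refinement stage rejects the square false positive, so `detect_pedestrians` keeps two boxes with their refined confidence scores.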
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yaobin Li, Xinmei Yang, Lijun Cao, "Coarse-to-fine deep neural network for fast pedestrian detection", Proc. SPIE 10605, LIDAR Imaging Detection and Target Recognition 2017, 106052H (15 November 2017); https://doi.org/10.1117/12.2293435
PROCEEDINGS
7 PAGES