3 February 2015
A novel multi-view object recognition in complex background
Proceedings Volume 9255, XX International Symposium on High-Power Laser Systems and Applications 2014; 92553K (2015) https://doi.org/10.1117/12.2065292
Event: XX International Symposium on High-Power Laser Systems and Applications 2014, Chengdu, China
Abstract
Recognizing objects from arbitrary aspects is a highly challenging problem in computer vision, and most existing algorithms focus on a single, specific viewpoint. Hence, in this paper we present a novel recognition framework based on hierarchical representation, a part-based method, and learning, in order to recognize objects from different viewpoints. The learning stage evaluates the model's mistakes and feeds them back to the detector so that the same mistakes are avoided in the future. The principal idea is to extract intrinsic viewpoint-invariant features from unseen poses of an object, and then to exploit these shared appearance features to support recognition in combination with the improved multiple-view model. Compared with other recognition models, the proposed approach can efficiently tackle the multi-view problem and improve the recognition versatility of our system. For a quantitative evaluation, the algorithm has been tested on several benchmark datasets, such as Caltech 101 and PASCAL VOC 2010. The experimental results show that our approach recognizes objects more precisely and outperforms other single-view recognition methods.
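The mistake-feedback idea in the abstract, in which misclassified examples are returned to the detector so it does not repeat the same errors, can be sketched as a simple hard-example-style training loop. This is a hypothetical illustration, not the authors' actual detector: the toy perceptron-style update, the function names, and the re-weighting of mistakes are all assumptions made for the sketch.

```python
# Hypothetical sketch of a mistake-feedback loop: after each round,
# misclassified samples are fed back to the detector so that the next
# round emphasizes the examples it previously got wrong.

def train_round(weights, samples, lr=0.1):
    """One pass of a toy perceptron-style detector update.

    Each sample is (feature_vector, label) with label in {-1, +1}.
    Returns the updated weights and the list of mistakes made.
    """
    mistakes = []
    for x, label in samples:
        score = sum(w * f for w, f in zip(weights, x))
        pred = 1 if score > 0 else -1
        if pred != label:
            mistakes.append((x, label))
            # Feed the mistake back: nudge weights toward the true label.
            weights = [w + lr * label * f for w, f in zip(weights, x)]
    return weights, mistakes

def train_with_feedback(samples, dim, rounds=20):
    """Train until no mistakes remain or the round budget is spent."""
    weights = [0.0] * dim
    for _ in range(rounds):
        weights, mistakes = train_round(weights, samples)
        if not mistakes:
            break  # nothing left to feed back
        # Re-emphasize the hard examples in the next round.
        samples = samples + mistakes
    return weights
```

On linearly separable toy data this loop converges quickly, since every mistake both updates the weights and increases that example's weight in subsequent rounds.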
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yongxin Chang, Huapeng Yu, Zhiyong Xu, Chengyu Fu, Chunming Gao, "A novel multi-view object recognition in complex background", Proc. SPIE 9255, XX International Symposium on High-Power Laser Systems and Applications 2014, 92553K (3 February 2015); https://doi.org/10.1117/12.2065292
Proceedings paper, 6 pages