15 November 2017 Distance-based over-segmentation for single-frame RGB-D images
Proceedings Volume 10605, LIDAR Imaging Detection and Target Recognition 2017; 1060533 (2017) https://doi.org/10.1117/12.2295169
Event: LIDAR Imaging Detection and Target Recognition 2017, 2017, Changchun, China
Over-segmentation, also known as super-pixel segmentation, is a widely used preprocessing step in segmentation algorithms. An over-segmentation algorithm partitions an image into regions of perceptually similar pixels, but performs poorly on indoor scenes when it relies on color information alone. Fortunately, RGB-D images can improve performance on indoor scenes. To segment RGB-D images into super-pixels effectively, we propose a novel algorithm, DBOS (Distance-Based Over-Segmentation), which achieves full coverage of the image with super-pixels. DBOS fills the holes in depth images to fully exploit the depth information, and adopts a SLIC-like framework for fast execution. Additionally, depth features such as plane projection distance are extracted to compute the distance measure at the core of SLIC-like frameworks. Experiments on RGB-D images from the NYU Depth V2 dataset demonstrate that DBOS outperforms state-of-the-art methods in quality while running at comparable speed.
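The abstract describes extending a SLIC-like distance measure with depth features. A minimal sketch of what such a combined distance might look like is given below; the specific depth term, its weight, and the function name are illustrative assumptions, since the abstract does not give the exact form of the paper's plane-projection-distance feature.

```python
import math

def slic_like_distance(p, c, m_c=10.0, m_d=10.0, S=20.0):
    """Hypothetical SLIC-like distance between a pixel p and a cluster
    center c, each given as a (l, a, b, x, y, depth) tuple.

    Following the SLIC convention, the color distance is normalized by
    a compactness parameter m_c and the spatial distance by the grid
    interval S. The depth term and its weight m_d are assumptions: the
    paper's actual measure uses depth features such as plane projection
    distance, whose exact definition is not stated in the abstract.
    """
    # Euclidean distance in CIELAB color space
    dc = math.sqrt((p[0] - c[0])**2 + (p[1] - c[1])**2 + (p[2] - c[2])**2)
    # Euclidean distance in image (x, y) coordinates
    ds = math.sqrt((p[3] - c[3])**2 + (p[4] - c[4])**2)
    # Absolute depth difference (placeholder for the depth feature term)
    dd = abs(p[5] - c[5])
    return math.sqrt((dc / m_c)**2 + (ds / S)**2 + (dd / m_d)**2)
```

In a SLIC-like framework, this distance would be evaluated between each pixel and the cluster centers in its local neighborhood, with pixels assigned to the nearest center and centers updated iteratively.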
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zhuoqun Fang, Chengdong Wu, Dongyue Chen, Tong Jia, Xiaosheng Yu, Shihong Zhang, Erzhao Qi, "Distance-based over-segmentation for single-frame RGB-D images", Proc. SPIE 10605, LIDAR Imaging Detection and Target Recognition 2017, 1060533 (15 November 2017); https://doi.org/10.1117/12.2295169