8 December 2011 Self-localization algorithm for mobile robot based on the omni-directional sensor
Proceedings Volume 8003, MIPPR 2011: Automatic Target Recognition and Image Analysis; 80030U (2011); doi: 10.1117/12.901919
Event: Seventh International Symposium on Multispectral Image Processing and Pattern Recognition (MIPPR2011), 2011, Guilin, China
Abstract
In this paper, a simple but effective method for robot self-localization is presented. A spatial neighborhood constraint is incorporated into the preprocessing stage of image segmentation. A closed loop of image rectification and Hough detection then finds the boundary lines and corners. Using the actual dimensions of the surrounding environment together with the white lines and corners detected in the previous step, the robot localizes itself by two methods: one uses the two boundary lines, and the other uses triangulation. Finally, the two estimates are combined with a weight value to produce the final self-localization. The method is tested on actual image sequences from the robot, which can be placed anywhere in the environment. The final self-localization results on widely varying images with significant lighting change and noise are promising.
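The two-estimate fusion described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the two detected corners have known world positions, that the bearing angles toward them are available in the world frame, and that the line-based estimate is already computed; all function names are hypothetical.

```python
import math

def triangulate(p1, p2, theta1, theta2):
    """Hypothetical triangulation sketch: estimate the robot position
    from two corners at known world positions p1, p2 and the
    world-frame bearing angles theta1, theta2 (radians) from the
    robot toward each corner."""
    u1 = (math.cos(theta1), math.sin(theta1))
    u2 = (math.cos(theta2), math.sin(theta2))
    # The robot R satisfies p_i = R + r_i * u_i for unknown ranges r_i,
    # so r1*u1 - r2*u2 = p1 - p2: a 2x2 linear system (Cramer's rule).
    det = -u1[0] * u2[1] + u1[1] * u2[0]
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; triangulation is degenerate")
    bx, by = p1[0] - p2[0], p1[1] - p2[1]
    r1 = (-bx * u2[1] + by * u2[0]) / det
    return (p1[0] - r1 * u1[0], p1[1] - r1 * u1[1])

def fuse(est_lines, est_tri, w):
    """Weighted combination of the line-based and triangulation
    estimates, as the abstract describes (weight w in [0, 1])."""
    return (w * est_lines[0] + (1 - w) * est_tri[0],
            w * est_lines[1] + (1 - w) * est_tri[1])

# Example: robot at (2, 2), corners at (0, 0) and (4, 0).
pos_tri = triangulate((0, 0), (4, 0),
                      math.atan2(-2, -2),   # bearing toward (0, 0)
                      math.atan2(-2, 2))    # bearing toward (4, 0)
pos = fuse((2.1, 1.9), pos_tri, 0.5)
```

How the weight is chosen (e.g. fixed, or driven by detection confidence) is not specified in the abstract; a confidence-driven weight would be one natural choice.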
© (2011) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Tongwei Lu, Hui Yan, Dan Zhou, "Self-localization algorithm for mobile robot based on the omni-directional sensor", Proc. SPIE 8003, MIPPR 2011: Automatic Target Recognition and Image Analysis, 80030U (8 December 2011); https://doi.org/10.1117/12.901919
Proceedings paper, 6 pages
KEYWORDS: Image segmentation, Sensors, RGB color model, Corner detection, Environmental sensing, Mobile robots, Distortion
