Optimizing context-based stereo using genetic feature selection
26 September 1997
Abstract
This paper deals with the recovery of a scene from a pair of images, each acquired from a different viewpoint. The central problem is the identification of corresponding points in the two views. We use a feature-based approach to find corresponding points. Various types of features have been used previously; among these, Gabor features showed significant advantages in accuracy and in the complexity/accuracy trade-off. Accuracy is measured as the rate of correctly associated pixels. The matching process typically leaves a certain number of ambiguous positions, where the best match found is not the desired match. The main contribution of this paper is the application of a genetic algorithm to feature selection. The method exploits the previously established fact that the amount of ambiguity in the matching process can be measured quantitatively via statistics on the back-matching distances. In this way, the quality of a matching result can be assessed without reference disparity data (ground truth). The fitness function required for genetic feature optimization is defined from these back-matching statistics. The output of the genetic algorithm is an improved feature set that contains fewer features than the initial set, yet yields substantially improved accuracy. We show that the accuracy of the matching result is much improved by our genetic optimization approach, and we describe experiments illustrating the results.
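The scheme described in the abstract — evolving a binary feature-selection mask whose fitness is derived from back-matching consistency rather than ground-truth disparities — can be sketched roughly as follows. This is a minimal illustration under assumed details, not the authors' implementation: the synthetic data, population parameters, and the specific fitness definition (fraction of points whose left-to-right-to-left back-match returns to its origin) are all hypothetical stand-ins for the paper's Gabor-feature matcher and back-matching-distance statistics.

```python
import random

random.seed(0)

N_FEATURES = 12  # size of the initial feature set (hypothetical)
N_POINTS = 40    # synthetic "pixels" to be matched

# Synthetic feature responses for left- and right-image points. Only the
# first half of the features is made informative (copied between views);
# the rest is pure noise, so a good mask should prefer the first half.
left = [[random.random() for _ in range(N_FEATURES)] for _ in range(N_POINTS)]
right = [
    [v if f < N_FEATURES // 2 else random.random() for f, v in enumerate(row)]
    for row in left
]

def match(src_vec, targets, mask):
    """Nearest neighbour over the features selected by the mask."""
    def dist(t):
        return sum((src_vec[f] - t[f]) ** 2 for f in range(N_FEATURES) if mask[f])
    return min(range(len(targets)), key=lambda i: dist(targets[i]))

def fitness(mask):
    """Fraction of points whose left->right->left back-match returns home.

    This stands in for the paper's back-matching-distance statistics:
    no reference disparity data is consulted.
    """
    if not any(mask):
        return 0.0
    ok = 0
    for i in range(N_POINTS):
        j = match(left[i], right, mask)
        if match(right[j], left, mask) == i:
            ok += 1
    return ok / N_POINTS

def evolve(pop_size=20, generations=15, p_mut=0.05):
    """Simple elitist GA over binary feature-selection chromosomes."""
    pop = [[random.random() < 0.5 for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_FEATURES)      # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([not g if random.random() < p_mut else g
                             for g in child])           # bit-flip mutation
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print("selected features:", [f for f, g in enumerate(best) if g])
print("back-match fitness:", fitness(best))
```

The point of the sketch is the structure of the fitness function: it scores a feature subset purely by the internal consistency of forward and backward matches, which is what allows the genetic search to proceed without ground-truth disparities.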
© (1997) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Oliver Sidla, Wolfgang Poelzleitner, "Optimizing context-based stereo using genetic feature selection", Proc. SPIE 3208, Intelligent Robots and Computer Vision XVI: Algorithms, Techniques, Active Vision, and Materials Handling (26 September 1997); https://doi.org/10.1117/12.290294
Proceedings paper, 12 pages.