6 June 2011 The perception problem and the impact on robotics and computer vision
Why is there a perception problem in robotics? Given that computer hardware speed has increased in step with Moore's law, why haven't there been commensurate advances in computer perception that would enable a robot to respond appropriately to its environment? Perhaps the algorithms used for perception are not well suited to the problem. The computer vision problem was assumed to be easy, and the supposedly more difficult challenges of problem solving and decision making were tackled first. As it turned out, problem solving and decision making were handled relatively easily by symbolic representations and predicate logic; perceiving the real world, however, proved far more difficult. What algorithms have been used for perception in robotics, and why do they sometimes fail to reproduce human-like behavior? How can we learn from biological systems, which, through evolution, have made great advances in solving the difficult problems of perception and classification?
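To make the contrast concrete, the kind of symbolic problem solving the abstract refers to can be captured with very little machinery. The following is a minimal sketch (not from the paper): facts and rules expressed as predicates, with naive forward chaining deriving new facts until a fixed point. The toy blocks-world predicates are illustrative assumptions, not the authors' representation.

```python
# Minimal sketch of predicate-logic problem solving: facts are predicate
# tuples, rules map a set of premises to a conclusion, and forward
# chaining applies rules until no new facts can be derived.

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules` (premises -> conclusion)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire the rule when all its premises are already known facts.
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical toy knowledge base: blocks-world-style predicates.
facts = {("on", "a", "b"), ("on", "b", "table")}
rules = [
    ({("on", "a", "b")}, ("above", "a", "b")),
    ({("above", "a", "b"), ("on", "b", "table")}, ("above", "a", "table")),
]

derived = forward_chain(facts, rules)
print(("above", "a", "table") in derived)  # True
```

A few lines of set manipulation suffice because the symbols arrive pre-segmented and noise-free; perception is hard precisely because extracting such clean predicates from raw sensor data is the unsolved part.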
© 2011 Society of Photo-Optical Instrumentation Engineers (SPIE).
T. D. Kelley, E. Avery, and S. M. McGhee "The perception problem and the impact on robotics and computer vision", Proc. SPIE 8064, Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2011, 80640B (6 June 2011);