While interest in image and data fusion has grown significantly, its use in applications has been limited by the requirement that sensory data acquired from multiple sensors be accurately registered both spatially and temporally. We present an autonomous stereo-based technique for spatially registering projective and range images acquired from externally uncalibrated sensors. Corresponding point features are automatically extracted from range/projective image pairs. These correspondences are then used to estimate the relative sensor pose by minimizing the difference between scanned and predicted stereoscopic range measurements. Experimental results obtained using a LIDAR scanner and a color camera are presented to demonstrate the accuracy of the stereo-based registration procedure.
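The pose-estimation step can be illustrated with a simplified sketch. The abstract describes minimizing the difference between scanned and predicted stereoscopic range measurements; as a stand-in for that objective (this is not the authors' implementation), the snippet below uses the closed-form Kabsch/SVD algorithm to recover the rigid transform aligning corresponding 3D points, which minimizes the sum of squared point residuals. All names here are hypothetical.

```python
import numpy as np

def estimate_relative_pose(src, dst):
    """Hypothetical sketch: closed-form rigid alignment (Kabsch algorithm).

    Finds rotation R and translation t minimizing ||R @ src_i + t - dst_i||^2
    over corresponding 3D points, a simplified stand-in for minimizing the
    difference between scanned and predicted range measurements.
    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Synthetic check: recover a known pose from noiseless correspondences.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
rng = np.random.default_rng(0)
pts = rng.standard_normal((20, 3))
R_est, t_est = estimate_relative_pose(pts, (R_true @ pts.T).T + t_true)
```

In the paper's actual pipeline the residual is formed in range space rather than point space, and the correspondences come from automatically extracted range/projective features, but the least-squares structure of the problem is the same.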