Vanishing points are elements of great interest in the computer vision field, since they are a main source of
information about the geometry of the scene and the projection process associated with the camera. They have
been studied and applied for decades in plane rectification, 3D reconstruction, and especially auto-calibration.
Nevertheless, the literature lacks accurate online solutions for multiple vanishing point estimation. Most
strategies focus on accuracy, relying on computationally demanding iterative procedures. We propose a
novel strategy for multiple vanishing point estimation that finds a trade-off between accuracy and efficiency,
being able to operate in real time on video sequences. This strategy exploits the temporal coherence of the
images in a sequence to reduce the computational load of the processing algorithms while maintaining a
high level of accuracy through an optimization process.
The key element of the approach is a robust scheme based on the MLESAC algorithm, which is used in a
similar way to the EM algorithm. This approach ensures robust and accurate estimations, since we combine
MLESAC with a novel error function based on the angular error between the vanishing point
and the image features. To increase the speed of the MLESAC algorithm, the selection of the minimal sample
and the image features. To increase the speed of the MLESAC algorithm, the selection of the minimal sample
sets is substituted by a random sampling step that takes into account temporal information to provide better
initializations. Besides, for the sake of flexibility, the proposed error function has been designed to work using
as image features indiscriminately gradient-pixels or line segments. Hence, we increase the range of applications
in which our approach can be used, according to the type of information that is available.
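To make the scheme concrete, the following is a minimal sketch of an MLESAC-style loop for a single vanishing point, scored with an angular error between each line segment and the direction from its midpoint to the candidate point. The exact error function, mixture parameters, and the temporal-sampling step of the proposed method are not reproduced here; the function names, the fixed inlier ratio `gamma`, and the Gaussian/uniform mixture are illustrative assumptions.

```python
import numpy as np

def vp_from_two_segments(s1, s2):
    # Each segment is ((x1, y1), (x2, y2)). Lines are obtained in
    # homogeneous coordinates via the cross product; their intersection
    # is the candidate vanishing point.
    def hline(seg):
        p = np.array([*seg[0], 1.0])
        q = np.array([*seg[1], 1.0])
        return np.cross(p, q)
    vp = np.cross(hline(s1), hline(s2))
    return vp / vp[2] if abs(vp[2]) > 1e-9 else vp

def angular_error(vp, seg):
    # Angle between the segment direction and the direction from the
    # segment midpoint to the vanishing point (a common choice; the
    # paper's exact formulation may differ).
    p, q = np.array(seg[0], float), np.array(seg[1], float)
    mid = 0.5 * (p + q)
    d = q - p
    to_vp = vp[:2] - mid  # assumes a finite vanishing point
    cosang = abs(np.dot(d, to_vp)) / (
        np.linalg.norm(d) * np.linalg.norm(to_vp) + 1e-12)
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def mlesac_vp(segments, n_iters=200, sigma=np.radians(2.0), gamma=0.5,
              rng=None):
    # Simplified MLESAC: each hypothesis is scored by the negative
    # log-likelihood of a Gaussian-inlier / uniform-outlier mixture
    # over the angular errors of all segments.
    rng = rng or np.random.default_rng(0)
    best_vp, best_cost = None, np.inf
    uniform = 1.0 / np.pi  # outlier density over angular errors
    for _ in range(n_iters):
        i, j = rng.choice(len(segments), size=2, replace=False)
        vp = vp_from_two_segments(segments[i], segments[j])
        if abs(vp[2]) < 1e-9:
            continue  # skip degenerate / at-infinity cases in this sketch
        errs = np.array([angular_error(vp, s) for s in segments])
        inlier = (gamma * np.exp(-0.5 * (errs / sigma) ** 2)
                  / (sigma * np.sqrt(2 * np.pi)))
        cost = -np.log(inlier + (1 - gamma) * uniform).sum()
        if cost < best_cost:
            best_vp, best_cost = vp, cost
    return best_vp
```

In the full method, the random pair selection above is biased by the vanishing points tracked in previous frames, which is what lets the system keep real-time rates without losing accuracy.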
The results show a system that delivers accurate real-time estimations of multiple vanishing points
for online processing, tested on moving-camera video sequences of structured scenarios, both indoors and outdoors,
such as rooms, corridors, facades, and roads.