In this work, we propose a new algorithm for matching incoming video sequences in a simultaneous localization and
mapping (SLAM) system based on an RGB-D camera. The system estimates the camera trajectory in real time and
generates a 3D map of an indoor environment. The proposed algorithm is based on composite correlation filters with
adjustable training sets that depend on the appearance of the indoor environment as well as the relative position
and perspective of the camera with respect to environment components. The algorithm is scale-invariant because it
exploits the depth information provided by the RGB-D camera. The performance of the proposed algorithm is evaluated
in terms of accuracy, robustness, and processing time, and is compared with that of common feature-based matching
algorithms using the SURF descriptor.