Calibrating the kinematic parameters of a mobile robot is a time-consuming yet mandatory procedure, since mechanical tolerances and assembly procedures may introduce large inaccuracies with respect to the nominal parameters. A small calibration error can lead to severe inconsistencies in tasks that rely on precise positioning, such as localization, mapping, and navigation in general. For a wheeled mobile robot, calibration consists of estimating the odometry parameters that are required to convert wheel encoder ticks into a relative motion of the mobile base on a local plane, from which its position is then computed. To tackle this problem, we propose the use of the unscented Kalman filter to estimate the geometric kinematic parameters of the dynamic model of the mobile robot, while an external camera precisely measures the robot's position in the environment via a visual tag fixed to the robot. The inclusion of the visual tag in the model corresponds to a hand-to-eye problem, in which the fixed transform between the visual tag frame and the mobile robot frame is identified. Compared to the extended Kalman filter, the unscented version ensures robustness of the state and parameter estimation in strongly nonlinear systems, as well as robustness to the initial values of the parameters, because the state uncertainty is propagated through the unscented transform rather than through linearization. We validate this method on an internally developed mobile robot with four mecanum wheels, using a Kinect2 to track the robot's movement through a reference chessboard. The experiments show a great improvement in the computed odometry of the platform and a fast convergence of the estimated parameters; furthermore, the fixed transform of the visual tag, which corresponds to the hand-to-eye problem, is well identified.
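To make the parameter-estimation idea concrete, the following is a minimal sketch, not the authors' implementation, of an augmented-state unscented Kalman filter in Python: the robot pose is augmented with two odometry parameters (a wheel radius r and a lumped geometry term L), sigma points are propagated through a simplified mecanum kinematic model, and the pose measured by the external camera corrects both state and parameters. The function names (sigma_points, mecanum_motion, ukf_step), the chosen parameterization, and the assumption that the tag frame coincides with the robot frame (i.e., the hand-to-eye transform is omitted) are illustrative assumptions only.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Merwe scaled sigma points and weights for the unscented transform."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)               # matrix square root
    pts = np.vstack([mean, mean + S.T, mean - S.T])       # (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def mecanum_motion(x, wheel_speeds, dt):
    """Hypothetical simplified mecanum kinematics.
    Augmented state: [px, py, yaw, r, L], with wheel radius r and L = lx + ly."""
    px, py, yaw, r, L = x
    w1, w2, w3, w4 = wheel_speeds
    vx = r * ( w1 + w2 + w3 + w4) / 4.0
    vy = r * (-w1 + w2 + w3 - w4) / 4.0
    wz = r * (-w1 + w2 - w3 + w4) / (4.0 * L)
    px += (vx * np.cos(yaw) - vy * np.sin(yaw)) * dt
    py += (vx * np.sin(yaw) + vy * np.cos(yaw)) * dt
    yaw += wz * dt
    return np.array([px, py, yaw, r, L])                   # r, L evolve as constants

def ukf_step(mean, cov, wheel_speeds, dt, z, Q, R):
    """One UKF predict/update cycle; z = [px, py, yaw] is the camera pose measurement.
    (A full implementation would also wrap angle residuals; omitted for brevity.)"""
    # Predict: propagate sigma points through the nonlinear motion model
    pts, wm, wc = sigma_points(mean, cov)
    pred = np.array([mecanum_motion(p, wheel_speeds, dt) for p in pts])
    mean_p = wm @ pred
    cov_p = Q + sum(w * np.outer(d, d) for w, d in zip(wc, pred - mean_p))

    # Update: the measurement model simply selects the pose part of the state
    pts, wm, wc = sigma_points(mean_p, cov_p)
    zs = pts[:, :3]
    z_hat = wm @ zs
    S = R + sum(w * np.outer(d, d) for w, d in zip(wc, zs - z_hat))
    Pxz = sum(w * np.outer(dx, dz) for w, dx, dz in zip(wc, pts - mean_p, zs - z_hat))
    K = Pxz @ np.linalg.inv(S)
    return mean_p + K @ (z - z_hat), cov_p - K @ S @ K.T
```

Because the parameters r and L are part of the state, each camera observation of the tag pose tightens their estimates along with the pose, which is the mechanism behind the fast parameter convergence reported above; the unscented transform handles the nonlinearity of the motion model without computing Jacobians.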