This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. Traditional calibration methods cannot be used directly, because depth and thermal infrared cameras are not sensitive to a color calibration board. Therefore, we design a dedicated calibration board to complete the joint calibration of the depth and thermal infrared cameras. Meanwhile, a local feature descriptor for thermal and depth images is proposed. A belief propagation matching algorithm is also investigated, based on spatial affine transformation matching and local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method accurately constructs the 3D temperature distribution model and is highly robust.
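The final step, fusing the 3D point cloud with the 2D thermal image, can be sketched as projecting each 3D point into the thermal camera and sampling a temperature there. This is a minimal illustration, not the paper's implementation; the intrinsics `K`, extrinsics `R`, `t`, and nearest-neighbor sampling are all assumptions.

```python
import numpy as np

def assign_temperatures(points, K, R, t, thermal_img):
    """Project 3D points into the thermal camera and sample temperatures.

    points:      (N, 3) array of 3D points from the depth camera.
    K:           3x3 thermal-camera intrinsic matrix (assumption).
    R, t:        extrinsics from the joint calibration (assumption).
    thermal_img: 2D array of per-pixel temperature values.
    """
    cam = (R @ points.T).T + t            # points in the thermal-camera frame
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]           # perspective division
    h, w = thermal_img.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return thermal_img[v, u]              # one temperature sample per point

# Toy example: identity camera, point (1, 2, 1) maps to pixel (u=1, v=2)
temps = assign_temperatures(np.array([[1.0, 2.0, 1.0]]),
                            np.eye(3), np.eye(3), np.zeros(3),
                            np.arange(16, dtype=float).reshape(4, 4))
```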
This paper studies the projector calibration algorithm in omnidirectional structured light (OSL). Traditional projector calibration methods cannot be used directly in an omnidirectional system, because in our setup the projector is perpendicular to the omnidirectional camera. Therefore, we design a complete algorithm for the calibration of omnidirectional structured light. Firstly, a calibration plane is prepared: a checkerboard calibration board is placed on it, and a checkerboard pattern is projected onto it by the projector. Secondly, the equation of the calibration plane is computed from the extrinsic parameters of the calibration board. Thirdly, the corners of the projected pattern are detected in the image captured by the omnidirectional camera. Lastly, the 3D position of each projected corner is obtained by ray-plane intersection. We developed a complete OSL calibration toolbox in Matlab based on the proposed method. The proposed method and toolbox have been shown to be accurate and easy to use for projector calibration.
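The last step above, recovering a 3D point for each detected corner by intersecting the camera's viewing ray with the calibration plane, reduces to a few lines of linear algebra. The sketch below assumes the camera center is at the origin and the plane is given in the form ax + by + cz + d = 0; the function name and interface are illustrative, not the toolbox's API.

```python
import numpy as np

def ray_plane_intersection(ray_dir, plane):
    """Intersect a viewing ray through the origin with a plane.

    ray_dir: 3-vector, direction of the back-projected ray
             (camera center assumed at the origin).
    plane:   4-vector (a, b, c, d) for the plane a*x + b*y + c*z + d = 0,
             as recovered from the calibration board's extrinsics.
    Returns the 3D intersection point t * ray_dir.
    """
    n, d = plane[:3], plane[3]
    t = -d / np.dot(n, ray_dir)   # solve n . (t * ray_dir) + d = 0 for t
    return t * ray_dir

# Toy example: a ray along (0, 0, 1) meets the plane z = 2 (z - 2 = 0)
p = ray_plane_intersection(np.array([0.0, 0.0, 1.0]),
                           np.array([0.0, 0.0, 1.0, -2.0]))
```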
Over-segmentation into super-pixels is a widely used preprocessing step in segmentation algorithms. An over-segmentation algorithm partitions an image into regions of perceptually similar pixels, but performs poorly when given only color images of indoor environments. Fortunately, RGB-D images can improve performance on indoor scenes. In order to segment RGB-D images into super-pixels effectively, we propose a novel algorithm, DBOS (Distance-Based Over-Segmentation), which achieves full coverage of the image with super-pixels. DBOS fills the holes in depth images to fully utilize the depth information, and applies a SLIC-like framework for fast running. Additionally, depth features such as plane projection distance are extracted to compute the distance measure at the core of SLIC-like frameworks. Experiments on RGB-D images from the NYU Depth V2 dataset demonstrate that DBOS outperforms state-of-the-art methods in quality while running at comparable speed.
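The distance measure at the core of a SLIC-like framework combines color and spatial terms; DBOS additionally folds in a depth-based term. The sketch below shows one plausible form of such a combined distance; the weights `m` and `wd` and the exact combination rule are assumptions, not the paper's definition.

```python
import numpy as np

def dbos_like_distance(c1, c2, xy1, xy2, d1, d2, S, m, wd):
    """SLIC-style combined distance with an extra depth term.

    c1, c2:   Lab color vectors of pixel and cluster center.
    xy1, xy2: pixel coordinates of pixel and cluster center.
    d1, d2:   depth features (e.g. plane projection distance).
    S:        sampling interval (grid step) between cluster centers.
    m, wd:    compactness and depth weights (illustrative assumptions).
    """
    dc = np.linalg.norm(np.asarray(c1) - np.asarray(c2))    # color distance
    ds = np.linalg.norm(np.asarray(xy1) - np.asarray(xy2))  # spatial distance
    dd = abs(d1 - d2)                                       # depth distance
    return np.sqrt(dc**2 + (m * ds / S)**2 + (wd * dd)**2)

# Toy example: identical color and depth, spatial offset (3, 4) with S = 5
dist = dbos_like_distance([0, 0, 0], [0, 0, 0], [0, 0], [3, 4],
                          0.0, 0.0, S=5.0, m=10.0, wd=0.0)
```

Each pixel is assigned to the nearest cluster center under this distance within a 2S x 2S search window, exactly as in SLIC, so the depth term steers super-pixel boundaries toward depth discontinuities.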