14 February 2015 Autonomous landing of a helicopter UAV with a ground-based multisensory fusion system
Dianle Zhou, Zhiwei Zhong, Daibing Zhang, Lincheng Shen, Chengping Yan
Proceedings Volume 9445, Seventh International Conference on Machine Vision (ICMV 2014); 94451R (2015) https://doi.org/10.1117/12.2183270
Event: Seventh International Conference on Machine Vision (ICMV 2014), 2014, Milan, Italy
Abstract
This paper focuses on the vision-based autonomous landing problem for a helicopter unmanned aerial vehicle (UAV) and proposes a ground-based multisensory fusion system for autonomous landing. The system comprises an infrared camera, an ultra-wideband (UWB) radar that measures the distance between the UAV and the ground-based system, and a pan-tilt unit (PTU). An infrared camera is used so that the UAV target can be detected in all weather conditions. To avoid the complexity of computing the target's three-dimensional coordinates with stereovision or a single camera, the UWB radar range module supplies the visual depth information; the image-driven PTU tracks the UAV in real time, and the UAV's three-dimensional coordinates are then computed. Comparison against DGPS shows that the proposed approach is effective and robust.
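The abstract's core geometric idea, combining the PTU's pointing angles with the UWB radar's range to recover a 3D position, amounts to a spherical-to-Cartesian conversion. The sketch below is an illustrative assumption, not the paper's implementation: the frame convention (x forward at zero pan, y left, z up) and the function name are hypothetical.

```python
import math

def uav_position(range_m: float, pan_rad: float, tilt_rad: float):
    """Convert a PTU pan/tilt pointing direction plus a radar range
    measurement into Cartesian coordinates in the ground-station frame.

    Assumed convention: x forward (pan = 0), y left, z up, with the
    origin at the PTU/radar.
    """
    horizontal = range_m * math.cos(tilt_rad)   # projection onto the ground plane
    x = horizontal * math.cos(pan_rad)          # forward component
    y = horizontal * math.sin(pan_rad)          # lateral component
    z = range_m * math.sin(tilt_rad)            # height above the PTU
    return x, y, z
```

For example, a UAV detected at zero pan and tilt with a 10 m radar range sits 10 m straight ahead of the station, while a 90-degree tilt places the full range on the vertical axis.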
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Dianle Zhou, Zhiwei Zhong, Daibing Zhang, Lincheng Shen, and Chengping Yan "Autonomous landing of a helicopter UAV with a ground-based multisensory fusion system", Proc. SPIE 9445, Seventh International Conference on Machine Vision (ICMV 2014), 94451R (14 February 2015); https://doi.org/10.1117/12.2183270
CITATIONS
Cited by 6 scholarly publications.
KEYWORDS
Unmanned aerial vehicles
Infrared cameras
Infrared radiation
Radar
Computing systems
Infrared imaging
Cameras