Autonomous target tracking of UAVs based on low-power neural network hardware
22 May 2014
Abstract
Detecting and identifying targets in unmanned aerial vehicle (UAV) images and videos are challenging problems due to various types of image distortion. Moreover, the high processing overhead of existing image/video processing techniques and the limited computing resources available on UAVs force most processing tasks to be performed at the ground control station (GCS) in an off-line manner. To achieve fast and autonomous target identification on UAVs, it is thus imperative to investigate novel processing paradigms that can fulfill real-time processing requirements while fitting the size, weight, and power (SWaP) constrained environment. In this paper, we present a new autonomous target identification approach for UAVs, leveraging emerging neuromorphic hardware that is capable of massively parallel pattern recognition and demands only a limited level of power consumption. A proof-of-concept prototype was developed based on a micro-UAV platform (Parrot AR.Drone) and the CogniMem™ neural network chip, for processing video data acquired from the UAV camera on the fly. The aim of this study was to demonstrate the feasibility and potential of incorporating emerging neuromorphic hardware into next-generation UAVs, as well as its superior performance and power advantages toward real-time, autonomous target tracking.
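As a rough illustration of the kind of processing such a chip performs, the following minimal sketch implements prototype-based pattern matching: each stored "neuron" holds a reference feature vector with a category label, and an input is classified by its closest prototype (the chip does these comparisons in parallel in hardware; here they are sequential). The feature vectors, categories, distance threshold, and class/function names below are illustrative assumptions, not from the paper.

```python
def l1_distance(a, b):
    """Manhattan (L1) distance between two equal-length feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

class PrototypeClassifier:
    """Toy stand-in for a hardware prototype-matching classifier."""

    def __init__(self):
        self.neurons = []  # list of (prototype_vector, category) pairs

    def learn(self, vector, category):
        """Commit one neuron holding this prototype and its label."""
        self.neurons.append((list(vector), category))

    def classify(self, vector, max_distance=None):
        """Return the category of the nearest prototype, or None when no
        prototype lies within max_distance (an 'unknown target' response)."""
        if not self.neurons:
            return None
        dist, category = min(
            (l1_distance(vector, proto), cat) for proto, cat in self.neurons
        )
        if max_distance is not None and dist > max_distance:
            return None
        return category

# Tiny 4-element feature vectors for two hypothetical target classes.
clf = PrototypeClassifier()
clf.learn([10, 200, 30, 40], "vehicle")
clf.learn([250, 20, 10, 90], "building")
print(clf.classify([12, 198, 28, 41]))              # nearest: "vehicle"
print(clf.classify([0, 0, 0, 0], max_distance=50))  # too far from all: None
```

In a real on-board pipeline, the feature vectors would be extracted from camera frames, and the distance threshold would control how aggressively unfamiliar regions are rejected as non-targets.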
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Wei Yang, Zhanpeng Jin, Clare Thiem, Bryant Wysocki, Dan Shen, and Genshe Chen, "Autonomous target tracking of UAVs based on low-power neural network hardware", Proc. SPIE 9119, Machine Intelligence and Bio-inspired Computation: Theory and Applications VIII, 91190P (22 May 2014); https://doi.org/10.1117/12.2054049
Proceedings paper, 9 pages.

