The number of reported incidents caused by UAVs, intentional as well as accidental, is rising. To avoid such incidents in the future, it is essential to be able to detect UAVs. However, not every UAV is a potential threat, and therefore a UAV not only has to be detected, but also classified or identified. 360° scanning LiDAR systems can be deployed for the detection and tracking of (micro) UAVs at ranges up to 50 m. Unfortunately, the verification and classification of the detected objects is not possible in most cases, due to the low resolution of this type of sensor. In this paper, we propose the automatic alignment of an additional sensor (mounted on a pan-tilt head) for the identification of the detected objects. The classification sensor is directed by the tracking results of the panoramic LiDAR sensor. If the alignable sensor is an RGB or infrared camera, the identification of the objects can be performed with state-of-the-art image processing algorithms. If a higher-resolution LiDAR sensor is used for this task, dedicated algorithms have to be developed and implemented; for example, the classification could be realized by a 3D model matching method. After the handoff of the object position from the 360° LiDAR to the verification sensor, this second system can be used for further tracking of the object, e.g., if the trajectory of the UAV leaves the field of view of the primary LiDAR system. The paper shows first results of this multi-sensor classification approach.
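The handoff described above requires converting the tracked 3D position from the panoramic LiDAR into pointing commands for the pan-tilt head. A minimal sketch of that conversion, assuming for illustration that the pan-tilt head is co-located with the 360° LiDAR (in practice a lever-arm/extrinsic calibration between the two sensors would be applied first):

```python
import math

def pan_tilt_angles(x, y, z):
    """Convert a tracked 3D position (metres, sensor frame) into
    pan/tilt angles (degrees) for pointing a secondary sensor.

    Illustrative assumption: pan-tilt head co-located and axis-aligned
    with the 360-degree LiDAR; no extrinsic offset is modelled here.
    """
    # pan: azimuth around the vertical axis, measured from the x-axis
    pan = math.degrees(math.atan2(y, x))
    # tilt: elevation above the horizontal plane
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

# Example: a target 10 m ahead and 10 m up lies at 45 degrees elevation.
print(pan_tilt_angles(10.0, 0.0, 10.0))  # → (0.0, 45.0)
```

The function names and the co-location assumption are ours, not from the paper; a real system would also add timing compensation, since the target keeps moving while the head slews.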
The number of reported incidents caused by UAVs, intentional as well as accidental, is rising. To avoid such incidents in the future, it is essential to be able to detect UAVs. LiDAR systems are well known to be adequate sensors for object detection and tracking. In contrast to the detection of pedestrians or cars in traffic scenarios, the challenges of UAV detection lie in the small size, the various shapes and materials, and the high speed and volatility of UAV movement. Due to the small size of the object and the limited sensor resolution, a UAV can hardly be detected in a single frame; it rather has to be spotted by its motion in the scene. In this paper, we present a fast approach for the tracking and detection of (low-)flying small objects such as commercial mini/micro UAVs. Unlike the typical track-after-detect sequence, we start by looking for clues, i.e., minor 3D details in the 360° LiDAR scans of the scene. If these clues are detectable in consecutive scans (possibly including a movement), the probability of an actual UAV detection increases. For the algorithm development and a performance analysis, we collected data during a field trial with several different UAV types and sensor types (acoustic, radar, EO/IR, LiDAR). The results show that UAVs can be detected by the proposed methods, as long as the movements of the UAVs correspond to the LiDAR sensor's capabilities in scanning performance, range and resolution. The paper presents first results of this analysis based on the field-trial data.
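The clue-before-detect idea described above can be sketched as follows: small point clusters ("clues") extracted per scan are only promoted to detections once they are re-found in several consecutive scans with plausible motion between them. This is a toy illustration with hypothetical thresholds, not the paper's actual implementation:

```python
import math

def confirm_clues(frames, max_step=2.0, min_hits=3):
    """Toy confirmation of small-object clues across consecutive 360° scans.

    frames: list of per-scan clue centroids, each a list of (x, y, z) in metres.
    A clue becomes a detection once it is re-found within max_step metres
    (i.e., consistent with plausible UAV motion between scans) in min_hits
    consecutive frames. Both thresholds are illustrative assumptions.
    """
    tracks = []      # tentative clue tracks: {'pos': (x, y, z), 'hits': int}
    detections = []  # confirmed positions
    for clues in frames:
        survivors = []
        for c in clues:
            # associate the clue with the nearest existing track, if close enough
            best = min(tracks, key=lambda t: math.dist(t['pos'], c), default=None)
            if best is not None and math.dist(best['pos'], c) <= max_step:
                tracks.remove(best)
                best['pos'], best['hits'] = c, best['hits'] + 1
                survivors.append(best)
                if best['hits'] == min_hits:
                    detections.append(c)
            else:
                survivors.append({'pos': c, 'hits': 1})
        tracks = survivors  # unmatched old tracks are dropped
    return detections

# A clue drifting 1 m per scan is confirmed on its third consecutive hit.
scans = [[(0.0, 0.0, 5.0)], [(1.0, 0.0, 5.0)], [(2.0, 0.0, 5.0)]]
print(confirm_clues(scans))  # → [(2.0, 0.0, 5.0)]
```

The nearest-neighbour association and the fixed distance gate are deliberate simplifications; the paper's point that the UAV's motion must match the sensor's scan rate and resolution shows up here directly, since a clue moving more than `max_step` between scans is never confirmed.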