Images acquired by RGB cameras mounted on Unmanned Aerial Vehicles (UAVs) can be particularly useful for detecting crowds in urban areas when restrictive conditions are imposed for the sake of public safety or health, such as during the COVID-19 pandemic. Together with the acquired images, suitable pattern recognition techniques have to be applied to extract useful information. In this framework, features capturing the semantically rich information contained in Very High Resolution (VHR) images must be computed. In particular, Deep Neural Networks (DNNs) have recently been proven able to extract useful features from data [1]. Moreover, in a transfer learning approach, a DNN pre-trained on one data set can be used to extract suitable features, named deep-features, from another data set belonging to a different application domain [1], [2]. Here, a transfer learning technique is presented that produces change maps from VHR images, detecting how people gathering evolves over time. It is based on deep-features computed using some of the pre-trained convolutional layers of AlexNet. The proposed methodology has been tested on a data set composed of several synthetic VHR images simulating a crowd gathering in a park, as they would be acquired by an RGB camera on a UAV flying at a height of 10 meters above the ground. The experimental results show that the proposed technique is able to efficiently detect changes due to people entering or leaving the scene, with a low computational cost and in a near-real-time operative mode.
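The abstract describes deep-features extracted from pre-trained AlexNet convolutional layers and compared across acquisitions to build a change map. The following is a minimal illustrative sketch of that idea in PyTorch/torchvision; the specific subset of layers, input size, Euclidean distance over channels, fixed threshold, and file names are assumptions for illustration, not the configuration reported in the paper.

```python
# Hedged sketch: deep-feature change detection with a pre-trained AlexNet.
# Layer subset, image size, distance metric and threshold are assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load AlexNet pre-trained on ImageNet and keep only an assumed subset of
# its convolutional layers as a frozen feature extractor.
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
feature_extractor = alexnet.features[:6].eval()  # assumption: first conv blocks
for p in feature_extractor.parameters():
    p.requires_grad = False

preprocess = T.Compose([
    T.Resize((224, 224)),  # assumed input size
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_features(path: str) -> torch.Tensor:
    """Return the deep-feature tensor (C x H x W) of one RGB image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return feature_extractor(img).squeeze(0)

def change_map(path_t0: str, path_t1: str, threshold: float = 0.5) -> torch.Tensor:
    """Binary change map from the per-location distance between deep features."""
    f0, f1 = deep_features(path_t0), deep_features(path_t1)
    distance = torch.norm(f1 - f0, dim=0)          # L2 distance over channels
    distance = distance / (distance.max() + 1e-8)  # normalize to [0, 1]
    return (distance > threshold).float()          # 1 = change (people entering/leaving)

# Example usage on two co-registered UAV frames of the same scene
# (hypothetical file names):
# cmap = change_map("park_t0.png", "park_t1.png")
```

Because the convolutional layers are used frozen, no training is required on the UAV data set, which is consistent with the low computational cost and near-real-time operation claimed in the abstract.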