The next generation of multi-domain airborne platforms will provide military operators with unparalleled sensor data streams spanning video, radar, and other inputs. These expanded sensing capabilities will substantially increase access to critical, near-real-time surveillance. However, interpreting video feeds places a significant burden on intelligence operators, a demand that AI-based algorithms can help address by complementing the aerial footage processing tasks performed by full-motion video analysts. In this work, we introduce a new aerial pattern-of-life dataset and describe our latest algorithmic developments, which use deep learning to gain an understanding of a scene's patterns of life. This approach allows anomalies, i.e., outliers from standard patterns of life, to be identified using both supervised and unsupervised learning approaches. Herein, we describe our deep learning models and the corresponding microservices software architecture. Pattern-of-life and anomaly detection performance is measured through analysis of video from this new remotely piloted aerial system (RPAS) flight campaign dataset.
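As a minimal illustration of the unsupervised outlier-flagging idea described above, the sketch below fits a simple statistical model of "normal" behavior over synthetic per-track features and flags tracks that deviate strongly from it. The features (mean speed, heading variance), the threshold, and the data are hypothetical placeholders, not the paper's deep learning models or dataset.

```python
# Hypothetical sketch: flag tracks that deviate from the learned
# pattern of life. Synthetic data and a z-score test stand in for
# the paper's actual deep learning approach.
import numpy as np

rng = np.random.default_rng(0)

# Assumed per-track features: [mean speed, heading variance].
# Most tracks follow a common pattern of life; a few are outliers.
normal = rng.normal(loc=[10.0, 0.1], scale=[1.0, 0.02], size=(200, 2))
anomalous = rng.normal(loc=[30.0, 1.5], scale=[2.0, 0.3], size=(5, 2))
tracks = np.vstack([normal, anomalous])

# Model the observed feature distribution, then score each track by
# its largest per-feature z-score; large deviations are anomalies.
mu = tracks.mean(axis=0)
sigma = tracks.std(axis=0)
z = np.abs((tracks - mu) / sigma).max(axis=1)
is_anomaly = z > 3.0  # threshold chosen for illustration

print("flagged tracks:", int(is_anomaly.sum()))
```

In a real pipeline the hand-picked features and Gaussian assumption would be replaced by learned representations, but the structure is the same: model typical behavior, then score departures from it.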