Presentation + Paper
Dynamic vision sensor datasets in the maritime domain
12 September 2021
Abstract
Event cameras utilize novel imaging sensors patterned after visual pathways in the brain that are responsive to low-contrast, transient events. Specifically, the pixels of dynamic vision sensors (DVS) react independently and asynchronously to changes in light intensity, creating a stream of time-stamped events encoding each pixel's (x, y) location in the sensor array and the sign of the brightness change. In contrast with conventional cameras that sample every pixel at a fixed rate, DVS pixels produce output only when the change in intensity surpasses a set threshold, which reduces power consumption in scenes with relatively little motion. Furthermore, compared to conventional CMOS imaging pixels, DVS pixels have extremely high dynamic range, low latency, and low motion blur. Taken together, these characteristics make event cameras uniquely qualified for persistent surveillance. In particular, we have been investigating their use in port surveillance applications. Such an application of DVS presents the need for automated pattern recognition and object tracking algorithms that can process event data. Because the output is fundamentally different from that of conventional frame-based cameras, traditional machine learning methods for computer vision cannot be directly applied. Anticipating this need, this work details data collection and collation efforts to facilitate the development of object detection and tracking algorithms in this modality. We have assembled a maritime dataset capturing several classes of moving objects, including sailboats, motorboats, and large ships, as well as incidentally captured objects. The data was collected with lenses of various focal lengths and aperture settings to provide variability and avoid unwanted bias toward specific sensor parameters. In addition, the data was recorded with the camera in both static and dynamic states; these states mimic the motion of potential deployment platforms and help us understand how sensor movement affects the algorithms being developed for automated ship detection and tracking. We describe the data captured, the effects of hardware settings and lenses, and how lighting conditions and sensor movement contributed to the quality of the event data recorded. Finally, we detail future data collection efforts.
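To make the event encoding concrete, the sketch below simulates the standard DVS pixel model on a stack of intensity frames: a pixel emits a signed, time-stamped (t, x, y, polarity) event whenever its log intensity has changed by more than a contrast threshold since the last event it fired. This is a minimal frame-based approximation for illustration only; the function name, threshold default, and frame-stack input are assumptions, not part of the dataset described in the paper or any specific camera's API.

```python
import numpy as np

def simulate_dvs_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Approximate DVS output from a stack of intensity frames.

    frames:     (T, H, W) array of non-negative pixel intensities.
    timestamps: length-T sequence of frame times (seconds).
    threshold:  contrast threshold C; a pixel fires when its log
                intensity changes by more than C since its last event
                (illustrative default, not a calibrated camera value).

    Returns a list of (t, x, y, polarity) tuples, polarity in {+1, -1}.
    """
    log_frames = np.log(frames.astype(np.float64) + eps)
    ref = log_frames[0].copy()  # per-pixel reference log intensity
    events = []
    for t, log_img in zip(timestamps[1:], log_frames[1:]):
        diff = log_img - ref
        # Positive events: brightness increased past the threshold.
        ys, xs = np.nonzero(diff >= threshold)
        events.extend((t, x, y, +1) for x, y in zip(xs, ys))
        # Negative events: brightness decreased past the threshold.
        ys, xs = np.nonzero(diff <= -threshold)
        events.extend((t, x, y, -1) for x, y in zip(xs, ys))
        # Only pixels that fired reset their reference level; quiet
        # pixels keep theirs, so a static scene yields almost no output.
        fired = np.abs(diff) >= threshold
        ref[fired] = log_img[fired]
    return events
```

A real sensor generates events asynchronously per pixel and may emit several events for one large intensity change, whereas this approximation emits at most one event per pixel per frame. Even so, it illustrates the two properties the abstract relies on: static scenes produce little data, and the resulting sparse event stream cannot be fed directly to frame-based computer vision models.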
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jacob A. Rodriguez, Justin Mauger, Shibin Parameswaran, Riley Zeller-Townson, and Galen Cauble "Dynamic vision sensor datasets in the maritime domain", Proc. SPIE 11870, Artificial Intelligence and Machine Learning in Defense Applications III, 118700G (12 September 2021); https://doi.org/10.1117/12.2600971
KEYWORDS
Cameras, Sensors, Algorithm development, Lenses, Detection and tracking algorithms, Surveillance, Computer vision technology