A perception pipeline for expeditionary autonomous ground vehicles
Presentation + Paper, 5 May 2017
Abstract
Expeditionary environments create special challenges for perception systems in autonomous ground vehicles. To address these challenges, a perception pipeline has been developed that fuses data from multiple sensors (color, thermal, LIDAR) with different sensing modalities and spatial resolutions. The paper begins with an in-depth discussion of the multi-sensor calibration procedure. It then follows the flow of data through the perception pipeline, detailing the process by which the sensor data are combined in the world model representation. Topics of interest include stereo filtering, stereo and LIDAR ground segmentation, pixel classification, 3D occupancy grid aggregation, and costmap generation.
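The paper does not publish source code, so the following Python sketch is only a rough illustration of the final two steps named in the abstract: aggregating ground/obstacle-segmented sensor returns into an occupancy grid and converting that grid into a traversal costmap. All parameter values, function names, and the 2D simplification (the paper describes a 3D occupancy grid) are assumptions made for illustration, not the authors' implementation.

    # Illustrative sketch only: grid size, update rule, and cost mapping
    # are assumed; they are not taken from the paper.
    import numpy as np

    CELL_SIZE = 0.2      # meters per grid cell (assumed)
    GRID_DIM = 200       # 200 x 200 cells -> 40 m x 40 m local map (assumed)
    LOG_ODDS_HIT = 0.7   # log-odds increment for an obstacle observation
    LOG_ODDS_MISS = -0.4 # log-odds decrement for a ground (free) observation

    def world_to_cell(points_xy, origin_xy):
        """Convert world-frame XY points (N, 2) to in-bounds integer grid indices."""
        idx = np.floor((points_xy - origin_xy) / CELL_SIZE).astype(int)
        valid = np.all((idx >= 0) & (idx < GRID_DIM), axis=1)
        return idx[valid]

    def update_grid(log_odds, obstacle_xy, ground_xy, origin_xy):
        """Aggregate one frame of segmented points into the log-odds occupancy grid."""
        for cells, delta in ((world_to_cell(obstacle_xy, origin_xy), LOG_ODDS_HIT),
                             (world_to_cell(ground_xy, origin_xy), LOG_ODDS_MISS)):
            np.add.at(log_odds, (cells[:, 0], cells[:, 1]), delta)
        np.clip(log_odds, -5.0, 5.0, out=log_odds)
        return log_odds

    def to_costmap(log_odds, lethal_threshold=2.0):
        """Map occupancy belief to a traversal cost in [0, 255]."""
        prob = 1.0 / (1.0 + np.exp(-log_odds))   # log-odds -> probability
        cost = (prob * 254).astype(np.uint8)
        cost[log_odds > lethal_threshold] = 255  # confident obstacles become lethal
        return cost

    # Example: fuse one frame of ground-segmented LIDAR returns.
    grid = np.zeros((GRID_DIM, GRID_DIM), dtype=float)
    obstacles = np.array([[12.3, 8.1], [12.4, 8.2]])  # XY of non-ground returns
    ground = np.array([[5.0, 5.0], [6.0, 5.0]])       # XY of ground returns
    grid = update_grid(grid, obstacles, ground, origin_xy=np.array([0.0, 0.0]))
    costmap = to_costmap(grid)

In a full pipeline such as the one described, the same grid would also receive classified stereo and thermal-camera data, with the world model maintained in 3D before collapsing to a planar costmap for the planner.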
Josh Zapf, Gaurav Ahuja, Jeremie Papon, Daren Lee, Jeremy Nash, and Arturo Rankin "A perception pipeline for expeditionary autonomous ground vehicles", Proc. SPIE 10195, Unmanned Systems Technology XIX, 101950F (5 May 2017); https://doi.org/10.1117/12.2266690
KEYWORDS
Sensors
LIDAR
Calibration
Data modeling
Image segmentation
Cameras
Infrared imaging