Autonomous vehicles (AVs) employ a wide range of sensing modalities, including LiDAR, radar, RGB cameras, and, more recently, infrared (IR) sensors. IR sensors are becoming an increasingly common component of AV sensor packages because they provide redundancy and enhanced capability in conditions that are adverse for other sensor types. For example, while RGB cameras are sensitive to lighting conditions and LiDAR performance degrades in inclement weather such as rain, IR sensors are unaffected by lighting conditions and can contribute additional meaningful information in inclement weather. The U.S. Army Corps of Engineers Engineer Research and Development Center (ERDC) has developed the ERDC Computational Test Bed (CTB) to provide a suite of tools that support the virtual development and testing of AVs. The CTB includes physics-based vehicle-terrain interaction, sensor and environment modeling, geo-environmental thermal modeling, software-in-the-loop capabilities, and virtual environment generation. The thermal modeling capabilities within the CTB build on decades of near-surface phenomenology and autonomy research, and recent additions support the large domains commonly required for autonomous vehicle operations. These additions provide high-fidelity, physics-based thermal transfer and IR sensor models for creating high-quality synthetic imagery that simulates IR sensors mounted on AVs. This paper presents these highly parallelized thermal and IR sensor models for large-domain AV operations.
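To make the kind of computation described here concrete, the sketch below shows a minimal, illustrative stand-in for a near-surface thermal model: explicit finite-difference heat diffusion through a soil column, vectorized across every terrain cell so the entire domain updates in a single array operation (a simple analogue of the highly parallelized solvers the abstract describes). This is not the CTB implementation; all parameter values, grid sizes, and the diurnal forcing are hypothetical placeholders.

```python
# Minimal sketch (NOT the ERDC CTB model) of vectorized soil heat diffusion.
# Each terrain cell carries a 1D soil column; all columns advance together.
import numpy as np

def step_soil_temperature(T, dt, dz, alpha, T_surface):
    """Advance soil temperatures one explicit finite-difference step.

    T         : (n_cells, n_layers) soil temperature [K], layer 0 at the surface
    dt        : time step [s] (must satisfy dt <= dz^2 / (2*alpha) for stability)
    dz        : layer thickness [m]
    alpha     : thermal diffusivity [m^2/s], shape (n_cells, 1)
    T_surface : (n_cells,) temperature forced at the surface boundary [K]
    """
    T = T.copy()
    T[:, 0] = T_surface                                   # Dirichlet surface BC
    lap = np.zeros_like(T)
    lap[:, 1:-1] = (T[:, 2:] - 2.0 * T[:, 1:-1] + T[:, :-2]) / dz**2
    lap[:, -1] = (T[:, -2] - T[:, -1]) / dz**2            # insulated bottom BC
    return T + dt * alpha * lap

# Toy domain: 10,000 terrain cells x 20 soil layers, stepped through one
# diurnal cycle with a sinusoidal surface forcing (hypothetical values).
n_cells, n_layers = 10_000, 20
T = np.full((n_cells, n_layers), 290.0)                   # initial columns [K]
alpha = np.full((n_cells, 1), 5e-7)                       # diffusivity [m^2/s]
for hour in range(24):
    T_surf = 290.0 + 10.0 * np.sin(2.0 * np.pi * hour / 24.0)
    for _ in range(60):                                   # 60 sub-steps of 60 s
        T = step_soil_temperature(T, dt=60.0, dz=0.05, alpha=alpha,
                                  T_surface=np.full(n_cells, T_surf))
```

In a production model the Dirichlet surface forcing would be replaced by a full surface energy balance (solar loading, longwave exchange, convection), but the vectorized update pattern is the same.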
The United States Army Corps of Engineers (USACE) Engineer Research and Development Center (ERDC) has developed a suite of computational tools called the Computational Test Bed (CTB) for advanced, high-fidelity, physics-based simulation of autonomous vehicle sensors and environments. These tools provide insight into onboard navigation, image processing, and sensor fusion techniques, and support rapid data generation for artificial intelligence and machine learning across the full spectrum (visible, NIR, MWIR, and LWIR) and for various sensor modalities (LiDAR, EO, radar). This paper presents ERDC's CTB, which allows the community to design, develop, test, and evaluate across the entire autonomy space, from machine learning algorithm development using augmented synthetic data to large-scale autonomous system testing.
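The radiometric step that links a simulated surface temperature to what an IR sensor in the MWIR or LWIR bands records can be illustrated by integrating Planck blackbody spectral radiance over a sensor band. The sketch below is a hedged illustration, not the CTB's sensor model: the band limits are generic MWIR/LWIR values chosen for demonstration, and effects such as emissivity, atmospheric transmission, and sensor response are omitted.

```python
# Hedged illustration (not the CTB sensor model): in-band radiance from
# Planck's law, integrated over generic MWIR and LWIR band limits.
import numpy as np

H = 6.62607015e-34   # Planck constant [J*s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, T):
    """Blackbody spectral radiance [W / (m^2 * sr * m)] at temperature T [K]."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (np.exp(H * C / (wavelength_m * KB * T)) - 1.0)

def band_radiance(T, lo_um, hi_um, n=500):
    """In-band radiance [W / (m^2 * sr)] by trapezoidal integration."""
    lam = np.linspace(lo_um, hi_um, n) * 1e-6
    rad = planck_radiance(lam, T)
    return float(np.sum(0.5 * (rad[1:] + rad[:-1]) * np.diff(lam)))

# A 300 K surface seen in MWIR (3-5 um) versus LWIR (8-12 um):
for name, lo, hi in [("MWIR", 3.0, 5.0), ("LWIR", 8.0, 12.0)]:
    print(f"{name}: {band_radiance(300.0, lo, hi):.2f} W/(m^2 sr)")
```

Applied per pixel over a simulated temperature field, this mapping is what turns a thermal-model output into a synthetic IR image.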
As the U.S. Army prepares for future conflicts and multi-domain operations, methods to rapidly and continuously characterize the land-sea interface during littoral entry are paramount to ensuring maneuverability across these domains. In the maritime domain, nearshore bathymetry and surf-zone sandbars define water depth and wave behavior, which in turn drive landing tactics and the feasibility and configuration of littoral operations. In the land domain, beach and dune topography define slopes and transit paths, which drive staging-area locations and affect the maneuverability of both troops and equipment. Accurately predicting surf-zone state and littoral morphology evolution requires synthesizing a range of complex nonlinear physics that drive these changes. Using imagery of the littorals from unmanned aerial systems together with physics-based models, the U.S. Army Engineer Research and Development Center has developed novel data assimilation approaches that estimate water depth, littoral conditions, and subaerial beach topography from wave kinematics and photogrammetric algorithms, and that quantify the corresponding uncertainties. To improve both the speed of this technology during littoral operations and its accuracy (accounting for known errors related to optical transfer functions and nonlinear wave dynamics), machine-learning-based computational tools are being investigated that can directly translate short sequences of littoral imagery into surf-zone characterizations in real time by substituting for, or augmenting, computationally complex models. To accomplish this, a photo-realistic nonlinear wave model, Celeris, is used to generate synthetic imagery of a range of surf-zone environments. This synthetic imagery is crucial to building the data sets needed to train deep neural networks to solve the nonlinear depth inversion problem from observations of wave kinematics.
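The depth-inversion problem described here is classically grounded in the linear dispersion relation for surface gravity waves, omega^2 = g * k * tanh(k * h): if wave frequency and wavenumber can be observed from imagery, the relation can be inverted for depth h. The sketch below illustrates only this underlying physical relationship; the ERDC workflow described in the abstract uses nonlinear models and trained neural networks precisely because linear inversion breaks down in the surf zone. All input values are hypothetical.

```python
# Illustrative linear depth inversion (the classical baseline, not the
# neural-network approach described in the abstract).
import numpy as np
from scipy.optimize import brentq

G = 9.81  # gravitational acceleration [m/s^2]

def invert_depth(omega, k, h_max=50.0):
    """Solve omega^2 = g * k * tanh(k * h) for water depth h [m]."""
    residual = lambda h: omega**2 - G * k * np.tanh(k * h)
    # Root-bracketing between near-zero depth and a deep-water cap.
    return brentq(residual, 1e-3, h_max)

# Example: an 8 s wave with a 50 m wavelength observed in imagery.
omega = 2.0 * np.pi / 8.0    # angular frequency [rad/s]
k = 2.0 * np.pi / 50.0       # wavenumber [rad/m]
print(f"Inverted depth: {invert_depth(omega, k):.2f} m")   # ~4.4 m
```

A trained network effectively replaces this per-pixel root solve (and its nonlinear corrections) with a single fast forward pass over a short image sequence.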
It is well established that object recognition, both by human perception and by detection/identification algorithms, is confounded by false alarms (e.g., [1]). These false alarms are often caused by static or transient features of the background. Machine learning can help discriminate between real targets and false alarms, but it requires large and diverse image sets for training. The potential number of scenarios, environmental processes, material properties, and states to be assessed is overwhelming and cannot practically be explored by field/lab collections alone. High-fidelity, physics-based simulation can now augment training sets with accurate synthetic sensor imagery, but at a high computational cost. To be practical, synthetic image generation should include the fewest processes and the coarsest spatiotemporal resolution needed to capture the system physics/state and accomplish the training.
Among the features known or expected to generate false alarms are: (1) soil/material variability (spatial heterogeneity in density, mineral composition, and reflectance); (2) non-threat objects (rocks, trash); (3) soil disturbance (physical and spectral effects); (4) soil processes (moisture migration, evaporation); (5) surface hydrology (rainfall runoff and surface ponding); (6) vegetation processes (transpiration, rainfall interception and evaporation, non-saturating rain events, multi-layer canopies including thatch, and discrete versus parameterized vegetation); and (7) energy reflected or emitted by other scene components. This paper presents a suite of computational tools that will allow the community to begin exploring the relative importance of these features and to determine when and how individual processes must be included explicitly or through simplifying assumptions/parameterizations. The decision to simplify is ultimately justified by the performance of a detection algorithm on the generated synthetic imagery. Knowing the required level of modeling detail is critical for designing test matrices to build image sets capable of training improved algorithms.
A related consideration in the creation of synthetic sensor imagery is the validation of these complex, coupled modeling tools. Very few analytical solutions or laboratory experiments include enough complexity to thoroughly test model formulations. Conversely, field data collections normally cannot be characterized and measured with sufficient spatial and temporal detail to support true validation. Intermediate-scale physical exploration of near-surface soil and atmospheric processes (e.g., Trautz et al. [2]) offers an alternative between the laboratory-column and field scales, allowing many field-scale-dependent processes and effects to be reproduced, manipulated, isolated, and measured within a well-characterized and controlled test environment at the requisite spatiotemporal resolutions in both the air and the soil.