Special Section Guest Editorial: Situation Awareness in Degraded Visual Environments
Open Access | Published 3 April 2019
Abstract
This guest editorial introduces the Special Section on Situation Awareness in Degraded Visual Environments.

Vision is key to maintaining overall situation awareness (SA) when piloting a vehicle. Loss of out-the-window cues reduces the operator’s innate ability to sense aircraft attitude and makes tasks such as “see-to-follow” and “see-to-avoid” impossible with unaided human vision. Reduced SA can lead to the loss of the vehicle or, worse, the loss of many lives.

Degraded visual environments (DVE) are conditions in which obscurants such as smoke, haze, fog, dust, rain, or snow, or reduced illumination (night), limit operator visibility. Vehicle structure that restricts direct external viewing (e.g., embedded cockpits or armored vehicles) represents another form of DVE. Degraded environments also include electromagnetic effects (EME), such as GPS jamming or denial, and degraded radio-frequency environments (loss of communications). Optical degradation includes the effects of dazzlers or laser illumination.
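To make this taxonomy concrete, a display or mission-logging system might tag reported conditions by DVE category. The Python sketch below is purely illustrative; the category and condition names are hypothetical and are not drawn from any fielded system.

    from enum import Enum, auto

    class DVECategory(Enum):
        """Hypothetical grouping of the DVE types described above."""
        VISUAL_OBSCURANT = auto()   # smoke, haze, fog, dust, rain, snow, night
        STRUCTURAL = auto()         # embedded cockpits, armored vehicles
        ELECTROMAGNETIC = auto()    # GPS jamming/denial, degraded RF comms
        OPTICAL = auto()            # dazzlers, laser illumination

    # Example: tagging reported conditions with a category
    CONDITION_CATEGORY = {
        "fog": DVECategory.VISUAL_OBSCURANT,
        "brownout": DVECategory.VISUAL_OBSCURANT,
        "gps_jamming": DVECategory.ELECTROMAGNETIC,
        "laser_dazzle": DVECategory.OPTICAL,
    }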

In commercial aviation, reduced visibility from fog or low cloud decks can cause significant delays, resulting in increased operating costs. For helicopters, a reduced-visibility condition known as “brownout” occurs when the main rotor blows sand or dust into the air, obscuring the pilot’s view of the outside world. Brownouts cost the military hundreds of millions of dollars each year and account for 75% of all military helicopter accidents in the Iraq and Afghanistan operations. Brownout conditions can affect landing spacecraft as well; thus, DVE can affect SA across all aerospace vehicles.

Factors that support situation awareness include sensing (both on-board and off-board sensors), databases (e.g., terrain and cultural features), sensor processing (fusion, stitching, feature extraction, and threat detection), data integration, display, and human factors. The intersection of data integration, display, and human factors is of particular interest for supporting SA and decision making in high-tempo operations with many near-simultaneous events, which can lead to high operator workload.
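As a purely illustrative sketch of how these factors chain together, the following Python outlines one possible path from sensing to display. All type names, function names, and stage stubs are hypothetical placeholders under the assumption of a simple frame-by-frame pipeline; they do not describe any actual DVE system interface.

    from dataclasses import dataclass, field
    from typing import Optional

    def fuse_and_stitch(images: list) -> dict:
        """Stub: fuse and stitch overlapping sensor frames into one scene."""
        return {"mosaic": list(images)}

    def extract_features(scene: dict) -> list:
        """Stub: extract salient features (obstacles, terrain edges)."""
        return [item for item in scene["mosaic"] if item is not None]

    def detect_threats(features: list) -> list:
        """Stub: flag features meeting simple, placeholder threat criteria."""
        return [f for f in features if getattr(f, "is_threat", False)]

    @dataclass
    class SAFrame:
        """One hypothetical snapshot of the data feeding a pilot's SA display."""
        sensor_images: list    # on-board and off-board sensor frames
        terrain_tiles: list    # database content (terrain, cultural features)
        fused_scene: Optional[dict] = None
        threats: list = field(default_factory=list)

    def build_sa_frame(sensor_images: list, terrain_tiles: list) -> SAFrame:
        """Illustrative chain: sense, fuse/stitch, extract, detect, integrate."""
        frame = SAFrame(sensor_images, terrain_tiles)
        frame.fused_scene = fuse_and_stitch(frame.sensor_images)
        frame.threats = detect_threats(extract_features(frame.fused_scene))
        # Data integration: pair the fused scene with database terrain for display
        frame.fused_scene["terrain"] = frame.terrain_tiles
        return frame

The human-factors piece, how the resulting frame is rendered and how operator workload is managed, is the part that resists such a simple sketch.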

This special section contains selected papers on color perception for augmented reality (AR) displays, the use of head-worn displays during low-visibility approaches, virtual cockpits, and a theoretical approach to using quantum entanglement for improved SA. These papers represent a sample of the DVE research being conducted internationally. Maintaining SA in degraded environments will remain a research topic of interest as long as humans pilot vehicles.

Biography

Trey Arthur is a senior researcher at NASA Langley Research Center. He received his BS in aerospace engineering from North Carolina State University in 1991 and his MS in aeronautical engineering from the George Washington University in 1997. He is the author of more than 75 peer-reviewed papers. His current research interests include flight deck displays, vision system technologies, and pilot-vehicle interfaces. He is a lifetime member of AIAA.

Jack Sanders-Reed has been active in pilot vision systems and sensing for degraded visual environments for more than 15 years, including developing and flight testing distributed aperture sensor (DAS) vision systems that combine multispectral live and synthetic data. He is currently cochair of the SPIE “Situation Awareness in Degraded Environments” conference. He holds a PhD in physics from Case Western Reserve University and a high-tech MBA from Northeastern University. He is the author of more than 25 peer-reviewed and conference papers, holds six patents, and is a Fellow of SPIE. He is currently chief technologist and a technical fellow for the Boeing Research & Technology organization. Previously, he worked for MIT Lincoln Laboratory and Picker X-ray (medical imaging). He is also the developer of the Visual Fusion video motion analysis software product.

Niklas Peinecke holds a diploma in mathematics and a PhD in computer science from Leibniz University Hannover. He is a coinventor of methods for 3-D shape classification and for conflict detection. Since 2007, he has worked at DLR in the areas of sensors, displays, and detect-and-avoid. He has served as project manager for the DLR contributions to EU and national projects. His further research interests include computational geometry and computer graphics.

© 2019 Society of Photo-Optical Instrumentation Engineers (SPIE)
Trey Arthur, Jack Sanders-Reed, and Niklas Peinecke "Special Section Guest Editorial: Situation Awareness in Degraded Visual Environments," Optical Engineering 58(5), 051801 (3 April 2019). https://doi.org/10.1117/1.OE.58.5.051801
Keywords: Visualization; Driver's vision enhancers; Sensors; Visibility; Visibility through fog; Aerospace engineering; Augmented reality
