Review of head-worn displays for the Next Generation Air Transportation System
Abstract
NASA Langley Research Center (LaRC) has conducted helmet-mounted display (HMD) and head-worn display (HWD) research over the past 30 years. Initially, NASA LaRC’s research focused on military applications, but more recently NASA has pursued a line of HWD research for commercial and business aircraft. This work has revolved around numerous simulation experiments as well as flight tests to develop technology and data for industry and regulatory guidance. This paper summarizes the results of NASA’s HMD/HWD research. Of note, the work tracks progress in wearable collimated optics, head tracking, latency reduction, and weight. The research lends credence to a small, sunglasses-type form factor for the HWD being acceptable to commercial pilots, a goal that is now becoming technologically feasible. The research further suggests that an HWD may serve as an “equivalent” head-up display (HUD) with safety, operational, and cost benefits. “HUD equivalence” appears to be the economic avenue by which HWDs can become mainstream on the commercial and business aircraft flight deck. If this happens, NASA’s research suggests that additional operational benefits exploiting the unique capabilities of the HWD can open up new operational paradigms.

1.

Introduction

The NASA Langley Research Center (LaRC) has a long-standing mission to conduct research to advance the state-of-the-art in flight deck interface technologies, including visual displays.1 The heritage of this work was once tied closely to Department of Defense activities, but it has shifted over the last two decades toward providing commercial flight crews with proactive, intuitive tools to conduct safe and efficient flights. Since the late 1990s, this research has been driven by the White House Commission on Aviation Safety.2,3 While these safety initiatives firmly remain, NASA research is now closely tied to the modernization of the National Airspace System (NAS) known as the Next Generation Air Transportation System (NextGen). The goal of NextGen is to remove many of the constraints in the current air transportation system, support a wider range of operations, and deliver significantly increased system capacity and efficiency well beyond that of current operating levels. One of the key elements is to create a NAS that is resilient, if not immune, to the impacts of weather. Operating concepts emerging under NextGen require new technology and procedures not only on the ground but also on the flight deck.

This paper provides a high-level overview of NASA Langley’s helmet-mounted display (HMD) and subsequent head-worn display (HWD) research as it relates to advancing the state-of-the-art to develop display and head-tracker technology and data for industry and regulatory guidance for head-coupled systems.4 The work highlights the development of these technologies to enhance safety and improve flight efficiencies in a NextGen environment and suggests that the unique capabilities of the HWD can open up new operational paradigms in commercial and business aircraft operations.5

2.

Background

The work at NASA LaRC, while advancing the state-of-the-art, has primarily focused on the pilot–vehicle interface technologies of HWDs, roughly categorized into four areas. First, the viability of commercial HWDs is critically dependent on the ergonomic and anthropometric issues of HWDs. Fortunately, the consumer market’s drive for small, lightweight displays has advanced the state-of-the-art such that significant technology investment has not been required for commercial flight deck applications.

The second research area is physiological. HWDs may induce a variety of visual-vestibular interactions, each of which can have a significant impact on the user.

Third, perception and perceptual issues figure prominently on a list of pilot–vehicle interface research needs. The visual perception, comprehension, and understanding of HWD information depend on many parameters, not the least of which includes the optical performance characteristics of the HWD. In short, perception issues create significant challenges to match the human visual system with the human mind as it relates to the processing and interpretation of the visual stimuli.

Finally, the fourth area of research need is operational. According to Velger,4 how much and what type of information and how it should be displayed represent perhaps one of the “most indeterminate and insufficiently defined subjects” (p. 179). Some years later, the issues associated with operational needs and how to meet these needs with appropriate informational display content still loom large with research needs that include6:

  • What is the appropriate amount and type of information to be displayed?

  • What is the most effective presentation of that information?

3.

HWD/HMD Research at NASA Langley

3.1.

Early HMD Research

The concept of the HMD dates back a hundred years.7 However, it was not until the 1980s that display capabilities advanced to the point where information beyond a simple gun-sight could effectively be presented in a head-worn device. The equipment of the 1980s was technologically advanced but not practically viable for use on a flight deck (see Fig. 1). This state-of-the-art system used binocular optics with 80 deg circular oculars and a 40 deg stereo overlap generated by monochromatic green cathode ray tubes (CRTs) drawing 875 scan lines each. The system employed an alternating-current head-tracker. The head-borne weight was 9 pounds, not including the active cooling provided by the air conditioning hose. This particular HMD was not used in any published research but was the predecessor of the HMD used in the high angle-of-attack research vehicle (HARV)8 research (see Fig. 2).

Fig. 1

Monochrome binocular HMD circa 1988, 9 pounds.


Fig. 2

Monochrome binocular HMD circa 1991, 6.5 pounds.


3.1.1.

High angle-of-attack research

This HARV HMD and others like it supported numerous military applications including the U.S. Army’s AH-64 Apache attack helicopter, emerging Comanche program requirements, and high angle-of-attack research, such as that deployed on the NASA thrust-vectored F-18 HARV.9–13 The research at LaRC emphasized stereo/binocular effects and off-boresight informational constructs.

For the HARV program, HWD research naturally emphasized off-boresight capabilities, especially as they apply to the extreme maneuvering envelope provided by this vehicle. The NASA LaRC-developed research HMD had wide-field-of-view (FOV) (30 deg vertical by 40 deg horizontal) binocular optics and holographic optical elements for high brightness and transmissivity (see Fig. 2). Two high-resolution 1280×1024 CRTs were used as the image sources. The HMD weighed 6.5 pounds and could be worn by most pilots for over an hour without discomfort. The HMD was driven by a graphics workstation updated at a 60-Hz noninterlaced rate and used a Polhemus magnetic head-tracking system.

A piloted simulation study was conducted using this system to determine whether attitude information (pitch ladder, velocity vector symbol, and waterline symbol) displayed in an HMD should be presented with respect to the real world (conformal, Earth-referenced) or to the aircraft (aircraft body-axis referenced) for spatial awareness in a fighter aircraft. With the conformal presentation, the appearance of the displayed information was dependent on the pilot’s head position. The horizon line would always overlay the horizon of the outside scene, if it was in view; however, the attitude of the aircraft (nose position) could not always be easily obtained unless the pilot’s line-of-sight was aligned with the aircraft’s body axis. With the body-axis concept, the information was displayed as if the pilot was always looking directly out of the front of the aircraft regardless of the pilot’s head orientation. This concept was analogous to physically mounting a head-up display (HUD) to the helmet. Although the pilot could directly determine the aircraft’s attitude, in situations where the pilot’s line-of-sight was not aligned with the aircraft’s body axis, the horizon line, if in view, would not overlay the horizon of the outside scene. The two display concepts were evaluated using simulated air-to-air intercept tasks in which the pilot was to obtain a gun solution on a maneuvering, but not interactive, target.
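To make the distinction concrete, the following sketch (illustrative only, not the study’s software; a 2-D pitch-plane simplification with hypothetical function names) shows how the drawn horizon direction depends on head orientation for the conformal presentation but not for the body-axis presentation.

```python
# Minimal sketch contrasting the two HMD attitude presentations described above.
# The 2-D (pitch-plane only) simplification and names are illustrative.
import numpy as np

def rot(deg):
    """2-D rotation matrix for an angle in degrees."""
    r = np.radians(deg)
    return np.array([[np.cos(r), -np.sin(r)], [np.sin(r), np.cos(r)]])

def horizon_in_display(aircraft_pitch_deg, head_pitch_deg, conformal=True):
    """Return the display-frame direction of the horizon line.

    conformal=True  : Earth-referenced; the drawn horizon depends on head
                      orientation, so it overlays the real horizon when in view.
    conformal=False : body-axis; symbology is drawn as if looking out the nose,
                      independent of head orientation (a "HUD bolted to the helmet").
    """
    earth_horizon = np.array([1.0, 0.0])             # level direction in the Earth frame
    to_body = rot(-aircraft_pitch_deg)                # Earth frame -> aircraft body frame
    if conformal:
        to_display = rot(-head_pitch_deg) @ to_body   # body frame -> head/display frame
        return to_display @ earth_horizon
    return to_body @ earth_horizon                    # head orientation ignored

# Example: aircraft pitched up 10 deg, pilot looking 30 deg above the nose.
print(horizon_in_display(10, 30, conformal=True))     # moves with the head
print(horizon_in_display(10, 30, conformal=False))    # fixed relative to the nose
```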

The quantitative results favored the body-axis concept. Although no statistically significant differences were found for the pilots’ understanding of roll attitude or target position, pitch judgment errors were made three times more often with the conformal display than with the body-axis display. The subjective results showed that the body-axis display did not cause attitude confusion, which had been a prior concern with this display. In the posttest comments, the pilots overwhelmingly selected the body-axis display as the display of choice. The pilots stated that the conformal display was hard to interpret and confusing because of the symbology motion caused by the aircraft and head movements. The pilots also commented that they were more familiar with the body-axis display format because they had flight experience using HUDs. With more training, the conformal display may have been more useful to the pilots.

3.1.2.

Microvision HMD

In 2004, research was conducted using a military-style helmet (see Fig. 3) for synthetic vision (SV) system flight deck concepts.14 The HMD was part of the Virtual Cockpit Optimization Program (V-COP) effort under the U.S. Army and used the Microvision virtual retinal display concept. The HMD was a full-color, fully binocular, fully overlapped, 1280×1024 pixel resolution display. The Ascension LaserBird™ was used for head tracking (see Fig. 4). The optical performance data showed some outstanding characteristics, although the system suffered from reliability issues typical of many prototype technologies.15

Fig. 3

Full-color binocular prototype HMD by Microvision circa 2003, 10 pounds.


Fig. 4

Ascension LaserBird head tracker. Three prong sensor mounted on the back of the HMD shown in Figs. 3 and 9.


NASA’s use of this military display focused on the potential for a helmet display with an unlimited field-of-regard to greatly increase pilot situation awareness (SA), both in-flight and on the surface for commercial operations. The data showed that the value of off-boresight information in commercial and business aviation—a unique attribute of a head-coupled HMD—lies within surface operations. For commercial operations, off-boresight information during departure, cruise, and approach and landing operations is not typically important. The majority of the information needed for the task is in front of the vehicle or not the responsibility of the flight crew. For instance, merging aircraft operations for airport arrivals are deconflicted by and are the responsibility of air traffic control. Conversely, off-boresight information during surface operations is critically important to the flight crew. On the airport surface, the crew is responsible for see-and-avoid and airfield navigation predominately using visual out-the-window (OTW) references. The research showed that an HWD opened significant operational and safety benefits; however, the data also evinced that a large, military-style helmet would be a nonstarter for commercial crews.

Pilots expressed a strong desire for a lightweight sunglasses form-factor type display (see concept in Fig. 5 and developed prototype in Fig. 6).

Fig. 5

Concept picture of a lightweight sunglasses form factor HWD for commercial aviation use.


Fig. 6

Lumus DK-32 display glasses coupled with a prototype head tracker made by Thales Visionix.


3.2.

Why an HWD for Commercial and Business Aircraft?

From this military heritage, several factors emerged that created a confluence of needs and capabilities for HWD research at NASA LaRC targeting commercial and business aircraft flight decks. These factors were: (a) consumer display technologies; (b) HUDs and safety of flight; and (c) vision system technologies.

3.3.

Vision System Technologies

Starting in the early 1970s, vision system technologies—synthetic vision systems (SVS), enhanced vision systems (EVS), and related instantiations—were being researched by NASA. Vision system technologies (SV/EV) create an electronic means of visibility for the flight crew, independent of the prevailing natural lighting or atmospheric conditions. These vision system technologies replicate, supplement, or enhance the natural vision of the pilot by means of complementary technologies.16

EV is an electronic means of displaying the external scene topography (the natural or man-made features of a place or region especially in a way to show their relative positions and elevation) through the use of an imaging sensor, such as a forward looking infrared (FLIR) or millimeter wave radar. SV is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation, and data of the terrain, obstacles, cultural features, and other required flight information.

Vision system technologies moved to the forefront of NASA’s aeronautics research mission directorate as part of a project to develop and deploy vision system technologies to mitigate the leading cause of commercial aviation accidents world-wide—controlled flight into terrain (CFIT).17 The focus of this program was to get these technologies into existing flight decks to have the greatest impact on CFIT accident reduction.

In 2004, the concept of vision system technologies took root when the Federal Aviation Administration (FAA) amended the operating regulations for takeoff and landing under instrument flight rules contained in Title 14 of the Code of Federal Regulations, §91.175, to allow the use of an enhanced flight vision system (EFVS). This rule change set an important precedent as the first operational credit provided to an imaging sensor system, creating an allowable electronic means of vision for the flight crew. EVS technologies were maturing rapidly, as evidenced by a large FAA flight test,18 with NASA and others,19 that highlighted the operational potential of these systems above and beyond what is provided under current FAA regulations.

Similarly, the fundamental technologies for SV—a computer-generated rendering of stored terrain topography and obstacle data from the perspective of the pilot—have proliferated. In fact, when the NASA program started, it was felt by many to be too revolutionary. The state-of-the-art computer at program launch (circa 1997) was a 266-MHz Pentium processor with 32 megabytes of RAM and a 5-gigabyte hard drive. Fortunately, by the program’s end in 2007, the 1997 state-of-the-art computer capabilities were far exceeded by cell phone technology, and the three key technologies (global positioning system, computer graphics rendering, and flash memory) were commonplace. SV is now seen as the baseline standard in flight deck designs.

Today, vision system technologies and the concept of an electronic means of visibility for the flight crew, independent of the prevailing natural lighting or atmospheric conditions, are critical pieces of the NextGen architecture.

3.4.

Head-Up Display

HUDs have been available on commercial and business aircraft for many years and, by the mid-1990s, were becoming an accepted flight deck staple, although still not universally adopted (HUD weight and volume are two of the issues hindering adoption; the use of an HWD, as discussed later, may address these reservations). The HUD has proven itself a valuable addition to the flight deck, yielding many safety and operational benefits. The advantages of HUDs for commercial aircraft are derived from the “eyes-out” conformal view of the outside world with symbology and imagery overlay [i.e., augmented reality (AR)] without the requirement to go “heads-down” to look at flight instruments. The conformal display of attitude, flight path, and energy management information is key. The Flight Safety Foundation (FSF)20,21 concluded that HUD technology, such as Rockwell Collins’ head-up guidance system (HGS)™, would likely have positively influenced the outcome of hundreds of the accidents included in a study of turbine-powered, modern glass cockpit aircraft accidents.

These HUD capabilities became an integral part of the EFVS rule making. The HUD is the only display currently certified and approved for use as an EFVS. The operating paradigm of the EFVS is the conformal display of imaging sensor data on the HUD, with conformal flight symbology overlaying the real-world, when enhanced and/or natural vision is available. The EFVS operational credit [as per §91.175 (l) and (m)] explicitly expressed that the use of a HUD was an essential “characteristic and feature” of the EFVS operation.

In developing the rule, the FAA recognized emerging technologies and placed within the rule provisions for the use of a HUD or an “equivalent display” following developments such as the “Virtual HUD” concept.22

3.5.

HWD for Commercial Aviation

As NASA and industry were maturing HUD and vision system technologies, the small display form factor that pilots desired was just emerging. The technologies for a viable HWD on a commercial flight deck were approaching maturity.

The HWD research at LaRC with this emerging technology was primarily driven by two operational paradigms:

  • 1. The HWD may be a “HUD equivalent.” In this scenario, the HWD can provide the aforementioned HUD benefits to operators, such as EFVS, as well as be advantageous where: (a) HUD installation is not possible or practical because of volume or weight; (b) HUD retrofit is not cost-effective; and (c) HWD installation has a return-on-investment advantage over the HUD, such as providing an aircraft weight reduction.5

  • 2. The HWD can provide unlimited field-of-regard; hence, the HWD may provide operational enhancements (safety and/or efficiency) that would otherwise not be possible, especially since there are now vision system technologies that are unlimited in the field-of-regard and that can be effective independent of the prevailing natural visibility.

The “easy” part of the research was defining the HWD requirements for these operational paradigms. The requirements are simply:

  • 1. equivalent optical performance to today’s HUD (as defined in such documents as SAE ARP-8105);

  • 2. no more encumbrance to the pilot than that imposed by today’s head-worn devices.

The extremely challenging part of this task is to simultaneously meet these two requirements. These requirements are diametrically opposed:

  • State-of-the-art HUDs are fantastic optical devices, with outstanding clarity, transmissivity, and—by design and also because they are firmly attached to large aircraft—outstanding symbolic and image stability (i.e., static and dynamic accuracy). The HUD and EV sensor are boresighted, aligned, and firmly affixed to the airframe for milliradian accuracies (e.g., symbol position accuracy of 5.0 milliradians, as per SAE Aerospace Standard 8055 and RTCA DO-315A). In a recent EFVS flight test, the end-to-end latency of an EFVS presentation on a HUD was measured to be greater than 200 ms due to the combination of sensor, image processing, and HUD processing delays,23 yet, because of the relatively benign movement of the vehicle in the operational flight environment, each of the test pilots found the presentation to be excellent, and latency was never once considered an adverse factor.

  • An informal survey found that there were a variety of headsets that pilots sometimes, but not always, wore. The type, size, and function varied. The one item of head-worn gear that almost all pilots wore was sunglasses. As such, the goal was set to sunglasses’ “equivalent encumbrance.” Unfortunately, low-encumbrance devices, like sunglasses, do not typically afford stable, firm attachment mechanisms. The movement of the pilot’s head, unlike that of the aircraft, easily exceeds 200 to 300 deg/s, making the head tracking needed for milliradian static and dynamic accuracies extremely challenging24 (a back-of-the-envelope illustration follows this list).
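To illustrate why these two requirements collide, the following back-of-the-envelope sketch (illustrative only; the 1 deg/s airframe rate is an assumed value, while the other numbers are taken from the discussion above) relates latency and angular rate to the dynamic misalignment of a conformal symbol.

```python
# Rough sketch of why head motion makes HUD-like accuracy so demanding for an HWD:
# the dynamic pointing error of a world-referenced symbol is roughly latency x rate.
import math

MRAD_PER_DEG = math.pi / 180 * 1000   # ~17.45 mrad per degree

def dynamic_error_mrad(latency_s, rate_deg_per_s):
    """Approximate misalignment of a conformal symbol during motion."""
    return rate_deg_per_s * latency_s * MRAD_PER_DEG

# Aircraft-mounted HUD case: ~200 ms latency, but benign airframe rates (assumed ~1 deg/s).
print(dynamic_error_mrad(0.200, 1.0))     # ~3.5 mrad, still near HUD-class accuracy

# Head-worn case: the head easily reaches 200 to 300 deg/s.
print(dynamic_error_mrad(0.020, 250.0))   # ~87 mrad even at the proposed 20-ms budget
print(dynamic_error_mrad(0.001, 250.0))   # ~4.4 mrad, i.e., millisecond-class latency needed
```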

3.6.

Head-Tracking Technology

To meet the operational paradigms and their associated performance requirements, the HWD must be coupled with a high-performance head-tracker. The tracker, while being lightweight and unobtrusive, must meet HUD-like static and dynamic accuracy requirements.

Under the Small Business Innovation Research (SBIR) program, NASA Langley awarded a contract to Intersense, Inc. (now Thales Visionix) to develop a head tracker to meet these performance requirements as well as the need for the HWD system to be minimally encumbering.

Intersense delivered two prototype head trackers under the SBIR contract. The first tracker was a prototype based on their IS-1200 VisTracker™ using an inertial tracker where the inertial drift was corrected by image processing (see Fig. 7). The focus of the prototype was miniaturization of the tracker by utilizing a cell phone camera. The small tracker size was achieved; however, the tracker used visible light for inertial-drift correction, which was adversely affected by low lighting conditions.

Fig. 7

Intersense prototype tracker developed for NASA Langley.


To improve upon the shortcomings of the first tracker, Intersense delivered a second tracker with an infrared-based camera that solved the visible-light issue of the previous tracker, but at the cost of a larger package (see Fig. 6). This second tracker was also a hybrid system, using inertial tracking with optical tracking and image processing to correct the drift.
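The following minimal sketch illustrates the general hybrid-tracking idea described above, fusing fast but drifting inertial propagation with occasional absolute optical fixes. It is not the Intersense/Thales Visionix algorithm; the single-axis simplification, gain, and rates are illustrative assumptions.

```python
# Minimal sketch of the hybrid (inertial + optical) tracking idea: integrate a fast
# gyro for low-latency orientation, then pull the estimate toward slower camera-derived
# fixes to bound drift. Illustrative only, not the delivered trackers' algorithm.

def hybrid_track(gyro_rates, optical_fixes, dt=0.005, gain=0.5):
    """gyro_rates: deg/s at each step; optical_fixes: dict {step_index: absolute deg}."""
    angle = 0.0
    history = []
    for i, rate in enumerate(gyro_rates):
        angle += rate * dt                               # fast inertial propagation (drifts)
        if i in optical_fixes:                           # occasional absolute optical fix
            angle += gain * (optical_fixes[i] - angle)   # drift-correction step
        history.append(angle)
    return history

# Example: a gyro with a +0.5 deg/s bias while the head is actually still; optical
# fixes of 0 deg every 20 steps keep the estimate bounded (~0.1 deg here) instead of
# letting it drift to the ~0.5 deg an uncorrected integration would reach.
biased_gyro = [0.5] * 200
fixes = {i: 0.0 for i in range(0, 200, 20)}
print(hybrid_track(biased_gyro, fixes)[-1])   # corrected estimate
print(hybrid_track(biased_gyro, {})[-1])      # uncorrected drift
```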

3.7.

System Latency Measurement

Head-tracker latency dictates the dynamic symbol/imagery positioning accuracy. An end-to-end latency requirement of no more than 20 ms has been proposed for virtual HUD applications based on the previous work.5 However, the acceptable latency may become significantly smaller if dynamic stability is a driving requirement. The SAE AS8055 document, the “Minimum Performance Standard for Airborne Head Up Display (HUD),” suggests that this is the case.

The technical challenge is that this allowable latency is an “end-to-end” requirement and not just a head-tracker problem. A basic HWD with a head tracking system comprises, from end to end: (1) a near-to-eye display, (2) the head tracking system, (3) one or more symbology or image sources, and (4) the display/image processor. Each element, and the communication between them, contributes a portion of the total latency. No commercial or standardized device was available to measure and quantify end-to-end latency. Therefore, NASA developed a prototype Head Mounted Display Latency Measurement Rig (HeLMR) for this purpose.5 The HeLMR apparatus consists of an anatomically correct human head form that is able to “wear” available commercial and custom HMD systems (see Fig. 8). A camera is installed in place of the eye(s) in the correct image plane location. The head is mounted on a precision rotary stage that moves the head in a left–right–left “No–No” fashion at a precise angular rate. To measure the end-to-end latency, a space-stabilized symbol is rendered on the HWD along the boresight. During head movement, the space-stabilized symbology becomes misaligned with respect to the outside reference, in proportion to the end-to-end system latency and the head form angular rate. The system can be swept in frequency to create the transfer function between head movement and display response for additional diagnosticity.
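A sketch of the simple relation the HeLMR measurement exploits is shown below; the sweep rate and measured lag are hypothetical values, not rig data.

```python
# Sketch of the relation behind the HeLMR measurement (illustrative numbers only):
# during a constant-rate head sweep, a space-stabilized symbol lags the outside
# reference by approximately latency x angular rate, so latency = lag / rate.

def end_to_end_latency_ms(misalignment_deg, head_rate_deg_per_s):
    """Infer end-to-end latency from the camera-measured symbol lag."""
    return misalignment_deg / head_rate_deg_per_s * 1000.0

# Hypothetical example: the head form sweeps at 60 deg/s and the eye-position camera
# sees the boresight symbol trailing the outside reference by 1.5 deg.
print(end_to_end_latency_ms(1.5, 60.0))   # -> 25 ms end-to-end latency
```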

Fig. 8

HeLMR system (head, camera, rotary stage, stage controller, light source).


4.

Human-in-the-Loop HWD Research

4.1.

Monocular, Biocular, and Binocular Displays

HWDs may be monocular, binocular, or biocular in design; each has its advantages and disadvantages, and each imparts different human factors concerns. For example, binocular rivalry and disparity are known physiological issues caused by HWD design that affect perception. Rotational, magnification, alignment, latency, and/or luminance differences between the displays for each eye in a binocular or biocular design can detrimentally affect the usability of the HWD. Double imagery, adaptation, motion effects, and many other concerns can impact not only the perception of the imagery but also user cognition (e.g., effects on decision-making, response time, accuracy, judgments, change blindness, attention, visual search, memory, problem-solving, SA, etc.), as well as cause physiological consequences due to the perceptual and/or visual stimuli mismatch (e.g., eye strain, disorientation, headaches, and sickness).25 Each of these issues is addressed in many ways throughout the existing HMD literature, but much of this literature is heavily biased toward military applications or academia, with its youthful subject populations. Our application, conversely, has to consider pilots up to the age of 65, if not older, with a variety of vision conditions, corrections, and color deficiencies. HWD design considerations, especially the use of color and the optical design, as they interact with HWD user age, are somewhat unique to this new application of HWDs.

Honeywell performed research for NASA to examine issues with monocular versus biocular (same image viewed by both eyes) HWDs.6 Honeywell utilized Microvision Nomad™ displays with an Ascension Phasor Bird™ head tracker to create the monocular and biocular HWD display conditions. The three display conditions examined were: a monocular display on the dominant eye, a monocular display on the nondominant eye, and a biocular display. Results showed no significant differences in flight performance between the three display concepts. An interesting result from the Honeywell study was that the monocular display provided significantly better visual acuity than the biocular display. No binocular rivalry effects were found in this seasoned group of aviators. Thus, NASA began surface operations research with a lightweight monocular HWD.

4.2.

Taxi Operations

Our research showed that surface operations are one operational area where off-boresight information for commercial and business transport aircraft is deemed critical. Furthermore, at the beginning of the 21st century, the National Transportation Safety Board (NTSB) continually included runway incursion prevention on its “Most Wanted” list for aviation safety.26 On this basis, the application of HWD technology to surface operations was prioritized and researched.

On the flight deck, the available OTW visibility provides the “truth,” and preferably two but at least one pilot is always head-out during surface operations. The HUD or HWD must be designed to not obscure this outside view. Today, HUDs are certified only after demonstrating compliance with Title 14 of the Code of Federal Regulations §25.773, showing that the HUD design still gives the pilots “a sufficiently extensive, clear, and undistorted view, to enable them to safely perform any maneuvers within the operating limitations of the airplane, including taxiing, takeoff, approach, and landing.” The HUD or HWD should ideally augment the prevailing visibility—providing sufficient information to enhance or enable the operation—without significantly obscuring it.

Full-color HWD concepts were evaluated in surface operations against HUD equipage, either directly or indirectly using previous research, for taxi route awareness, traffic awareness, taxi efficiency, and runway incursion prevention.

4.2.1.

HWD down-select usability study

An HWD usability study was used to down-select concepts for feasibility.24,27,28 The usability study was conducted in the medium-fidelity, fixed-base visual imaging simulator for transport aircraft systems. The usability study was designed to demonstrate the efficacy of an HWD that provides unlimited field-of-regard SV for surface operations. The results demonstrated that providing pilots with the ability to virtually see well beyond the natural visual range can significantly increase SA and task performance on the airport surface. Pilots were better able to perform the taxi task and reported significantly higher SA with the HWD concepts compared to an electronic moving map (EMM) or paper charts of the airport environment. Furthermore, the study provided tremendous insight into the future design and development of HWDs, including hardware considerations and methods for integration of display modes.

The usability study highlighted two significant hardware considerations. Nearly all pilots preferred the higher resolution (800×600), see-through HWD over the lower resolution (640×480), non-see-through HWD. Pilots commented that the higher resolution improved the readability of the display, especially for text and numbers. Additionally, the pilots preferred not to have their forward vision blocked, even by the small 640×480 pixel display. The see-through capability allowed the pilots to continue their nominal OTW surveillance of the airport environment during taxi. Also, the see-through display provided the pilots with confidence that the display was aligned with the scene. For surface operations, it is important for an HWD to be see-through because, for all practical purposes, the HWD must provide “AR,” not “virtual reality,” since some—albeit possibly restricted—natural visibility is always available [i.e., the lowest visibility approved for commercial surface/flight operations is 300-foot runway visual range (RVR)].

4.2.2.

High-fidelity simulation experiments

From the results of the usability study, two experimental studies were conducted to determine the efficacy of using HWDs to enhance taxi operations.29,30 For both experiments, full-color monocular HWD display concepts were evaluated to address previously witnessed display technology limitations.

The experiments were conducted in NASA’s high-fidelity simulator known as the research flight deck (RFD). The RFD was equipped with a 30-deg horizontal × 24-deg vertical HUD on the captain’s side. The HWD, worn only by the captain, was a Liteye LE-500 800×600 pixel, full-color monocular display with see-through capability (Fig. 9). The head tracker was a LaserBird™ model by Ascension Technology Corporation. A skateboard helmet was used as the mounting location for the display and tracker. The helmet was not excessively heavy and was sturdy, providing good stability and a comfortable fit. It also had some user acceptance from an aesthetic viewpoint.

Fig. 9

Full-color monocular HWD circa 2007, 2 pounds. Outside-in laser head tracking.


A so-called “semi-conformal” presentation method was used to address the obscuration issue. The pilots placed the display just above their right eye so that it was visible by glancing up, which maintained unimpeded stereoscopic vision for OTW monitoring. The resulting display was conformal to the real-world (OTW scene) if the pilot tilted his or her head down. This “semi-conformal” presentation allowed pilots to bring the display into view when they desired by head-movement as opposed to a flight deck switch action.

The pilots conducted simulated taxi operations at Chicago O’Hare International Airport. The primary display conditions were varied by having no HUD, an HWD, or a traditional HUD. A head-down EMM display was varied from a basic moving map to an advanced moving map that included the display of the aircraft’s cleared routing and other traffic. The display condition and weather were experimentally varied. A total of 27 different taxi scenarios were flown. Three of the 27 scenarios were “rare events” to test off-nominal, safety-critical stress cases. All taxiing tasks involved exiting the active runway and taxiing to the airport movement area boundary. The weather state for the OTW scene was varied between nighttime visual meteorological conditions with unlimited visibility and daytime with 700-foot RVR.

The performance data showed that better route accuracy and faster taxi speeds can be obtained using the HWD or HUD compared to paper charts alone. On average, the pilots completed the taxi route 15% faster with the HWD and HUD concepts.

Comparing the HWD and HUD concepts across both experiments, there were no significant differences. Therefore, in terms of taxi performance, the HWD and the HUD were statistically equivalent.

4.2.3.

Augmented reality surface operations

In both experiments mentioned above, the “semi-conformal” concept was successful, especially for clutter prevention, but the concept was not perfect. The pilots had to use head movement to bring the display into view. Better methods were desired. Instead of using a “semi-conformal” display, an alternate approach was developed to augment the scene without obscuring the prevailing natural visibility. Figure 10 graphically depicts an instantiation of this AR HWD concept for surface operations. The boxed area containing symbology and the virtual airport represents the view as it would be rendered on the HWD. The area outside the box represents the visibility of the actual airport environment with natural vision. In this example, the figure represents a reported visibility of 700-foot RVR. The AR concept draws the SV/EV imagery (i.e., the “virtual airport environment”) on the HWD only in the area that is beyond the reported 700-foot RVR (i.e., a so-called “Beyond-RVR” AR concept). Note that the virtual airport is not shown in the HWD up to 700-foot RVR, as this portion of the actual airport can be seen with natural vision (i.e., the unaided eye). Essentially, the virtual airport imagery was culled up to 700-foot RVR to allow pilots to view the actual airport environment. The cleared route (shown as the magenta ribbon) was drawn on top of the yellow taxiway centerline to denote the cleared path. The cleared path always overlays the taxiway centerline, if in view on the HWD, as it is critical information for runway and taxiway incursion prevention. Traffic within the conformal display is depicted with conformal traffic diamond symbology.
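A simplified sketch of the “Beyond-RVR” culling rule follows; it is illustrative only (the scene points and function names are hypothetical), not the experiment’s rendering software.

```python
# Simplified sketch of the "Beyond-RVR" culling rule described above: virtual airport
# imagery is drawn on the HWD only for scene points farther away than the reported RVR,
# so natural vision is never obscured where it is still usable. Illustrative only.

RVR_FT = 700.0   # reported runway visual range for the example scenario

def draw_virtual_scene(scene_points, rvr_ft=RVR_FT):
    """scene_points: list of (name, range_ft). Returns the points rendered on the HWD."""
    return [(name, rng) for name, rng in scene_points if rng > rvr_ft]

scene = [("taxiway edge light", 250.0),    # visible with the unaided eye -> not drawn
         ("hold-short line", 900.0),       # beyond RVR -> drawn virtually
         ("crossing traffic", 1800.0)]     # beyond RVR -> drawn virtually
print(draw_virtual_scene(scene))
```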

Fig. 10

Surface concept with the HWD.


Pilots were asked to taxi complex routes, under simulated low-visibility conditions (300-foot RVR, 600-foot RVR, 2400-foot RVR), in a large commercial transport aircraft simulator at Chicago O’Hare airfield. The test conditions evaluated the “Beyond-RVR” concept and the nominal HWD configuration, as used in prior studies.

The results of the research evinced that the HWD, regardless of the concept tested, significantly enhanced SA compared to nominal aircraft equipage and displays used during aircraft taxi. Although no quantitative differences were found, the Beyond-RVR concept had merit only at the higher visibilities. In very low-visibility operations (i.e., the 300-foot RVR and 600-foot RVR conditions), there were no discernible differences to the pilots between the Beyond-RVR and nominal HWD concepts tested, as they looked practically the same. Pilot comments supporting the value of the HWD coupled with SV/EV imagery to increase capacity and safety in the airport movement area show that this innovative display has merit and that technology maturation should continue.

The results from these surface operations with HWDs led to NASA patenting the technology.31 Follow-on experiments continue to refine surface operations research with HWDs.

4.3.

Simultaneous Offset Instrument Approaches Using an HWD

Although the majority of approach and landing operations do not require off-boresight information, a significant hurdle to overcoming capacity constraints at over 30 airports32 in the NAS is the conduct of simultaneous-dependent parallel runway operations on runways separated by less than 4300 feet.33 In particular, very closely spaced parallel runways have equipment and procedural requirements that significantly limit their use in degraded visual conditions. When the weather drops below visibility or ceiling minima, an airport is reduced to single-runway operations, substantially reducing arrival rates [e.g., typically from 65 to 30 aircraft/h at San Francisco (SFO) airport].

Simultaneous-dependent parallel runway operations require a pilot to see-and-avoid parallel traffic. To meet the “weather-independence” goal of NextGen, game-changing technology is needed to safely overcome these constraints. Arthur et al.30 reported a study that demonstrated the usefulness of HWDs, paired with other technologies, to more effectively address this problem. The research examined the use of an SV/EVS HWD concept, in concert with flight deck interval management (FIM) technologies, to conduct very closely spaced simultaneous-dependent parallel runway operations under restricted visual conditions. The simulation experiment evaluated head-down displays and head-up displays (both HUD and HWD), paired with vision systems and FIM technologies, to conduct these dependent arrival operations.

The HWD symbology consisted of a velocity vector, counter pointers for airspeed and altitude as defined by MIL-STD-1787B,34 and a target locator box. The counter pointers were screen-referenced, and the velocity vector and target locator box were Earth-referenced. The target locator box was driven by simulated automatic dependent surveillance-broadcast (ADS-B) data and showed the traffic call sign on top, the vertical trend via an arrow, and the vertical speed in feet per minute.
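As a rough illustration of how an Earth-referenced locator box can be driven from surveillance data, the following sketch converts a traffic target’s relative position to azimuth and elevation and then offsets it by the tracked head orientation; the flat-Earth geometry, heading-only head model, and example numbers are assumptions, not the experiment’s implementation.

```python
# Illustrative sketch of positioning an Earth-referenced target locator box from
# surveillance data: compute the traffic's azimuth/elevation from ownship, then
# subtract the tracked head orientation to place the box in display coordinates.
# Flat-Earth, small-angle simplifications; not the experiment's software.
import math

def target_box_position(rel_north_ft, rel_east_ft, rel_up_ft, head_az_deg, head_el_deg):
    """Return (azimuth, elevation) of the traffic in the head/display frame, in degrees."""
    az_world = math.degrees(math.atan2(rel_east_ft, rel_north_ft))
    rng_horiz = math.hypot(rel_north_ft, rel_east_ft)
    el_world = math.degrees(math.atan2(rel_up_ft, rng_horiz))
    return az_world - head_az_deg, el_world - head_el_deg

# Example: parallel traffic 4000 ft ahead, 1000 ft right, 200 ft above, with the
# pilot looking 10 deg right of the nose and level.
print(target_box_position(4000, 1000, 200, head_az_deg=10, head_el_deg=0))
```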

In general, pilots rated using an HWD in instrument meteorological conditions the same as the visual approach (unlimited visibility) in terms of SA and mental workload. The results were based on some simplifying technological assumptions but nonetheless provide an outlook of the potential of HWDs to serve as an enabling technology and flight deck display platform to facilitate the envisioned path toward the goals set forth by NextGen and future air transportation systems.

4.4.

HWDs for Spacecraft

In 2007, NASA began exploring technology for a human mission to the moon under the Constellation Program.35 During Apollo, the constraints that the lunar module window design placed on crew visibility and landing trajectory were “a major problem.” The new lunar lander spacecraft, known as Altair, was being designed to have a significantly reduced OTW look-down angle compared to the Apollo lunar lander36,37 and to use optimal fuel-saving trajectories that render the crew’s natural vision from the windows inadequate for the approach and landing task. The Altair program desired technology to overcome these visibility constraints. Thus, a lightweight HWD system with SV, and perhaps EV, technologies offered a potential solution for spacecraft crews. A part-task simulation was conducted to explore using a monocular HWD (see Fig. 11) for a lunar lander spacecraft.38,39

Fig. 11

Monochrome monocular HWD circa 2010, 4 ounces.


A fixed-base human-in-the-loop lunar landing simulation study was conducted using various entry trajectory options and HWD concepts for visibility. In general, pilots thought the HWD had great potential but was not optimized for this lunar landing task. The performance data show that there were no significant performance differences when using the HWD in conjunction with the head-down displays (HDDs). Also, there were no significant workload effects with the HWD. Regarding SA, the data are anecdotal. Some pilots felt the HWD provided greater SA because it allowed for an eyes-out view while still being able to perform the task. Pilots preferred this eyes-out view as it provided “truth” data as to the actual situation, rather than relying on a computed navigation solution, which can be subject to errors. These comments came predominantly from pilots who were familiar with and frequently flew night vision systems, which are monocular and monochrome green; therefore, they were accustomed to having a monocular display over one eye while flying.

However, those pilots who had not flown such systems were distracted by the HWD at times. The artifacts associated with a head-tracked display, such as blurry text and numbers, caused these pilots to abandon the HWD and rely solely on the HDDs. These pilots would try to keep their heads still to reduce the latency effects that degraded the readability of the HWD. All pilots agreed that an HWD with a larger FOV would be desirable; however, pilots were not asked to quantify how much larger the FOV should be (and a larger-FOV HWD was not available for testing).

With head-tracked HWD systems, many factors can affect the image quality and, thus, the acceptability of the HWD.40 In this experiment, synthetic terrain was rendered on the HWD. The alignment of the synthetic terrain to the real terrain (OTW is considered truth for this experiment) is dependent on the static boresighting as well as the system latency. Latency will cause an apparent misalignment during head movement, but as the pilot’s head comes to a stop, the terrain will appear to “catch-up.” This terrain “swimming” can lead to simulation sickness and loss of confidence in the fidelity of the system. For this experiment, neither simulation sickness nor integrity of the HWD system was a concern.

Binocular rivalry effects were not observed with the HWD system used in this experiment.41

4.5.

HWD as an Equivalent HUD

As mentioned, the FSF identified significant safety benefits of head-up/HUD flight operations.20 In addition to safety benefits, “operational credits” are now being derived from HUD equipage that an HWD might also obtain if “HUD equivalence” can be shown. In particular, the EFVS operational credit [as per §91.175 (l) and (m)] explicitly expressed that the use of a HUD was an essential “characteristic and feature” of the EFVS operation. Two tests were conducted to assess the state of the HWD technologies to meet the provisions for the use of an HWD as an equivalent display to the HUD.

4.5.1.

Simulation experiment

Testing was performed in the RFD full-mission, motion-based simulator to do a direct HUD versus HWD comparison in an EFVS operation. Twelve airline crews conducted approach and landing, taxi, and departure operations during low-visibility operations (1000-foot RVR, 300-foot RVR) at Memphis International Airport. The HWD used in this experiment was coupled with a prototype head-tracker mounted on the left side of a pair of Lumus DK-32™ glasses. The Lumus eyewear is see-through and full color, utilizing patented light-guide optical element (LOE) technology to generate an image that appears at “practical” infinity similar to that of a HUD. For this experiment, only monochrome green symbology and imagery were displayed on the HWD so as not to introduce a confound when comparing to the monochrome HUD.

The HWD symbology replicated the RFD’s Rockwell Collins HGS-6700™ HUD symbology and its functionality. The flight symbology set was typical for a commercial transport but also included a flare cue and other critical symbology elements meeting the required conformal elements for an EFVS. This symbology included airspeed and altitude tapes, a conformal flight path marker, a flight path angle reference cue, raw-data instrument landing system (ILS) glideslope and localizer deviations, and a flight director cue. Conformal symbology and the HUD create an AR display that is an essential element of the EFVS with the HUD and, now, the HWD equivalent display. The conformal information directly informs the pilot of the aircraft’s flight trajectory with respect to the intended landing point, the aircraft energy state and its trend, and the trajectory of the aircraft in the airspace to remain clear of obstacles along the approach path. The use of conformal symbology (AR) has been found to be beneficial for performance and user attention switching.42

After landing, when the nose wheel was on the ground and the ground speed was less than 80 knots, the flight symbology set would automatically transition to the taxi symbology. The taxi symbology set consisted of ground speed, heading, the current taxiway identification (i.e., the taxiway that the aircraft was on), and the next taxiway identifier on the cleared route. Above the next taxiway text, a left or right arrow was rendered to denote the direction of the next cleared taxiway. Near the bottom of the display was a raw data indicator showing linear deviation from the taxiway centerline. These symbology sets were displayed on both the HUD and the HWD.
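The mode transition described above reduces to a simple rule; the sketch below is illustrative logic (not the simulator code) using the thresholds stated in the text.

```python
# Minimal sketch of the symbology mode switch described above: flight symbology
# transitions to the taxi set once the nose wheel is on the ground and ground speed
# drops below 80 knots. Illustrative only.

def symbology_mode(nose_wheel_on_ground: bool, ground_speed_kt: float) -> str:
    if nose_wheel_on_ground and ground_speed_kt < 80.0:
        return "TAXI"     # ground speed, heading, taxiway IDs, centerline deviation
    return "FLIGHT"       # airspeed/altitude tapes, flight path marker, ILS raw data

print(symbology_mode(False, 140.0))   # on approach -> FLIGHT
print(symbology_mode(True, 75.0))     # rollout complete -> TAXI
```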

The EV was simulated as a combined short-wave, mid-wave (1.0 to 5.0 μm) FLIR sensor. The simulated camera was aligned with the HUD, so any image shift between the FLIR displayed on the HUD and the OTW was due only to installation parallax. The image shift (i.e., error) due to camera parallax for this case was half of the maximum error allowable for an EFVS in accordance with RTCA DO-315,43 equating to a 2.5-milliradian image offset of a point located at a distance of 2000 feet.
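The small-angle geometry behind the quoted parallax figure can be checked as follows; the implied 5-foot camera-to-eye offset is an inference for illustration, not a documented installation dimension.

```python
# Small-angle sanity check of the parallax figure quoted above (illustrative, not
# test data): angular offset ~= lateral camera-to-eye offset / range.

def parallax_mrad(offset_ft, range_ft):
    return offset_ft / range_ft * 1000.0

def offset_for_parallax_ft(parallax_mrad_val, range_ft):
    return parallax_mrad_val / 1000.0 * range_ft

# 2.5 mrad at a 2000-ft point implies roughly a 5-ft effective camera/eye offset.
print(offset_for_parallax_ft(2.5, 2000.0))   # -> 5.0 ft (inferred, illustrative)
print(parallax_mrad(5.0, 4000.0))            # the same offset halves to 1.25 mrad at 4000 ft
```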

The results showed that there were no statistical differences in the flight crews’ performance in terms of touchdown and takeoff. Furthermore, there were no statistical differences between the HUD and HWD in the pilots’ responses to questionnaires.

4.5.2.

Flight demonstration

Using the same HWD system described above, a flight demonstration was conducted at NASA LaRC.44 Ideally, the test would be designed as a direct HUD versus HWD comparison, but since NASA’s flight test vehicle (a BE-200 King Air) did not have a HUD installed, the purpose of the flight test was primarily to evaluate the use of HWDs during actual aircraft taxi and approach operations. Approach and taxi testing was performed in both visual and simulated instrument conditions. Seven highly experienced test pilots with HUD experience participated in the flight test and were asked to compare the HWD that they were flying to that of a HUD based on their previous experience.

The pilots flew straight-in ILS approaches wearing the HWD. The HWD symbology consisted of a “virtual-HUD” concept where typical HUD symbology was rendered if the pilot was looking at the area where a HUD combiner glass would be mounted. In addition to typical flight symbology, a simulated EVS image was rendered on the HWD and was conformal to the outside world. During simulated instrument conditions, the pilots’ view through the HWD was blocked, leaving only symbology and simulated EVS imagery (EVS on/off was experimentally controlled).
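A sketch of the “virtual-HUD” gating logic described above follows; the angular extent assumed for the virtual combiner region is hypothetical, not the flight-test value.

```python
# Sketch of the "virtual HUD" gating described above: HUD-style symbology is rendered
# only while the tracked line-of-sight falls within the region where a physical HUD
# combiner would sit. The combiner extents below are assumed, illustrative values.

COMBINER_HALF_WIDTH_DEG = 15.0    # assumed horizontal half-extent of the virtual combiner
COMBINER_HALF_HEIGHT_DEG = 12.0   # assumed vertical half-extent

def render_virtual_hud(head_az_deg: float, head_el_deg: float) -> bool:
    """True when the pilot is looking where the HUD combiner glass would be mounted."""
    return (abs(head_az_deg) <= COMBINER_HALF_WIDTH_DEG and
            abs(head_el_deg) <= COMBINER_HALF_HEIGHT_DEG)

print(render_virtual_hud(0.0, -3.0))    # looking roughly out the nose -> draw symbology
print(render_virtual_hud(45.0, 0.0))    # looking well off-boresight -> suppress symbology
```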

Using the HWD, pilots were able to fly the approaches and stay within a dot of precision on the localizer and glideslope (i.e., “within a dot” precision implies the aircraft will be at the proper position at decision height for a safe landing). Pilot comments showed acceptance of the concept, but the pilots also indicated areas where the HWD system could be improved. The most requested improvement was to stabilize the symbology, which essentially translates to reducing the system latency.

For pilots who encountered light to moderate turbulence in visual conditions, system latency created a “jittery, bouncy” display that was difficult to read and follow. The latency combined with turbulence resulted in eye strain and headaches (although minimal) and increased workload due to temporary discrepancies between the conformal image and the actual scene. When pilots flew in simulated instrument conditions, the EVS imagery could not appear misaligned with the “real world” since the real world was not visible; thus, no eye strain, even on turbulent approaches, was reported. Pilots reported that this configuration reduced workload significantly and that they were able to focus on pertinent information much more easily. Although the flight test showed promise, continued research and development is needed.

5.

Future Directions

The work to date indicates that an HWD for commercial and business aircraft is viable. The data also suggest that the business case to make this happen is through the path of “HUD equivalency.” The data further show that the technology is not quite ready yet. The form factor and the static and dynamic accuracies are not where they need to be for HUD equivalence. The other open question is how best to meet the challenge of obscuration and contamination of the pilot’s view outside the aircraft when there is prevailing natural vision, such as during surface operations.

The application domain for this technology lies strongly in three areas: (a) where HUD installation is not possible or practical because of volume or weight; (b) where HUD retrofit is not cost-effective; and (c) where HWD installation has a return-on-investment advantage over the HUD, such as a weight reduction. Once installed, the HWD-equipped aircraft can then pursue HUD operational credits for reduced operating minima for landing and takeoff. More importantly, if the HWD can be installed as a HUD equivalent, the data suggest that several HWD-unique applications will open up. Improvements have been shown for surface operations and for flight operations where off-boresight and expanded-FOV information, such as traffic identification and extended runway centerline awareness, are important.

The other “game-changing” application could be enabling an emerging NextGen concept termed equivalent visual operations (EVO). EVO is an electronic means of providing sufficient visibility of the external world and other required flight references to enable the safety, operational tempos, and visual flight-like procedures in all weather conditions. The HWD, coupled with SV and EVS technologies, would create an intuitive interface for an electronic visual flight rules (E-VFR) operational capability. Going a step further, NASA is conducting research and technology development to expand EVO as one component of a “better-than-visual” operational capability: replicating the capacity of today’s visual flight operations and, more importantly, meeting and improving upon the safety of today’s operations in all-weather NextGen conditions.

The data and experience possibly pertain to other AR applications as well. For instance, the data would be applicable to safety-critical automotive applications where visual attention, field-of-regard, and AR cueing can be beneficial, such as during street navigation or collision alerting. The research is also applicable to the design of HWDs for an aging user population and the challenge of creating near-to-eye and AR displays for a population with less-than-perfect vision, unlike that addressed by the previous HMD literature. Finally, the aviation challenge requires high-accuracy (dynamic and static) tracking systems. This requirement also needs to be addressed in the consumer world, for positive AR experiences, and for other safety-critical activities such as aided medical diagnosis and surgery.

References

1. J. J. Arthur III et al., “A review of head-worn display research at NASA Langley Research Center,” Proc. SPIE 9470, 94700W (2015). http://dx.doi.org/10.1117/12.2180436
2. J. Shin, “The NASA aviation safety program: overview,” Cleveland, Ohio (2000).
3. Advancing Aeronautical Safety: A Review of NASA’s Aviation Safety—Related Research Programs, National Academies Press, Washington, D.C. (2010).
4. M. Velger, Helmet Mounted Displays & Sights, Artech House, Norwood, Massachusetts (1998).
5. R. Bailey, K. Shelton, and J. Arthur III, “Head-worn displays for NextGen,” Proc. SPIE 8041, 80410G (2011). http://dx.doi.org/10.1117/12.885847
6. F. Cupero et al., “Head worn display system for equivalent visual operations,” Hampton, Virginia (2009).
7. H. Li et al., “Review and analysis of avionic helmet-mounted displays,” Opt. Eng. 52, 110901 (2013). http://dx.doi.org/10.1117/1.OE.52.11.110901
8. L. J. Bjarke, J. H. Del Frate, and D. F. Fisher, “A summary of the forebody high-angle-of-attack aerodynamics research on the F-18 and the X-29A aircraft,” Edwards, California (1992).
9. J. Burley and J. LaRussa, “A full-color wide-field-of-view holographic helmet-mounted display for piloted/vehicle interface development and human factors studies,” Proc. SPIE 1290, 9–15 (1990). http://dx.doi.org/10.1117/12.20950
10. S. A. Viken and J. R. Burley, “Predictive nosepointing and flightpath displays for air-to-air combat,” Proc. SPIE 1695, 154–165 (1992). http://dx.doi.org/10.1117/12.131960
11. D. R. Jones, T. S. Abbott, and J. R. Burley II, “Concepts for conformal and body-axis attitude information for spatial awareness presented in a helmet-mounted display,” Hampton, Virginia (1993).
12. D. Littman and D. Boehm-Davis, “Perceptual factors that influence use of computer enhanced visual displays,” Hampton, Virginia (1993).
13. J. W. Clark, “Integrated helmet mounted display concepts for air combat,” Hampton, Virginia (1995).
14. J. Arthur III et al., “Flight simulator evaluation of display media devices for synthetic vision concepts,” Proc. SPIE 5442, 213–224 (2004). http://dx.doi.org/10.1117/12.541387
15. T. H. Harding et al., “Evaluation of the Microvision helmet-mounted display technology for synthetic vision application engineering prototype for the virtual cockpit optimization program,” Fort Rucker, Alabama (2003).
16. R. Bailey, L. Kramer, and L. Prinzel III, “Crew and display concepts evaluation for synthetic/enhanced vision systems,” Proc. SPIE 6226, 62260G (2006). http://dx.doi.org/10.1117/12.666711
17. J. J. Arthur III et al., “Flight simulator evaluation of synthetic vision display concepts to prevent controlled flight into terrain (CFIT),” Hampton, Virginia (2004).
18. M. A. Burgess, “Synthetic vision technology demonstration,” Washington, D.C. (1993).
19. M. A. Burgess and R. D. Hayes, “Synthetic vision-a view in the fog,” Aerosp. Electron. Syst. Mag. 8, 6–13 (1993). http://dx.doi.org/10.1109/62.199814
20. “Head-up guidance system technology—a clear path to increasing flight safety,” Alexandria, Virginia (2009).
21. J. J. Arthur III et al., “Performance comparison between a head-worn display system and a head-up display for low visibility commercial operations,” Proc. SPIE 9086, 90860N (2014). http://dx.doi.org/10.1117/12.2048700
22. T. Frey and H. Page, “Virtual HUD using an HMD,” Proc. SPIE 4361, 251–262 (2001). http://dx.doi.org/10.1117/12.438000
23. K. J. Shelton et al., “Synthetic and enhanced vision systems for NextGen (SEVS) simulation and flight test performance evaluation,” in Proc. 31st Digital Avionics Systems Conf. (DASC), 2D5 (2012). http://dx.doi.org/10.1109/DASC.2012.6382967
24. R. E. Bailey et al., “Evaluation of head-worn display concepts for commercial aircraft taxi operations,” Proc. SPIE 6557, 65570Y (2007). http://dx.doi.org/10.1117/12.717221
25. J. Capo-Aponte et al., “Visual perception and cognitive performance,” in Helmet-Mounted Displays: Sensation, Perception and Cognition Issues, pp. 335–390, U.S. Army Aeromedical Research Laboratory, Fort Rucker, Alabama (2009).
26. D. R. Jones, “Runway incursion prevention system testing at the Wallops flight facility,” Proc. SPIE 5802, 47 (2005). http://dx.doi.org/10.1117/12.602327
27. J. Arthur III et al., “Synthetic vision enhanced surface operations and flight procedures rehearsal tool,” Proc. SPIE 6226, 62260I (2006). http://dx.doi.org/10.1117/12.666636
28. J. Arthur III et al., “Design and testing of an unlimited field-of-regard synthetic vision head-worn display for commercial aircraft surface operations,” Proc. SPIE 6559, 65590E (2007). http://dx.doi.org/10.1117/12.719695
29. J. J. Arthur III et al., “Head-worn display concepts for surface operations for commercial aircraft,” Hampton, Virginia (2008).
30. J. Arthur et al., “Synthetic vision enhanced surface operations with head-worn display for commercial aircraft,” Int. J. Aviat. Psychol. 19(2), 158–181 (2009). http://dx.doi.org/10.1080/10508410902766507
31. J. J. Arthur III et al., “Multi-modal cockpit interface for improved airport surface operation,” U.S. Patent 7,737,867 (2010).
32. T. Doyle and F. McGee, “Air traffic and operational data on selected U.S. airports with parallel runways,” Hampton, Virginia (1998).
33. J. J. Arthur III et al., “Enhanced/synthetic vision and head-worn display technologies for terminal maneuvering area for NextGen operations,” Proc. SPIE 8042, 80420Q (2011). http://dx.doi.org/10.1117/12.883036
34. “Aircraft display symbology,” Wright-Patterson AFB, Ohio (1996).
35. “Draft constellation programmatic environmental impact statement,” Washington, DC (2007).
36. S. Williams et al., “Synthetic vision for lunar and planetary landing vehicles,” Proc. SPIE 6957, 695706 (2008). http://dx.doi.org/10.1117/12.777079
37. L. Kramer et al., “The effects of synthetic and enhanced vision technologies for lunar landings,” in Proc. 28th Digital Avionics Systems Conf. (2009).
38. J. J. Arthur III et al., “Part-task simulation of synthetic and enhanced vision concepts for lunar landing,” Proc. SPIE 7689, 768904 (2010). http://dx.doi.org/10.1117/12.852917
39. R. E. Bailey, E. B. Jackson, and J. J. Arthur III, “Handling qualities implications for crewed spacecraft operations,” in Proc. IEEE Aerospace Conf. (2012). http://dx.doi.org/10.1109/AERO.2012.6187282
40. R. E. Bailey, J. J. Arthur III, and S. Williams, “Latency requirements for head-worn display S/EVS applications,” Proc. SPIE 5424, 98–109 (2004). http://dx.doi.org/10.1117/12.554462
41. R. Patterson, M. Winterbottom, and B. Pierce, “Perceptual issues in the use of head-mounted visual displays,” Mesa, Arizona (2006).
42. P. Ververs and C. Wickens, “Designing head-up displays (HUDs) to support flight path guidance while minimizing effects of cognitive tunneling,” in Proc. Human Factors and Ergonomics Society Annual Meeting, pp. 45–48 (2000). http://dx.doi.org/10.1177/154193120004401312
43. “Minimum aviation system performance standards (MASPS) for enhanced vision systems, synthetic vision systems, combined vision systems and enhanced flight vision systems,” Washington, D.C. (2008).
44. K. J. Shelton et al., “Flight test of a head-worn display as an equivalent-HUD for terminal operations,” Proc. SPIE 9470, 94700X (2015). http://dx.doi.org/10.1117/12.2177059

Biography

Jarvis (Trey) J. Arthur III holds an MS degree in aeronautical engineering from George Washington University and has been conducting human/machine interface research at NASA for the past 20 years. He is the lead for head-worn display research in the Crew Systems and Aviation Operations Branch at the NASA Langley Research Center.

Randall E. Bailey is a lead aerospace engineer at NASA's Langley Research Center. He serves as the team lead for Flight Deck Interface Technologies, conducting and leading others in the research, development, test, and evaluation of cockpit display, handling qualities, and pilot-vehicle interface systems for aviation and space domain applications. Prior to joining NASA, he was the technical director of the Flight Research Group at the Calspan Corporation.

Steven P. Williams holds a BS degree in electrical engineering from the University of Tennessee and has been conducting vision system and human/machine interface research at NASA for the past 30 years. He is a senior researcher in the Crew Systems and Aviation Operations Branch at the NASA Langley Research Center.

Lawrence J. Prinzel III is a NASA aerospace engineer who holds a PhD in Industrial-Organizational Psychology and an MS degree in aeronautics. He has conducted research in space human factors and commercial flight deck technologies. He has over 150 peer-reviewed publications and holds six patents. Currently, his research project is focused on commercial aircraft flight deck technologies to prevent control upset accidents and to enhance flight crew energy and attitude state awareness.

Kevin J. Shelton is a researcher at NASA Langley Research Center. He specializes in advanced flight deck and display design aimed at improving safety, efficiency, and human/machine interfaces. He has served at Langley as a flight test engineer for 11 years, and then as an aviation safety researcher for 12 years.

Denise R. Jones is a senior researcher in the Crew Systems and Aviation Operations Branch at NASA Langley Research Center. For the majority of her career, she has served as the lead for runway incursion prevention and airport surface safety and capacity research, which included piloted simulation studies and flight testing. She also contributed to the development of standards for aircraft-based indications and alerting for runway safety as part of RTCA efforts.

Vincent E. Houston holds degrees in both computer engineering and modeling and simulation from Old Dominion University. He is the lead developer for increasingly autonomous systems and the manager of the Virtual Imaging Simulator for Transport Aircraft. He has been, successively, an engineering technician, test engineer, and research engineer over his 27 years with NASA Langley.

© 2017 Society of Photo-Optical Instrumentation Engineers (SPIE)
Jarvis (Trey) J. Arthur, Randall E. Bailey, Steven P. Williams, Lawrence J. Prinzel, Kevin J. Shelton, Denise R. Jones, and Vincent E. Houston "Review of head-worn displays for the Next Generation Air Transportation System," Optical Engineering 56(5), 051405 (1 February 2017). https://doi.org/10.1117/1.OE.56.5.051405
Received: 28 July 2016; Accepted: 4 January 2017; Published: 1 February 2017
KEYWORDS: Heads up displays, Head, Head-mounted displays, Visualization, Visibility, Safety, Prototyping
