Synthetic vision systems are becoming common in the business jet community. Their perspective terrain displays present complex information in a visual form that pilots are accustomed to. Research and flight testing are underway to enable low-noise supersonic business jet operations. Widespread acceptance will require regulatory changes and the ability for pilots to predict and manage where the generated sonic boom will impact people on the ground. A display of the sonic boom impact will be needed for preflight and in-flight planning. This paper details the concept of operations (CONOPS), algorithm development, and human-machine interface considerations of a synthetic vision display design incorporating a sonic boom carpet. Using a NASA-developed algorithm, sonic boom predictions, Mach cut-off, and sound pressure levels are calculated for current and modified flight plans. The algorithm output is transformed into georeferenced objects and presented on navigation and guidance displays, where pilots can determine whether the current flight plan avoids generating sonic booms in noise-sensitive areas. If pilots maneuver away from the flight plan, a dynamically computed predicted boom carpet is presented, in which the algorithm is fed an extrapolation of the current flight path. The resulting depiction is a sonic boom footprint whose location changes as the aircraft maneuvers. Given a lookahead time for the prediction, the pilot can shift the location where boom intensity will be at its maximum. Allowable sound levels for various locations on the ground are incorporated for comparison against the real-time and predicted sonic boom.
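As a rough illustration of the predicted-carpet concept (not the NASA-developed algorithm itself, which is not specified here), the flight-path extrapolation fed to the boom predictor could be sketched as a simple dead-reckoning projection over the lookahead time. All names, the constant turn-rate assumption, and the flat-Earth small-distance approximation below are hypothetical simplifications:

```python
from dataclasses import dataclass
import math

@dataclass
class AircraftState:
    lat_deg: float          # current latitude
    lon_deg: float          # current longitude
    heading_deg: float      # true heading, degrees
    ground_speed_kt: float  # ground speed, knots
    turn_rate_dps: float    # turn rate, deg/s (positive = right turn)

NM_PER_DEG_LAT = 60.0  # approximate nautical miles per degree of latitude

def extrapolate_path(state: AircraftState, lookahead_s: float,
                     step_s: float = 1.0) -> list[tuple[float, float]]:
    """Dead-reckon the current state forward, producing the list of
    (lat, lon) points that a boom-prediction algorithm could sample."""
    lat, lon, hdg = state.lat_deg, state.lon_deg, state.heading_deg
    speed_nm_per_s = state.ground_speed_kt / 3600.0
    path = [(lat, lon)]
    t = 0.0
    while t < lookahead_s:
        # Hold the current turn rate constant over the lookahead window.
        hdg = (hdg + state.turn_rate_dps * step_s) % 360.0
        d_nm = speed_nm_per_s * step_s
        lat += (d_nm * math.cos(math.radians(hdg))) / NM_PER_DEG_LAT
        lon += (d_nm * math.sin(math.radians(hdg))) / (
            NM_PER_DEG_LAT * math.cos(math.radians(lat)))
        path.append((lat, lon))
        t += step_s
    return path
```

Each point on the extrapolated path would then be handed to the boom predictor to place and size the footprint; lengthening the lookahead time moves the depicted maximum-intensity region farther ahead of the aircraft, which is what lets the pilot steer it away from noise-sensitive areas.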
Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS) may serve as game-changing technologies to meet the challenges of the Next Generation Air Transportation System and the envisioned Equivalent Visual Operations (EVO) concept: the ability to achieve the safety and operational tempos of current-day Visual Flight Rules operations irrespective of weather and visibility conditions. One significant obstacle lies in defining the equipage required on the aircraft and at the airport to enable the EVO concept. A motion-base simulator experiment was conducted to evaluate the operational feasibility and pilot workload of conducting departures and approaches on runways without centerline lighting, in visibility as low as 300 ft runway visual range (RVR), using onboard vision system technologies on a Head-Up Display (HUD) without need for, or reliance on, natural vision. Twelve crews evaluated two methods of combining dual-sensor (millimeter-wave radar and forward-looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs. In addition, the impact of adding SV to the dual-sensor EFVS imagery on crew flight performance and workload was assessed. Using EFVS concepts during 300 ft RVR terminal operations on runways without centerline lighting appears feasible: all EFVS concepts produced departure and landing rollout performance equivalent to (or better than) that of operations flown with a conventional HUD to runways having centerline lighting, without any workload penalty. Adding SV imagery to the EFVS concepts improved situation awareness but produced no discernible improvement in flight path maintenance.
Several emerging technologies were recently demonstrated in a Boeing 737-900 as part of Boeing's Technology Demonstrator program. Among them were two enhanced vision systems and a synthetic vision system, including synthetic displays to support surface operations. The project gained operational experience with enhanced and synthetic vision systems operating in a context that included Required Navigation Performance (RNP) terminal area operations, Global Navigation Satellite System (GNSS) approach and landing, and Integrated Approach Navigation (IAN). The technologies were demonstrated to a broad mix of constituents involved in research, regulation, and acquisition in the transport-category environment. This paper describes the systems demonstrated, the context in which they were used, and the perceived benefits of integrating them in an operational environment. Lessons learned in implementing these technologies throughout the program are described, and subjective data from participants are summarized.