Surface air pressure is the most important variable for atmospheric dynamics. It is routinely measured by in-situ meteorological sensors, but no operational capability exists to remotely sense pressure over the globe. The poor spatiotemporal coverage of this dynamically crucial variable is a significant observational gap for weather prediction. To improve forecasts of severe weather, especially the intensity and track of tropical storms, numerical weather forecast models critically need surface barometry with large spatial coverage and frequent sampling. Recent developments in remote sensing techniques offer great promise for atmospheric barometry at large spatiotemporal scales.
NASA Langley Research Center is currently developing the concept of Differential-absorption Barometric Radar (DiBAR), operating in the 50-56 GHz O2 absorption band, to fill this observational gap. Numerical simulations show that with the DiBAR remote sensing system, the uncertainty in instantaneous radar estimates of surface air pressure can be as low as ~1 mb. Prototype instrumentation and the associated laboratory, ground, and airborne experiments indicate that satellite DiBAR remote sensing systems will obtain the needed air pressure observations and meet or exceed the science requirements for surface air pressure fields. Observing system simulation experiments (OSSEs) for space DiBAR performance, based on existing DiBAR technology and capability, show substantial improvements in tropical storm predictions, not only in typhoon track and position but also in typhoon intensity. Satellite DiBAR measurements will provide an unprecedented level of prediction skill and knowledge of global extreme weather conditions.
A spaceborne multi-frequency differential oxygen absorption radar system will fill the gap in global observations of atmospheric air pressure, increase our knowledge of atmospheric dynamics, and significantly improve weather predictions, especially for severe weather such as typhoons and hurricanes. Improved tropical storm forecasts are expected from this capability. The development of the DiBAR system and associated OSSE results will be presented.
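To illustrate the differential-absorption principle described above, the sketch below shows, in simplified form, how a surface pressure estimate could follow from the ratio of surface returns at an O2-absorbed channel and a reference channel: the differential optical depth scales with the column O2 amount, and hence with surface pressure. The linear coefficient `K_DIFF` and the function names are hypothetical illustrations, not DiBAR's actual retrieval; a real system would use a line-by-line O2 absorption model and account for temperature and water vapor.

```python
import numpy as np

# Hypothetical one-way differential O2 optical depth per hPa of surface
# pressure (illustrative value only; a real coefficient would come from a
# line-by-line absorption model for the chosen frequency pair).
K_DIFF = 2.0e-3

def surface_pressure(p_rx_ref, p_rx_abs):
    """Estimate surface pressure (hPa) from surface-return powers (linear
    units) at a reference channel and an O2-absorbed channel.

    The two-way power ratio gives the differential optical depth tau, which
    in this simplified model is tau = K_DIFF * pressure.
    """
    tau = 0.5 * np.log(p_rx_ref / p_rx_abs)  # one-way differential optical depth
    return tau / K_DIFF

# Round trip: simulate returns for a 1013 hPa surface and recover it.
p_true = 1013.0
ref_return = 1.0
abs_return = ref_return * np.exp(-2.0 * K_DIFF * p_true)
print(round(surface_pressure(ref_return, abs_return), 1))  # 1013.0
```

The round trip only demonstrates algebraic consistency of the toy model; the ~1 mb uncertainty quoted above comes from full instrument and geophysical noise simulations, not from this idealized relation.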
Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However, these
conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At
NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office
and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines
image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This
system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function
of the system is to enhance and fuse the sensor data in order to increase the information content and quality
of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For
image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well at
improving the low-contrast imagery typically seen during poor visibility conditions. In general,
real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a
single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In
this paper we give an overview of the EVS and its performance requirements for real-time enhancement and
fusion and we discuss our current real-time Retinex implementations on DSPs.
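As background to the enhancement step discussed above, a single-scale Retinex can be written as the log of the image minus the log of a Gaussian-blurred copy, which compresses illumination and boosts local contrast. The sketch below is a minimal NumPy illustration of that general formulation under assumed parameters; it is not the LaRC patented implementation, nor the real-time DSP version.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # 1-D Gaussian kernel, normalized to unit sum
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    # Separable Gaussian blur with edge padding (assumed boundary handling)
    r = int(3 * sigma)
    k = gaussian_kernel(sigma, r)
    padded = np.pad(img, r, mode="edge")
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, tmp)

def single_scale_retinex(img, sigma=15.0):
    # R = log(I) - log(G * I): removes slowly varying illumination,
    # leaving local reflectance detail
    eps = 1e-6  # avoids log(0) in dark pixels
    return np.log(img + eps) - np.log(gaussian_blur(img, sigma) + eps)
```

A uniform image yields an all-zero Retinex output (there is no local detail to enhance), while a shaded scene has its large-scale illumination gradient suppressed; the multi-scale variants used in practice combine several such outputs at different sigmas.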
The 1997 Final Report of the 'White House Commission on Aviation Safety and Security' challenged industry and government to reduce aviation accident rates by a factor of five within 10 years. In the report, the commission encouraged NASA, the FAA and others 'to expand their cooperative efforts in aviation safety research and development'. As a result of this publication, NASA has since undertaken a number of initiatives aimed at meeting the stated goal. Among these, the NASA Aviation Safety Program was initiated to encourage and assist in the development of technologies for the improvement of aviation safety. Among the technologies being considered are certain sensor technologies that may enable commercial and general aviation pilots to 'see to land' at night or in poor visibility conditions. Infrared sensors have potential applicability in this field, and this paper describes a system, based on such sensors, that is being deployed on the NASA Langley Research Center B757 ARIES research aircraft. The system includes two infrared sensors operating in different spectral bands, and a visible-band color CCD camera for documentation purposes. The sensors are mounted in an aerodynamic package in a forward position on the underside of the aircraft. Support equipment in the aircraft cabin collects and processes all relevant sensor data. Sensor images are displayed in real time on the aircraft's Head Up Display (HUD) or other display devices.