Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. We present an open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS). This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation, and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) instrument is a hyperspectral sounder slated to undergo thermal vacuum testing within a year. The University of Wisconsin–Madison is developing a software suite to test the conversion of raw interferogram images into calibrated high-resolution spectra. The software consists of algorithm components that assemble into a processing pipeline, as well as a testing harness built on a lightweight scripting language. The processing requirements for an imaging FTS are considerable, and they necessitate an understanding of the maximum achievable accuracy as well as exploration of tradeoffs made in the interest of processing efficiency. We present an overview of the design of this testing software.
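The conversion described above follows the general pattern used for Fourier transform spectrometers: an interferogram is Fourier-transformed into a complex raw spectrum, which is then radiometrically calibrated against hot and cold blackbody reference views (the two-point complex calibration approach of Revercomb et al.). The sketch below is illustrative only; the function names and interfaces are hypothetical, not those of the actual GIFTS software.

```python
import numpy as np

def interferogram_to_spectrum(ifg):
    """FFT a mean-removed interferogram into a complex raw spectrum.

    Real instruments also apply phase correction, apodization, and
    detector nonlinearity handling, omitted here for brevity.
    """
    ifg = np.asarray(ifg, dtype=float)
    return np.fft.rfft(ifg - ifg.mean())

def calibrate(raw_scene, raw_hot, raw_cold, planck_hot, planck_cold):
    """Two-point complex radiometric calibration.

    raw_*      : complex raw spectra from scene, hot, and cold views
    planck_*   : known Planck radiances of the hot and cold blackbodies
    Returns the calibrated scene radiance (real part after calibration).
    """
    gain = (planck_hot - planck_cold) / (raw_hot - raw_cold)
    return ((raw_scene - raw_cold) * gain + planck_cold).real
```

A scene raw spectrum lying halfway between the cold and hot reference spectra calibrates to a radiance halfway between the two blackbody radiances, which gives a simple sanity check on the gain computation.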
Future meteorological sounding instrumentation for aircraft and satellite platforms will include hyperspectral imaging infrared spectrometers with high time and space resolution, capable of producing terabytes of raw data per day. In tandem with the development of the instruments themselves, corresponding software must be designed for timely, efficient, and accurate processing of the raw data produced. Design candidates for such a software architecture must respond to use cases including deployment in large-scale distributed production environments with stringent reliability specifications; phasing of research algorithms through testing and validation into production use; marshalling of data product views to metadata-aware analysis and archival systems; maintenance of software supporting multiple similar instrument systems over the course of decades; and, most importantly, delivery of fully annotated datasets to end-users with real-time latencies. Consistent techniques for specifying and propagating metadata, for both algorithm software and data content, are of paramount concern when manipulating large quantities of data over long stretches of time. Further, the long-term maintainability and cost-effectiveness of the system can be assured by improving the reusability of both systems software and science software: defining well-specified interfaces for software components and implementing automated mechanisms for integration and testing. We illustrate current design work, avenues of research, and lessons learned on a software component architecture and corresponding development practices addressing the aforementioned concerns.
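One way to realize well-specified component interfaces with consistent metadata propagation is to pair each data payload with provenance metadata that every pipeline stage extends. The following sketch is a minimal illustration of that pattern under assumed names (`Product`, `make_component`, `run_pipeline` are hypothetical, not part of the architecture described here):

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    """A data payload paired with metadata that accumulates through the pipeline."""
    data: object
    metadata: dict = field(default_factory=dict)

def make_component(name, fn):
    """Wrap an algorithm function as a pipeline component that records provenance."""
    def component(product):
        meta = dict(product.metadata)
        # Build a new history list so stages never share mutable state.
        meta["history"] = meta.get("history", []) + [name]
        return Product(fn(product.data), meta)
    return component

def run_pipeline(components, product):
    """Apply components in order; each stage consumes the previous stage's output."""
    for component in components:
        product = component(product)
    return product
```

Because each component is a plain function from `Product` to `Product`, research algorithms can be swapped for production implementations, or individual stages tested in isolation, without touching the rest of the pipeline.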