The GOES-R flight project has developed the Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) to perform independent INR evaluations of the optical instruments on the GOES-R series spacecraft. In this paper, we document the development of navigation (NAV) evaluation capabilities within IPATS for the Geostationary Lightning Mapper (GLM). We also discuss the post-processing quality filtering developed for GLM NAV, and present example results for several GLM datasets. Initial results suggest that GOES-16 GLM is compliant with navigation requirements.
Environmental Data Records (EDR) from the Visible Infrared Imaging Radiometer Suite (VIIRS) require Reflective Solar Band (RSB) calibration errors of less than 0.1%. Throughout the VIIRS mission, the overall instrument calibrated response scale factor (F factor) has been calculated by a manual process that uses data between one and two weeks old by the time a new calibration Look Up Table (LUT) is put into operation. This one- to two-week lag routinely adds more than 0.1% calibration error. In this paper, we discuss trending the solar diffuser degradation (H factor), a key component of the F factor; improving H factor accuracy with improved bidirectional reflectance distribution function (BRDF) and attenuation screen LUTs; trending the F factor; and how RSB Automated Calibration (RSBAutoCal) will eliminate the lag and its look-ahead extrapolation error.
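The lag error described above can be made concrete with a small sketch. The model and degradation rate here are purely illustrative (a simple exponential decay, not a measured VIIRS value): if the diffuser response decays at rate r per day, applying an H factor estimated `lag` days earlier biases the calibration by roughly exp(r·lag) − 1, which grows with the lag.

```python
import math

def h_factor(t_days, decay_per_day=1e-4):
    """Hypothetical solar-diffuser degradation model (H factor):
    slow exponential decay of diffuser response with time.
    The rate is illustrative, not a measured VIIRS value."""
    return math.exp(-decay_per_day * t_days)

def lagged_error(t_days, lag_days):
    """Relative calibration error from using an H factor estimated
    `lag_days` earlier instead of the current value."""
    return abs(h_factor(t_days - lag_days) / h_factor(t_days) - 1.0)

# At this illustrative rate, a one-week-old LUT carries a ~0.07%
# bias and a two-week-old LUT ~0.14%, before any extrapolation
# error is added on top -- consistent in spirit with the >0.1%
# error the manual process incurs.
err_1wk = lagged_error(365.0, 7.0)
err_2wk = lagged_error(365.0, 14.0)
```

An automated process such as RSBAutoCal removes this term by shrinking the lag toward zero, at which point exp(r·lag) − 1 vanishes.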
The VIIRS radiometric calibration approach relies on views of space above the earth limb to estimate a “zero offset” which is subtracted from the other instrument views (earth, solar diffuser, on-board blackbody). This zero offset estimation is compromised when the Moon lies within the space view. The current calibration approach uses a conservative method to determine when the space view is contaminated, or potentially contaminated, by the Moon. We outline a new approach to detecting lunar contamination and to estimating the zero offset for contaminated scans. Our approach offers the potential to greatly reduce the number of scans classified as lunar contaminated, and thus of lower quality due to the alternate calibration process used in the current operational approach. Such an approach could therefore increase the number of nominal, high-quality VIIRS scans available for science analyses.
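The two ingredients of such a scheme can be sketched as follows. This is not the operational algorithm: the 2-degree exclusion cone, the use of a simple mean over space-view samples, and the linear interpolation across contaminated scans are all assumptions chosen only to illustrate flagging contaminated scans and estimating their zero offsets from clean neighbors.

```python
def moon_in_space_view(moon_angle_deg, exclusion_deg=2.0):
    """Flag a scan's space view as lunar-contaminated when the Moon
    lies within an exclusion cone about the space-view boresight.
    The 2-degree cone is a placeholder, not an operational threshold."""
    return moon_angle_deg < exclusion_deg

def zero_offsets(space_view_counts, moon_angles_deg):
    """Per-scan zero offsets: mean of space-view samples for clean
    scans; for contaminated scans, linearly interpolate between the
    nearest clean neighbors (a sketch, not the flight algorithm)."""
    n = len(space_view_counts)
    clean = {i for i in range(n)
             if not moon_in_space_view(moon_angles_deg[i])}
    raw = [sum(s) / len(s) for s in space_view_counts]
    offsets = list(raw)
    for i in range(n):
        if i in clean:
            continue
        lo = max((j for j in clean if j < i), default=None)
        hi = min((j for j in clean if j > i), default=None)
        if lo is not None and hi is not None:
            w = (i - lo) / (hi - lo)
            offsets[i] = (1 - w) * raw[lo] + w * raw[hi]
        elif lo is not None:
            offsets[i] = raw[lo]   # no clean scan after: hold last value
        elif hi is not None:
            offsets[i] = raw[hi]   # no clean scan before: use next value
    return offsets
```

Because the offset drifts slowly relative to the scan rate, interpolating across a short run of contaminated scans can yield an offset close to the true value, rather than discarding those scans outright.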
The Suomi National Polar-orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) employs a large number of temperature and voltage sensors (telemetry points) to monitor instrument health and performance. We have collected data and built tools to study trends in telemetry and calibration parameters. The telemetry points are organized into groups based on location and function; examples include the telescope motor, focal plane array (FPA), scan cavity bulkhead, radiators, solar diffuser, and Solar Diffuser Stability Monitor (SDSM). We have performed daily monitoring and long-term trending studies. Daily monitoring processes are automated, with alarms built into the software to indicate when pre-defined limits are exceeded. Long-term trending studies focus on instrument performance and on the sensitivities of Sensor Data Record (SDR) products and calibration look-up tables (LUTs) to instrument temperature and voltage variations. VIIRS uses a DC Restore (DCR) process to periodically correct the analog offsets of each detector of each spectral band, ensuring that the FPA output signals remain within the dynamic range of the Analog-to-Digital Converter (ADC). The offset values are updated based on observations of the On-Board Calibrator Blackbody source. We have performed a long-term trend study of DCR offsets and calibration parameters to explore connections between the DCR offsets and the onboard calibrators. The study also shows how the instrument and calibration parameters respond to the VIIRS Petulant Mode, spacecraft (SC) anomalies, and flight software (FSW) updates. We have also shown that trending studies of telemetry and calibration parameters may help to improve the instrument calibration processes and SDR Quality Flags.
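The DCR logic described above can be sketched as a simple feedback step. The target level, deadband, and step size below are illustrative placeholders, not VIIRS flight values; the sketch only shows the idea of nudging a detector's analog offset so its blackbody-view output stays inside the ADC dynamic range.

```python
def dcr_update(bb_mean_counts, offset_dn, target=1638, deadband=200,
               step=50, adc_max=4095):
    """One DC Restore (DCR) iteration, sketched: if the mean
    blackbody-view count falls outside a deadband around the target
    level, step the analog offset to pull it back toward the middle
    of the ADC range. All numeric parameters are illustrative."""
    if bb_mean_counts < target - deadband:
        offset_dn += step   # signal too low: raise the offset
    elif bb_mean_counts > target + deadband:
        offset_dn -= step   # signal too high, risking saturation
    return max(0, min(adc_max, offset_dn))
```

Trending the sequence of offset values produced by such updates is what connects the DCR history to the onboard calibrator behavior studied in the paper.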
Proc. SPIE 8044, Sensors and Systems for Space Applications IV
KEYWORDS: Radar, Infrared search and track, Sensors, Satellites, Error analysis, Monte Carlo methods, Missiles, Microwave radiation, Single photon emission computed tomography, Received signal strength
A potentially high payoff for the ballistic missile defense system (BMDS) is the ability to fuse the information gathered by various sensor systems. In particular, it may be valuable in the future to fuse measurements made by ground-based radars with passive measurements obtained from satellite-based EO/IR sensors. This task can be challenging in a multi-target environment, given the widely differing resolution between an active ground-based radar and a sensor observing at long range from a satellite platform. Additionally, each sensor system may have a residual pointing bias that has not been calibrated out. The problem is further compounded by the possibility that an EO/IR sensor may not see exactly the same set of targets as a microwave radar. To better understand the problems involved in fusing metric information from EO/IR satellite measurements with active microwave radar measurements, we have undertaken a study of this data fusion issue and of the associated data processing techniques. To carry out this analysis, we used high-fidelity simulations to model the radar observations of a missile target and the observations of the same simulated target as gathered by a constellation of satellites. In the paper, we discuss the improvements seen in our tests when fusing the state vectors, along with the improvements in sensor bias estimation. The limitations in performance due to the differing phenomenology of IR and microwave radar are discussed as well.
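The core benefit of fusing state vectors from two sensors can be illustrated with a minimal sketch (scalar case, independent Gaussian errors; the abstract's actual processing is far richer, handling biases, association, and full state vectors): a minimum-variance combination weights each estimate by its inverse variance, and the fused variance is always smaller than either input.

```python
def fuse(x1, var1, x2, var2):
    """Minimum-variance fusion of two independent estimates of the
    same quantity (e.g., one radar-derived and one EO/IR-derived
    state component), each weighted by its inverse variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x_fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)          # always < min(var1, var2)
    return x_fused, var_fused

# A precise EO/IR angle-derived estimate pulls the fused value
# toward itself while shrinking the overall uncertainty.
x, v = fuse(10.0, 4.0, 12.0, 1.0)
```

Uncompensated pointing biases violate the zero-mean error assumption behind this weighting, which is why joint bias estimation is treated alongside state-vector fusion in the paper.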