KEYWORDS: Long wavelength infrared, Signal to noise ratio, Hyperspectral imaging, Data acquisition, Atmospheric physics, Black bodies, Image retrieval, Image classification, Library classification systems, Temperature metrology
ASSET is based on physical first principles and was developed using synthetic data. The method treats each pixel independently, assumes homogeneous, isothermal pixels, and requires the following inputs: 1) Hyperspectral LWIR radiance imagery, 2) Atmospheric parameters (downwelling irradiance, upwelling radiance, and transmissivity), and 3) A library of material emissivities. For each pixel, the method selects the most appropriate material from the emissivity library, computes the pixel temperature under the pure-pixel assumption, and then uses that temperature to determine the emissivity. Note that the computed emissivity may differ from that of the selected library material due to a variety of factors such as noise, mixed pixels, natural spectral variability, and inadequate atmospheric compensation.
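As an illustration of the retrieval just described, the following minimal Python sketch pairs a standard LWIR radiative transfer model, L_obs = tau * [eps * B(wl, T) + (1 - eps) * E_down / pi] + L_up, with a brute-force search over library materials and a temperature grid. The function names, the grid search, and this particular forward model are illustrative assumptions, not ASSET's actual implementation.

```python
import numpy as np

# Planck spectral radiance B(wl, T) with wavelength in micrometers,
# returning W / (m^2 sr um).
C1 = 1.191042e8    # 2*h*c^2 in W um^4 / (m^2 sr)
C2 = 1.4387752e4   # h*c/k in um K

def planck(wl_um, T):
    return C1 / (wl_um**5 * (np.exp(C2 / (wl_um * T)) - 1.0))

def forward(wl, emis, T, tau, L_up, E_down):
    """At-aperture radiance for a homogeneous, isothermal pixel:
    surface emission plus reflected downwelling, attenuated by the
    atmosphere, plus upwelling path radiance."""
    L_surf = emis * planck(wl, T) + (1.0 - emis) * E_down / np.pi
    return tau * L_surf + L_up

def retrieve(wl, L_obs, tau, L_up, E_down, library, T_grid):
    """For each library emissivity, find the temperature that best
    reproduces L_obs; keep the best-fitting (material, T) pair."""
    best_name, best_T, best_err = None, None, np.inf
    for name, emis in library.items():
        errs = [np.sum((forward(wl, emis, T, tau, L_up, E_down) - L_obs)**2)
                for T in T_grid]
        i = int(np.argmin(errs))
        if errs[i] < best_err:
            best_name, best_T, best_err = name, T_grid[i], errs[i]
    # Invert the forward model at the retrieved temperature; the result
    # may differ from the library spectrum (noise, mixing, atmosphere).
    L_ground = (L_obs - L_up) / tau
    emis_ret = (L_ground - E_down / np.pi) / (planck(wl, best_T) - E_down / np.pi)
    return best_name, best_T, emis_ret
```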
The synthetic data used to develop ASSET were constructed by computing the thermally emitted radiances of a set of materials with specified emissivities at a range of temperatures. A given set of atmospheric parameters was then applied to the radiances to obtain at-aperture radiance, and random additive Gaussian noise was applied to the data. ASSET was run using the synthetic data as well as additional materials. The initial results from ASSET are promising. With a signal-to-noise ratio (SNR) of 500, the material was correctly classified 100% of the time; the mean absolute temperature error was 0.02 K with a standard deviation of 0.02 K, and the maximum absolute temperature error was 0.12 K. With an SNR of 300, the material was correctly classified more than 99% of the time; the mean absolute temperature error was 0.04 K with a standard deviation of 0.03 K, and the maximum absolute temperature error was 1.07 K.
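A synthetic data set of the kind described above can be sketched by reusing forward() from the previous snippet. The interpretation of SNR as mean band radiance divided by the noise standard deviation is an assumption; the abstract does not define it.

```python
rng = np.random.default_rng(0)

def make_synthetic(wl, library, temps, tau, L_up, E_down, snr):
    """Noisy at-aperture radiances for every (material, temperature)
    pair, reusing forward() from the sketch above. SNR is taken here
    as mean band radiance over noise sigma -- an assumption, since the
    abstract does not state the exact definition used."""
    cube, truth = [], []
    for name, emis in library.items():
        for T in temps:
            L = forward(wl, emis, T, tau, L_up, E_down)
            sigma = L.mean() / snr
            cube.append(L + rng.normal(0.0, sigma, size=L.shape))
            truth.append((name, T))
    return np.asarray(cube), truth
```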
We present results from a simple synthetic data set as well as results from applying ASSET to more sophisticated synthetic DIRSIG LWIR imagery.
KEYWORDS: Signal to noise ratio, Data compression, Image compression, Interference (communication), Principal component analysis, Filtering (signal processing), Detection and tracking algorithms, Spectroscopy, Hyperspectral imaging, Sensors
Proc. SPIE 4540, Sensors, Systems, and Next-Generation Satellites V
KEYWORDS: Signal to noise ratio, Principal component analysis, Data compression, Image compression, Detection and tracking algorithms, Reflectivity, Reconstruction algorithms, Niobium, Filtering (signal processing), Algorithms
Storage and transmission requirements for hyperspectral data sets are significant. In order to reduce hardware costs, well-designed compression techniques are needed to preserve information content while maximizing compression ratios. Lossless compression techniques maintain data integrity but yield small compression ratios. This paper presents three lossy compression algorithms that use the noise statistics of the data to preserve information content while maximizing compression ratios. The Spectral Compression and Noise Suppression (SCANS) algorithm adapts a noise estimation technique to exploit band-to-band correlation for optimizing linear prediction for data compression. The Adaptive Spectral Image Compression (ASIC) algorithm uses an iterative adaptive linear unmixing compression method, constrained by the noise statistics of the hypercube. By dynamically optimizing the endmembers for each pixel, this method minimizes the number of components required to represent the spectrum of any given pixel, yielding high compression ratios with minimal loss of information content. The Adaptive Principal Components Analysis (APCA) algorithm uses noise statistics to determine the number of significant principal components and selects only those that are required to represent each pixel to within the noise level. We demonstrate the effectiveness of these methods with AVIRIS and HYMAP data sets.
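As a hedged sketch of the APCA idea (not the paper's implementation), the snippet below truncates each pixel's principal-component expansion at the first point where the reconstruction error falls within the estimated noise. The noise-budget criterion and function names are assumptions for illustration.

```python
import numpy as np

def apca_compress(X, noise_sigma):
    """APCA-style sketch: keep, per pixel, the fewest principal-component
    coefficients that reconstruct its spectrum to within the noise.
    X: (n_pixels, n_bands) cube reshaped to a matrix.
    noise_sigma: per-band noise standard deviation estimates."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = PCs
    scores = Xc @ Vt.T                                 # per-pixel coefficients
    budget = float(np.sum(noise_sigma ** 2))           # tolerated squared error
    kept = []
    for c in scores:
        # Residual energy left after truncating to the first k components;
        # the PCs are orthonormal, so dropped-coefficient energy equals
        # the squared reconstruction error.
        tail = np.concatenate([np.cumsum((c ** 2)[::-1])[::-1], [0.0]])
        k = int(np.argmax(tail <= budget))             # smallest k within noise
        kept.append(c[:k])                             # variable-length code
    return mean, Vt, kept

def apca_reconstruct(mean, Vt, kept):
    """Rebuild each pixel from its truncated coefficient vector."""
    return np.vstack([mean + c @ Vt[:len(c)] for c in kept])
```

Because each pixel stores only as many coefficients as its spectrum requires, smooth pixels compress far more aggressively than complex ones, which is the source of the adaptive ratio gain the abstract describes.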
Hyperspectral images are becoming more common and have considerable information content. Analysis tools must keep up with the changing demands and opportunities posed by the new data sets. Traditional tools such as image compression and classification (both supervised and unsupervised) can be improved, and newly developed tools, such as those capable of objective data cube quality evaluation, will enhance the commercial value of the data. This paper discusses several new or improved analysis tools developed for use with hyperspectral images. The algorithm fundamental to many of these tools is the Spectral Similarity Scale (SSS). The SSS is an objective measure of spectral distance that quantifies differences in both magnitude (albedo) and direction (shape), a fundamental improvement in the description of the distance between two spectra. The toolset described in this paper consists of: 1) image quality evaluation, an objective measurement of information content, image complexity, and subtlety; 2) hyperspectral image compression, which facilitates image storage and transmission; and 3) processing-induced spectral change measurement, which objectively quantifies spectral changes caused by image processing such as lossy compression. Using the SSS as the measure of spectral distance improves the performance of both supervised and unsupervised classification algorithms.
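The abstract does not give the SSS formula, but one published formulation combines a root-mean-square magnitude term with a correlation-based shape term. The sketch below follows that form and should be read as an assumption rather than the authors' exact definition.

```python
import numpy as np

def sss(a, b):
    """Spectral Similarity Scale sketch: combines a magnitude term
    (root-mean-square difference) with a shape term (1 - r^2, where
    r is the Pearson correlation). This follows one published
    formulation; the exact normalization here is an assumption."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    d_e = np.sqrt(np.mean((a - b) ** 2))   # magnitude (albedo) difference
    r = np.corrcoef(a, b)[0, 1]            # shape agreement
    return np.sqrt(d_e ** 2 + (1.0 - r ** 2) ** 2)
```

For reflectance spectra in [0, 1] both terms are bounded, so a small SSS indicates agreement in both albedo and shape, which is why substituting it for a shape-only metric can improve both supervised and unsupervised classification.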