Spectral signature coding is an effective means of characterizing spectral features. This paper develops a rather different encoding concept, called progressive signature coding (PSC), which encodes a signature in a hierarchical manner. More specifically, it progressively encodes a spectral signature in multiple stages, each of which captures disjoint spectral information contained in the signature. As a result of this progressive coding, a profile of progressive changes in a spectral signature can be generated for spectral characterization. The proposed idea is very simple and evolves from the pulse code modulation (PCM) commonly used in communications and signal processing. It expands PCM to multistage PCM (MPCM) in the sense that a signature can be decomposed and quantized by PCM progressively in multiple stages for spectral characterization. In doing so, the MPCM generates a priority code for a spectral signature so that the spectral information captured in different stages can be prioritized in accordance with the significance of changes in spectral variation. Such MPCM-based progressive spectral signature coding (MPCM-PSSC) can be useful in applications such as hyperspectral data exploitation, environmental monitoring, and chemical/biological agent detection. Experiments are provided to demonstrate the utility of the MPCM-PSSC in signature discrimination and identification.
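The multistage decomposition described above can be sketched in a few lines (a minimal illustration, assuming a uniform scalar quantizer per stage; not the authors' implementation):

```python
import numpy as np

def mpcm_encode(signature, n_stages=3, n_bits=2):
    """Multistage PCM sketch: each stage PCM-quantizes the residual left by
    the previous stage, so successive stages capture disjoint, increasingly
    fine spectral detail.  Returns the per-stage code words (the priority
    code) and the accumulated reconstruction."""
    residual = np.asarray(signature, dtype=float).copy()
    reconstruction = np.zeros_like(residual)
    levels = 2 ** n_bits
    codes = []
    for _ in range(n_stages):
        lo, hi = residual.min(), residual.max()
        step = (hi - lo) / levels
        if step == 0.0:          # flat residual: nothing left to refine
            step = 1.0
        idx = np.clip(((residual - lo) / step).astype(int), 0, levels - 1)
        stage = lo + (idx + 0.5) * step   # mid-point reconstruction level
        codes.append(idx)                 # this stage's code word
        reconstruction += stage
        residual -= stage                 # pass the residual to next stage
    return codes, reconstruction
```

Each later stage operates on a shrinking residual range, which is what makes the earlier code words carry the most significant spectral variation.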
Independent component analysis (ICA) has shown success in many applications. This paper investigates a new
application of the ICA in endmember extraction and abundance quantification for hyperspectral imagery. An
endmember is generally referred to as an idealized pure signature for a class whose presence in the data is considered to be rare; when it occurs, it may not appear in a large population. In this case, the commonly used principal components analysis (PCA) may not be effective, since endmembers usually contribute very little to data variance. In order to
substantiate our findings, an ICA-based approach, called the ICA-based abundance quantification algorithm (ICA-AQA), is developed. Three novelties result from our proposed ICA-AQA. First, unlike the commonly used least-squares abundance-constrained linear spectral mixture analysis (ACLSMA), which is a second-order statistics-based method, the ICA-AQA is a high-order statistics-based technique. Second, because of its use of statistical independence, it is generally thought that ICA cannot be implemented as a constrained method; the ICA-AQA shows otherwise. Third, in order for the ACLSMA to perform abundance quantification, it requires an algorithm to find image endmembers first, followed by an abundance-constrained algorithm for quantification. As opposed to such a two-stage process, the ICA-AQA can accomplish endmember extraction and abundance quantification simultaneously in a one-shot operation.
Experimental results demonstrate that the ICA-AQA performs at least comparably to abundance-constrained methods.
Principal components analysis (PCA) has been widely used in many applications, particularly data compression. Independent component analysis (ICA) has also been developed for blind source separation, along with many other applications such as channel equalization and speech processing. Recently, it has been shown that the ICA can also be used for hyperspectral data compression. This paper investigates these two transforms in hyperspectral data compression and further evaluates their strengths and weaknesses in applications of target detection, mixed pixel classification, and abundance quantification. In order to take advantage of the strengths of both transforms, a new transform, called the mixed PCA/ICA transform, is developed in this paper. The idea behind the proposed mixed PCA/ICA transform is that it can integrate the different levels of information captured by the PCA and the ICA. In doing so, it combines m principal components (PCs) resulting from the PCA with n independent components (ICs) generated by the ICA to form a new set of (m+n) mixed components used for hyperspectral data compression. The resulting transform is referred to as the mixed (m,n)-PCA/ICA transform. In order to determine the total number of components, p, that need to be generated for the mixed (m,n)-PCA/ICA transform, a recently developed concept, virtual dimensionality (VD), is introduced to estimate p, where p = m + n. If m = p and n = 0, the mixed (m,n)-PCA/ICA transform reduces to the PCA transform. On the other hand, if m = 0 and n = p, the mixed (m,n)-PCA/ICA transform reduces to the ICA. Since various combinations of m and n have different impacts on the performance of the mixed PCA/ICA spectral/spatial compression in applications, experiments based on subpixel detection and mixed pixel quantification are conducted for performance evaluation.
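The construction of the mixed (m,n)-PCA/ICA components can be sketched as follows (a numpy-only illustration; the symmetric tanh-based FastICA below is a minimal stand-in for Hyvarinen and Oja's algorithm, and the VD-based choice of p is omitted):

```python
import numpy as np

def pca_components(X, m):
    """Project mean-removed data onto the m largest-eigenvalue directions."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))  # ascending order
    return Xc @ vecs[:, ::-1][:, :m]                       # top-m PCs

def ica_components(X, n, iters=200, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity (an assumption;
    the paper uses Hyvarinen and Oja's FastICA)."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    white = vecs[:, -n:] / np.sqrt(vals[-n:])   # whitening directions
    Z = Xc @ white                              # whitened data (pixels, n)
    W = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(iters):
        G = np.tanh(Z @ W.T)
        W = (G.T @ Z) / len(Z) - np.diag((1 - G ** 2).mean(axis=0)) @ W
        U, _, Vt = np.linalg.svd(W)             # symmetric decorrelation
        W = U @ Vt
    return Z @ W.T

def mixed_pca_ica(X, m, n):
    """Mixed (m, n)-PCA/ICA transform: stack m PCs and n ICs into
    p = m + n mixed components.  X is (pixels, bands)."""
    parts = []
    if m:
        parts.append(pca_components(X, m))
    if n:
        parts.append(ica_components(X, n))
    return np.hstack(parts)
```

With n = 0 the sketch degenerates to plain PCA, matching the reduction noted in the abstract.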
Hyperspectral remote sensing imagery has developed rapidly in recent years. It collects radiance from the ground over hundreds of channels, which results in hundreds of co-registered images. How to process this huge amount of data is a great challenge, especially when no information about the image scene is available. Under this circumstance, anomaly detection becomes more difficult. Several methods have been devoted to this problem, such as the well-known RX algorithm and high-moment statistics approaches. The RX algorithm can detect all anomalies in a single image, but it cannot discriminate among them. On the other hand, the high-moment statistics approaches use criteria such as skewness and kurtosis to find projection directions along which to detect anomalies. In this paper we propose an effective algorithm for anomaly detection and discrimination extended from the RX algorithm, called the Background Whitened Target Detection Algorithm. It first models the background signature with a Gaussian distribution and applies a whitening process. After this process, the background is distributed as i.i.d. Gaussian in all spectral bands, and the pixels that do not fit this distribution are the anomalies. The Automatic Target Detection and Classification Algorithm (ATDCA) is then applied to search for those distinct spectra automatically and classify them as anomalies. Since the ATDCA can also estimate the abundance fraction of each target resident in a pixel by applying sum-to-one and nonnegativity constraints, the proposed method can also be applied in a constrained fashion. The experimental results show that the proposed method improves on the RX algorithm by discriminating among the anomalies and also outperforms the high-moment approaches in terms of computational complexity.
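The background whitening step can be sketched as follows (a minimal illustration of whitening followed by RX-style scoring; the ATDCA stage and the constrained abundance estimation are omitted):

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Whiten the data with the background mean and covariance; after
    whitening the background is approximately i.i.d. N(0, I) in every band,
    so the RX score is simply the squared norm of the whitened pixel
    (its squared Mahalanobis distance from the background)."""
    X = cube.reshape(-1, cube.shape[-1]).astype(float)  # (pixels, bands)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    Xc = X - mu
    L_inv = np.linalg.inv(np.linalg.cholesky(cov))      # whitening matrix
    Z = Xc @ L_inv.T                                    # whitened pixels
    return (Z ** 2).sum(axis=1).reshape(cube.shape[:-1])
```

Pixels whose score greatly exceeds the chi-squared background level are the candidates handed to the subsequent classification stage.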
Hyperspectral image compression can be performed by either 3-D compression or spectral/spatial compression. It has been demonstrated that, due to high spectral resolution, hyperspectral image compression can be more effective if compression is carried out spectrally and spatially in two separate stages. One commonly used spectral/spatial compression implements principal components analysis (PCA) or a wavelet transform for spectral compression, followed by a 2-D/3-D compression technique for spatial compression. This paper presents another type of spectral/spatial compression technique, which uses Hyvarinen and Oja's fast independent component analysis (FastICA) to perform spectral compression, while JPEG2000 is used for 2-D/3-D spatial compression. In order to determine how many independent components are required, a newly developed concept, virtual dimensionality (VD), is used. Since the VD is determined by a false alarm probability rather than the commonly used signal-to-noise ratio or mean squared error (MSE), our proposed FastICA-based spectral/spatial compression is more effective than PCA-based or wavelet-based spectral/spatial compression in data exploitation.
There is an immediate need for the ability to detect, identify, and quantify chemical and biological agents in water supplies during water point selection, production, storage, and distribution to consumers. Through a U.S. Army sponsored Joint Service Agent Water Monitor (JSAWM) program, based on hand-held assays that exist in a ticket format, we are developing new algorithms for automatic processing of tickets. In previous work, detection of control dots in the tickets was carried out by traditional image segmentation approaches such as Otsu's method and other entropy-based thresholding techniques. In experiments, it was found that these approaches were sensitive to illumination effects in the camera reader. As a result, more robust, object-oriented approaches to detecting the control dots are required. Mathematical morphology is a powerful technique for image analysis that focuses on the size and shape of the objects in the scene. In this work, we describe a novel application of morphological operations to the identification of control dots in hand-held assay ticket imagery. Such images were pre-processed by a light compensation algorithm prior to morphological analysis. The performance of the proposed approach is evaluated using receiver operating characteristic (ROC) analysis.
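A morphological detection step of the kind described above might look like this (a hedged sketch using a disk structuring element and a crude global threshold, not the light-compensated pipeline used in the paper):

```python
import numpy as np
from scipy import ndimage

def detect_control_dots(gray, dot_radius=4, thresh=None):
    """Find dark control dots by size and shape: threshold the image, then
    apply a binary opening with a disk structuring element, which removes
    any blob too small to contain the disk.  Returns the dot count and the
    center of mass of each surviving blob."""
    if thresh is None:
        thresh = gray.mean()        # crude global threshold (an assumption)
    mask = gray < thresh            # dots assumed darker than background
    r = dot_radius
    yy, xx = np.ogrid[-r:r + 1, -r:r + 1]
    disk = (xx ** 2 + yy ** 2) <= r ** 2        # disk structuring element
    opened = ndimage.binary_opening(mask, structure=disk)
    labels, n = ndimage.label(opened)
    centers = ndimage.center_of_mass(opened, labels, range(1, n + 1))
    return n, centers
```

Because the opening is driven by the structuring element's size and shape, isolated dark noise pixels vanish while dot-sized blobs survive, which is the robustness the thresholding-only methods lacked.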
Unsupervised image classification for remotely sensed imagery is very challenging due to the fact that the unknown image background generally varies over a wide range of spectral deviations. Additionally, spectral similarity among subtle small classes also causes tremendous difficulty in classification. This paper investigates three major issues encountered in unsupervised image classification: (1) image background removal, (2) generation of training sample data, and (3) determination of the number of classes to be classified, p. The study of these three issues is conducted via a well-known Airborne Visible/InfraRed Imaging Spectrometer (AVIRIS) image scene, the Indian Pines test site, available online at Purdue University's website. Since the image background varies with different applications, it is generally difficult to perform background removal without prior knowledge. In order for unsupervised classification to be effective, a good set of training data is also necessary, and these training samples must be generated directly from the image data in an unsupervised manner. This paper develops an unsupervised training sample generation algorithm (UTSGA) that can generate a good sample pool of training data for supervised classification. In determining p, a newly developed concept, called virtual dimensionality (VD), is used to estimate p, where a Neyman-Pearson-based eigen-analysis approach developed by Harsanyi, Farrand and Chang, called the noise-whitened HFC (NWHFC) method, is implemented to find the VD used for p. Finally, an unsupervised image classification algorithm can be derived by implementing a supervised classifier in conjunction with the UTSGA algorithm and the NWHFC method.
Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called the Spectral Analysis Manager (SPAM), developed for remotely sensed imagery by Mazer et al. For a given spectral signature, the SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for that particular signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits the SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with the SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called the a posteriori discrimination probability (APDP), is also introduced as a performance measure.
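The SPAM-style thresholding described above can be sketched as follows (a simplified illustration, assuming mean-thresholded amplitude bits followed by sign-of-difference slope bits; the exact SPAM code word layout may differ):

```python
import numpy as np

def spam_code(signature):
    """SPAM-style binary code word sketch: one bit per band, set when the
    band value exceeds the spectral mean, followed by one bit per adjacent
    band pair, set when the spectrum is locally rising."""
    s = np.asarray(signature, dtype=float)
    amplitude_bits = (s > s.mean()).astype(int)   # mean-threshold bits
    slope_bits = (np.diff(s) > 0).astype(int)     # inter-band difference bits
    return np.concatenate([amplitude_bits, slope_bits])

def hamming(a, b):
    """Hamming distance between two code words, used for discrimination."""
    return int(np.sum(a != b))
```

Signature discrimination then reduces to comparing short bit strings, which is what makes the scheme so cheap to implement.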