Leaf maturation, from initiation to senescence, is a phenological event in plants that results from the influence of temperature and water availability on physiological activities during the life cycle. Detection of newly grown leaves (NGL) is therefore useful in diagnosing tree growth, tree stress, and even climatic change. Many important applications can naturally be modeled as a low-rank plus a sparse contribution. This paper develops a new algorithm and application to detect NGL. It first uses a sparse matrix decomposition as a preprocessing step to enhance targets and then applies deep learning to segment the image. The experimental results show that the proposed method can detect targets effectively and reduce the false alarm rate.
Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them can only be carried out in an off-line manner, meaning that they can only be implemented on pre-collected data. Such off-line methods are sometimes unsuitable for time-critical applications, particularly in disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which a specific type of algorithm can process the currently collected data during data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an on-line BS method, called PSP-BS, that integrates a sparse-based BS into the PSP framework. In PSP-BS, BS can be carried out by updating the BS result recursively pixel by pixel, in the same way that a Kalman filter updates data information in a recursive fashion. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be achieved quickly by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in real time when the HSI data are transmitted pixel by pixel.
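As an illustration of the sparse-regression step mentioned above, the following is a minimal batch OMP sketch; the recursive pixel-by-pixel PSP-BS update equations are not reproduced here, and the dictionary/target roles are assumptions made for the example:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily select k columns of the
    dictionary D that best approximate y in the least-squares sense."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        correlations = np.abs(D.T @ residual)
        correlations[support] = -np.inf  # do not reselect a chosen column
        support.append(int(np.argmax(correlations)))
        # Re-fit coefficients on the selected support, update the residual.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return support, coef
```

In a BS setting one would treat band images (or band signatures) as dictionary columns, so the recovered support indexes the selected bands.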
Pesticide residue detection in agricultural crops is a challenging issue, and quantifying pesticide residues present in agricultural produce and fruits is even more difficult. This paper conducts a series of baseline experiments particularly designed for three specific pesticides commonly used in Taiwan. The materials used in the experiments are single leaves of vegetable produce contaminated by various concentrations of pesticides. Two sensors are used to collect data. One is Fourier transform infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, a battery-operated field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating relative power between the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the pesticide residues used. The experimental results show that the GER is a better sensor than the FTIR for pesticide residue quantification.
Anomaly detection finds data samples whose signatures are spectrally distinct from those of their surrounding data samples. Unfortunately, it cannot discriminate the anomalies it detects from one another. To accomplish this task, a way of measuring spectral similarity, such as the spectral angle mapper (SAM) or spectral information divergence (SID), is required to determine whether one detected anomaly differs from another. However, this raises the challenging issue of how to find an appropriate threshold value for this purpose. Interestingly, this issue has not received much attention in the past. This paper investigates anomaly discrimination, which differentiates detected anomalies without using any spectral measure. The idea is to make use of an unsupervised target detection algorithm, the Automatic Target Generation Process (ATGP), coupled with an anomaly detector to distinguish detected anomalies. Experimental results show that the proposed methods are indeed very effective in anomaly discrimination.
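For reference, the two spectral similarity measures named above can be computed as follows; this is the standard textbook formulation, not tied to this paper's experiments:

```python
import numpy as np

def sam(x, y):
    """Spectral angle mapper: angle (radians) between two spectra."""
    cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sid(x, y, eps=1e-12):
    """Spectral information divergence: symmetric KL divergence between
    the spectra after normalizing each to a probability distribution."""
    p = x / (x.sum() + eps)
    q = y / (y.sum() + eps)
    return float(np.sum(p * np.log((p + eps) / (q + eps))
                        + q * np.log((q + eps) / (p + eps))))
```

Both measures are zero for identical (or, for SAM, proportional) spectra, which is exactly why a nonzero threshold must be chosen to declare two anomalies "different".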
Anomaly detection becomes increasingly important in hyperspectral data exploitation because high spectral resolution can uncover many unknown substances that cannot be visualized or known a priori. Unfortunately, in real-world applications with no ground truth available, its effectiveness is generally evaluated by visual inspection, which is the only means of assessing performance qualitatively; in this case, background information provides an important piece of information to help image analysts interpret anomaly detection results. Interestingly, this issue has never been explored in anomaly detection. This paper investigates the effect of background on anomaly detection via various degrees of background suppression. It decomposes anomaly detection into a two-stage process: the first stage is background suppression, which enhances anomaly contrast against the background, followed by a matched filter that increases anomaly detectability by intensity. In order to see background suppression change progressively with the data samples, causal anomaly detection is further developed to show how an anomaly detector performs background suppression sample by sample with sample-varying spectral correlation. Finally, a 3D ROC analysis is used to evaluate the effect of background suppression on anomaly detection.
Endmember variability presents a great challenge in endmember finding, since a true endmember may be contaminated by many unknown factors. This paper develops a pixel purity index (PPI)-based approach to resolving this issue. It is known that endmember candidates must have PPI counts greater than 0. Using this fact, we can start with all data samples whose PPI counts are greater than 0 and cluster them into p endmember classes, where the value of p can be determined by the virtual dimensionality (VD). We further develop an endmember identification algorithm to select true endmembers from these p endmember classes. Accordingly, the proposed technique is a three-stage process: it first uses PPI to produce a set of endmember candidates, then develops a clustering algorithm to group the PPI-generated endmember candidates into p endmember classes, and finally concludes by designing an algorithm to extract true endmembers from the p endmember classes.
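The PPI-count stage underlying the first step can be sketched as follows; this is the standard skewer-based PPI, with the number of skewers chosen purely for illustration:

```python
import numpy as np

def ppi_counts(X, n_skewers=1000, seed=0):
    """Pixel purity index: project all pixels onto random unit vectors
    ('skewers') and count how often each pixel is an extreme point."""
    rng = np.random.default_rng(seed)
    N, L = X.shape
    skewers = rng.normal(size=(n_skewers, L))
    skewers /= np.linalg.norm(skewers, axis=1, keepdims=True)
    proj = X @ skewers.T                     # (N, n_skewers) projections
    # A pixel's count is the number of skewers for which it attains the
    # maximum or minimum projection value.
    counts_max = np.bincount(np.argmax(proj, axis=0), minlength=N)
    counts_min = np.bincount(np.argmin(proj, axis=0), minlength=N)
    return counts_max + counts_min
```

Pixels with counts greater than 0 form the candidate pool to be clustered into the p endmember classes.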
Endmember extraction has recently received considerable interest in hyperspectral imagery. However, several issues in endmember extraction may have been overlooked. First and foremost is the term "endmember extraction" itself. Many algorithms claimed to be endmember extraction algorithms do not actually extract true endmembers but rather find potential endmember candidates, referred to as virtual endmembers (VEs). Secondly, how difficult it is for an algorithm to find VEs is primarily determined by two key factors: endmember variability and endmember discriminability. While the former has been addressed recently in the literature, the latter has not been investigated before. This paper develops a Fisher's ratio approach to finding VEs, using a criterion defined as the ratio of endmember variability to endmember discriminability.
Band selection (BS) has advantages for data dimensionality reduction in satellite communication and data transmission, in the sense that spectral bands can be selected by users at their discretion for data analysis while preserving data fidelity. However, to materialize BS in such practical applications, several issues need to be addressed. One is how many bands are required for BS. Another is how to select appropriate bands. A third is how to take advantage of previously selected bands without re-implementing BS. Finally, and most importantly, how should BS be processed as the number of bands varies? This paper presents a specific application, progressive band processing of anomaly detection, which does not require BS and can be carried out in a progressive fashion with data updated recursively band by band, in the same way that data are processed by a Kalman filter.
Proc. SPIE 8871, Satellite Data Compression, Communications, and Processing IX
KEYWORDS: Signal to noise ratio, Hyperspectral imaging, Minerals, Principal component analysis, Detection and tracking algorithms, Sensors, Image processing, Electroluminescence, Image analysis, Algorithm development
Virtual dimensionality (VD) has received considerable interest, where VD is used to estimate the number of spectrally distinct signatures, denoted by p. Unfortunately, VD provides no specific definition of what a spectrally distinct signature is. As a result, different types of spectrally distinct signatures determine different values of VD; there is no one-value-fits-all for VD. To address this issue, this paper presents a new concept, referred to as anomaly-specified VD (AS-VD), which determines the number of anomalies of interest present in the data. Specifically, two types of anomaly detection algorithms are of particular interest: the sample covariance matrix K-based anomaly detector developed by Reed and Yu, referred to as K-RXD, and the sample correlation matrix R-based RXD, referred to as R-RXD. Since K-RXD is determined only by second-order statistics, whereas R-RXD is specified by statistics of the first two orders (including the sample mean as the first-order statistic), the values determined by K-RXD and R-RXD will differ. Experiments are conducted in comparison with widely used eigen-based approaches.
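The two detectors differ only in whether the sample mean is removed before forming the second-order statistic; a minimal global-version sketch on an N x L pixel matrix (illustrative, not the AS-VD procedure itself):

```python
import numpy as np

def rxd(X, remove_mean=True):
    """RX anomaly detector on an (N pixels x L bands) matrix.
    remove_mean=True  -> K-RXD (sample covariance, 2nd-order statistics only);
    remove_mean=False -> R-RXD (sample correlation, 1st + 2nd order)."""
    X = X.astype(float)
    Z = X - X.mean(axis=0) if remove_mean else X
    M = Z.T @ Z / X.shape[0]           # sample covariance K or correlation R
    M_inv = np.linalg.inv(M)
    # Mahalanobis-type anomaly score for every pixel
    return np.einsum('ij,jk,ik->i', Z, M_inv, Z)
```

Because R retains the sample mean while K does not, the two score maps, and hence the anomaly counts derived from them, generally differ.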
Proc. SPIE 8743, Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XIX
KEYWORDS: Target detection, Hyperspectral imaging, Detection and tracking algorithms, Sensors, Image processing, Data processing, Spatial resolution, Electrical engineering, Communication engineering, Real time processing algorithms
Constrained energy minimization (CEM) has been widely used for subpixel detection. It makes use of the sample correlation matrix R to suppress the background, thus enhancing detection of targets of interest. In many real-world problems, implementing target detection on a timely basis is crucial, especially for moving targets. However, since calculating the sample correlation matrix R requires the complete data set prior to detection, CEM cannot be implemented as a real-time processing algorithm. To resolve this dilemma, the sample correlation matrix R must be replaced with a causal sample correlation matrix formed only by those data samples that have already been visited plus the data sample currently being processed. This causality is a prerequisite to real-time processing, and by virtue of it, designing and developing a real-time version of CEM becomes feasible. This paper presents a progressive CEM (PCEM) in which the causal sample correlation matrix can be updated sample by sample. Accordingly, PCEM allows CEM to be implemented as a causal CEM (C-CEM) as well as a real-time (RT) CEM via a recursive update equation.
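The batch CEM filter underlying PCEM has the well-known closed form w = R⁻¹d / (dᵀR⁻¹d); a minimal sketch of that batch version (the recursive sample-by-sample update of the abstract is omitted):

```python
import numpy as np

def cem(X, d):
    """Constrained energy minimization detector.
    X: (N pixels x L bands) data matrix; d: (L,) desired target signature.
    Returns the CEM filter output for every pixel."""
    X = X.astype(float)
    R = X.T @ X / X.shape[0]          # sample correlation matrix
    Rinv_d = np.linalg.solve(R, d)    # R^{-1} d without an explicit inverse
    w = Rinv_d / (d @ Rinv_d)         # filter constrained so that w^T d = 1
    return X @ w
```

By construction the filter passes the target signature with unit gain (w passes d exactly) while minimizing average output energy, which is what suppresses the background.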
With its high spectral resolution, hyperspectral imaging is capable of uncovering many subtle signal sources that cannot be known a priori or visually inspected. Such signal sources generally appear as anomalies in the data. Due to the high correlation among spectral bands and the sparsity of anomalies, a hyperspectral image can be decomposed into two subspaces: a background subspace specified by a matrix of low rank and an anomaly subspace specified by a sparse matrix of high rank. This paper develops an approach to finding such a low-rank/sparse decomposition to identify the anomaly subspace. The idea is to formulate a convex constrained optimization problem that minimizes the nuclear norm of the background subspace and the ℓ1 norm of the anomaly subspace, subject to the decomposition of the data space into background and anomaly subspaces. By virtue of such a background-anomaly decomposition, the commonly used RX detector can be implemented so that anomalies are separated in the anomaly subspace specified by the sparse matrix. Experimental results demonstrate that the background-anomaly subspace decomposition can indeed improve RXD performance.
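A basic sketch of the nuclear-norm/ℓ1 decomposition via proximal updates follows; this is an illustrative ALM/ADMM-style robust-PCA scheme with common default parameters, not necessarily the paper's exact formulation or solver:

```python
import numpy as np

def soft_threshold(M, tau):
    """Elementwise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svd_threshold(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca(X, lam=None, mu=None, n_iter=200):
    """Decompose X into L (low-rank background) + S (sparse anomalies)."""
    m, n = X.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or m * n / (4.0 * np.abs(X).sum())
    L = np.zeros_like(X); S = np.zeros_like(X); Y = np.zeros_like(X)
    for _ in range(n_iter):
        L = svd_threshold(X - S + Y / mu, 1.0 / mu)     # low-rank step
        S = soft_threshold(X - L + Y / mu, lam / mu)    # sparse step
        Y = Y + mu * (X - L - S)        # dual update enforcing X = L + S
    return L, S
```

Once S is recovered, anomalies can be scored in the sparse component (e.g., by running RXD on it), which is the sense in which the decomposition feeds the RX detector.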
Anomaly detection generally requires real-time processing to find targets on a timely basis. However, for an algorithm to qualify as real-time processing, it can use only the data samples up to the sample currently being visited; no future data samples can be used. Such a property is generally called "causality", which has unfortunately received little interest in the past. Recently, a causal anomaly detector derived from the well-known RX detector, referred to as causal RXD (C-RXD), was developed for this purpose, where the sample covariance matrix K used in RXD is replaced by the sample correlation matrix R(n), which can be updated up to the data sample currently being visited, rn. However, C-RXD is not a real-time processing algorithm, since the inverse R⁻¹(n) is recalculated from all data samples up to rn. To implement C-RXD in real time, R⁻¹(n) must be updated using only the previously calculated R⁻¹(n-1) together with the data sample rn currently being processed. This paper develops a real-time version of C-RXD, called the real-time causal anomaly detector (RT-C-RXD), which is derived from the concept of Kalman filtering via a causal update equation using only the innovation information provided by the pixel currently being processed, without re-processing previous pixels.
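A causal update of this kind can be realized with the Sherman-Morrison identity; the sketch below assumes the usual definition R(n) = (1/n) Σᵢ rᵢrᵢᵀ and is illustrative, not the paper's exact derivation:

```python
import numpy as np

def update_Rinv(Rinv_prev, r, n):
    """Advance the inverse causal correlation matrix from R^{-1}(n-1) to
    R^{-1}(n), where R(n) = ((n-1)/n) R(n-1) + (1/n) r r^T, using the
    Sherman-Morrison identity (no full matrix inversion per sample)."""
    r = np.asarray(r, dtype=float).reshape(-1, 1)
    A_inv = Rinv_prev * n / (n - 1)      # inverse of the scaled old matrix
    Ar = A_inv @ r
    return A_inv - (Ar @ Ar.T) / (n + float(r.T @ Ar))

def causal_rx_score(Rinv_n, r):
    """Causal RX score of the current pixel: r^T R^{-1}(n) r."""
    r = np.asarray(r, dtype=float)
    return float(r @ Rinv_n @ r)
```

Each new pixel costs only a rank-one correction of the stored inverse, which is what makes pixel-by-pixel real-time operation feasible.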