Quantum-dot infrared photodetectors (QDIPs) are emerging as a promising technology for midwave- and longwave-infrared remote sensing and spectral imaging. One of the key advantages of QDIPs is their bias-dependent spectral response, which arises from the asymmetric bandstructure of the dot-in-a-well (DWELL) configuration. The photocurrents of a single QDIP driven at different operational biases can therefore be viewed as the outputs of different bands. It has been shown that this property, combined with post-processing strategies applied to the outputs of a single sensor operated at different biases, can be used to perform adaptive spectral tuning and matched filtering. However, unlike traditional sensors, the bands of a QDIP exhibit significant spectral overlap, an attribute that calls for the development of novel methods for feature selection. The presence of detector noise further complicates such feature selection. In this paper, the theoretical foundations for discriminant analysis based on spectrally adaptive feature selection are developed and applied to data obtained from QDIP sensors in the presence of noise. The approach is based on a generalized canonical-correlation-analysis framework used in conjunction with an optimization criterion for the selection of feature subspaces. The criterion ranks linear combinations of the overlapping bands by the energy norm (a generalized Euclidean norm) between the class centers and their reconstructions in the space spanned by the sensor bands, and selects the combinations that minimize it. Experiments using ASTER-based synthetic QDIP data illustrate the performance of rock-type Bayesian classification under the proposed feature-selection method.
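The subset-ranking criterion described above can be sketched numerically. The following is a minimal numpy illustration, not the authors' implementation: it scores a subset of (possibly overlapping) bands by the squared Euclidean (energy) norm between the class-mean spectra and their least-squares reconstructions in the span of the selected band responses, then ranks all subsets of a given size. All function and variable names are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def reconstruction_residual(class_means, band_responses, subset):
    """Energy-norm residual between class-mean spectra and their
    least-squares reconstructions in the span of the chosen bands.

    class_means    : (n_classes, n_wavelengths) mean spectrum per class
    band_responses : (n_bands, n_wavelengths) sensor spectral responses
    subset         : indices of the bands (biases) under evaluation
    """
    R = band_responses[subset]                # rows span the sensor subspace
    # Orthogonal projector onto span of R's rows, via the Gram matrix.
    P = R.T @ np.linalg.pinv(R @ R.T) @ R
    resid = class_means - class_means @ P     # reconstruction error
    return np.sum(resid ** 2)                 # squared (energy) norm

def rank_band_subsets(class_means, band_responses, k):
    """Exhaustively rank all k-band subsets; smaller residual is better."""
    scores = {s: reconstruction_residual(class_means, band_responses, list(s))
              for s in combinations(range(band_responses.shape[0]), k)}
    return sorted(scores.items(), key=lambda kv: kv[1])
```

In a real QDIP setting each "band" row would be the measured spectral response at one operational bias, and the noise covariance would enter the norm; this sketch uses the plain Euclidean case.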
Most traditional spectral sensors have spectrally adjacent bands with little overlap. This overlap is usually ignored in image processing because band-to-band correlation due to oversampling of the scene is almost always dominant. A newly proposed class of adaptive spectral sensors based on bias-tunable quantum-dot infrared photodetectors (QDIPs) is different in that its bands overlap significantly. The influence of these overlaps on image-processing results cannot be ignored for such sensors. To facilitate their analysis, a generalized geometry-based model is provided here for spectral sensors with arbitrary spectral responses. It starts from a mathematical description of the interaction between the sensor and the scene radiation reaching it. In this model, the spectral responses of a sensor define a sensor space, and the spectral sensing process is shown to be a projection of the scene spectrum onto that space. The projected spectrum, which can be calculated from the output photocurrents and the sensor's spectral responses, is the least-squares reconstruction of the scene spectrum. With this interpretation of the data, the influence of band overlap can be removed. Band overlap also introduces correlation between the noise of different bands; this correlation is analyzed as well.
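The projection interpretation can be illustrated with a short numpy sketch (an assumption-laden illustration, not the paper's code). With photocurrents i = R s + n, where R stacks the band spectral responses, the least-squares reconstruction of the scene spectrum s goes through the Gram matrix R Rᵀ; its off-diagonal entries are exactly the band overlaps, and the same matrix shows how those overlaps correlate the noise in the reconstructed coefficients.

```python
import numpy as np

def project_scene(photocurrents, band_responses):
    """Least-squares reconstruction of the scene spectrum from the
    photocurrents of (possibly overlapping) bands.

    photocurrents  : (n_bands,) measured outputs, i = R @ s + noise
    band_responses : (n_bands, n_wavelengths) matrix R of responses
    Returns the projection of the scene spectrum onto the sensor space.
    """
    R = band_responses
    G = R @ R.T                       # Gram matrix; off-diagonals = overlap
    coeffs = np.linalg.solve(G, photocurrents)
    return R.T @ coeffs               # reconstruction in the sensor space

def reconstruction_noise_cov(band_responses, sigma2=1.0):
    """Covariance of the coefficient noise G^{-1} n for i.i.d. band noise
    of variance sigma2: Cov = sigma2 * G^{-2}, which is non-diagonal
    (i.e., correlated) whenever the bands overlap."""
    G_inv = np.linalg.inv(band_responses @ band_responses.T)
    return sigma2 * G_inv @ G_inv
```

For orthogonal (non-overlapping) bands G is diagonal and the coefficient noise stays uncorrelated; overlap makes G, and hence the noise covariance, non-diagonal.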
A new class of spectrally adaptive infrared detectors has been
reported recently whose spectral response can be altered
electronically by controlling the bias voltage of the
photodetector. Unlike conventional sensors, these new sensors have
``bands'' that have highly correlated spectral responses. The
potential benefit of these sensors is that the number of bands (and
their spectral features) used can be adapted to a specific task. The
drawback is that there might not be enough spectral diversity to
perform detection and classification operations.
In this paper we present a new theory that describes the suitability
of an arbitrary spectral sensor to perform a specific spectral
detection/classification task. This theory is based on the
geometric relationships between the sensor space that describes the
spectral characteristics of the detector and a scene space that
contains the spectra to be observed. We adapt the theory of
canonical correlation analysis to provide a rigorous framework for
assessing the utility of spectral detectors. We also show that this
general theory encompasses traditional band selection methods, but
provides much greater flexibility and a more transparent and
intuitive explanation of the phenomenology.
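A minimal numerical sketch of the canonical-correlation idea (illustrative only; the function name and data layout are assumptions): the canonical correlations, i.e., the cosines of the principal angles between the sensor space and the scene space, quantify how well the sensor's bands can represent the spectra to be observed. Values near 1 indicate scene directions the sensor captures well; values near 0 indicate directions it cannot see.

```python
import numpy as np

def canonical_correlations(sensor_responses, scene_spectra):
    """Canonical correlations between the sensor space (spanned by the
    rows of sensor_responses) and the scene space (spanned by the rows
    of scene_spectra), computed as singular values of the product of
    orthonormal bases -- the cosines of the principal angles."""
    Qs, _ = np.linalg.qr(sensor_responses.T)   # orthonormal sensor basis
    Qc, _ = np.linalg.qr(scene_spectra.T)      # orthonormal scene basis
    sv = np.linalg.svd(Qs.T @ Qc, compute_uv=False)
    return np.clip(sv, 0.0, 1.0)               # guard against round-off
```

A traditional band-selection score falls out as the special case where the sensor rows are narrow, non-overlapping responses; the formulation above handles arbitrary (overlapping) responses identically.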
Quantum-dot infrared photodetectors (QDIPs), based on intersubband transitions in nanoscale self-assembled dots, are perceived as a promising technology for mid-infrared sensing: they are based on a mature GaAs technology, are sensitive to normal-incidence radiation, exhibit a large quantum-confined Stark effect that can be exploited for hyperspectral imaging, and have lower dark currents than their quantum-well counterparts. High-detectivity QDIPs (D* = 1.0×10^11 cm·Hz^1/2/W at 9 microns) have recently been shown to exhibit broad spectral responses (approximately 2-micron FWHM) with a bias-dependent shift in their peak wavelengths. This controllable, bias-dependent spectral diversity, in conjunction with signal-processing strategies, allows us to extend the operation of QDIP sensors to a new modality that achieves: (1) spectral tunability (single- or multi-color) in the range 2-12 microns in the presence of the QDIP's dark current; and (2) multispectral matched filtering in the same range. The spectral tuning is achieved by forming an optimal weighted sum of multiple photocurrent measurements of the object to be probed, one for each bias in a set of prescribed operational biases. For each desired spectral response, the number and values of the prescribed biases and their associated weights are tailored so that the superposition response is as close as possible, in the mean-square-error sense, to the response of a sensor optically tuned to the desired spectrum. The spectral matching is achieved similarly but with a different criterion for selecting the weights and biases: they are selected, in conjunction with orthogonal-subspace-projection principles from hyperspectral classification, to nullify the interfering spectral signatures and maximize the signal-to-noise ratio of the output. This, in turn, optimizes the classification of objects according to their spectral signatures.
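The spectral-tuning step above reduces to a linear least-squares fit. The sketch below (a simplified illustration, not the authors' implementation; it omits the dark-current noise term and any constraint on the number of biases) finds weights whose superposition of bias-dependent responses best matches a desired response in the mean-square-error sense; the same weights applied to the per-bias photocurrents synthesize the output of a sensor tuned to that spectrum.

```python
import numpy as np

def tuning_weights(bias_responses, target_response):
    """Weights w minimizing || target - w @ bias_responses ||^2.

    bias_responses  : (n_biases, n_wavelengths) response at each bias
    target_response : (n_wavelengths,) desired spectral response
    """
    w, *_ = np.linalg.lstsq(bias_responses.T, target_response, rcond=None)
    return w

def synthesized_response(bias_responses, w):
    """Superposition response obtained by weighting the bias responses."""
    return w @ bias_responses
```

In practice the fit is only as good as the span of the available bias responses allows; a regularized or noise-weighted variant would account for the dark-current noise mentioned above.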
Experimental results will be presented to demonstrate the QDIP sensor's capabilities in these new modalities. The effect of dark current noise on the spectral-tuning capability is particularly investigated. Examples of narrowband and wideband multispectral photocurrent synthesis as well as matched filtering are presented.
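The matched-filtering modality described above follows standard orthogonal-subspace-projection (OSP) principles. A generic OSP sketch (not the authors' implementation; signature names are assumptions): project the data onto the complement of the interferers' subspace, annihilating their signatures, then correlate with the target signature.

```python
import numpy as np

def osp_filter(target, interferers):
    """Orthogonal-subspace-projection matched filter.

    target      : (n,) desired spectral signature d
    interferers : (n, k) columns are the undesired signatures U
    Returns the filter vector w = P d, where P projects onto the
    orthogonal complement of the column space of U.
    """
    U = interferers
    P = np.eye(U.shape[0]) - U @ np.linalg.pinv(U)   # annihilates U
    return P @ target

def apply_filter(w, pixel):
    """Filter output for one measured spectrum (or photocurrent vector)."""
    return w @ pixel
```

By construction the filter output is zero for any interfering signature and positive for the target (as long as the target is not contained in the interferers' span), which is what maximizes separability in the classification step.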
Spectrally tunable quantum-dot infrared photodetectors (QDIPs) can be used to approximate multiple spectral responses with the same focal-plane array. Hence, they exhibit the potential for real-time adaptive detection/classification. In the present study, it is shown that the detection/classification operation can be performed at a QDIP-based adaptive focal-plane array (AFPA) by fitting the QDIP responses to the corresponding filtering operators. With a new interpretation of spectral signatures in the sensor space, the best fit can be achieved. Our simulation results show how well QDIPs perform in different regions of the midwave and longwave infrared. The results indicate that the AFPA performance does not match that of the ideal filtering operators, but reliable measurements can still be accomplished.