This paper presents an alternative, computational-intelligence-based paradigm for biological attack detection. Conventional approaches to this difficult problem rely on sensor technologies and analytical modeling. However, the processes that constitute the environmental background, as well as those that occur as the result of an attack, are extremely complex. This phenomenological complexity, in both its physical and biological aspects, is difficult for conventional approaches to overcome. In contrast, the proposed approach centers on automatically learning to discriminate sensor signals that lie in a normal range from those that are likely to represent a biological attack. It is argued that constructing machine learning methods robust enough to perform this task is often more feasible than constructing an adequate model that could form the basis for bioattack detection. The paper discusses machine learning and multisensor information fusion methods in the context of biological attack detection in a subway environment, including the recognition architecture and its components. The applicability of the proposed approach, however, extends well beyond subway bioattack protection to a wide range of CBR defense applications.
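The core idea, learning a normal range from background sensor data and flagging excursions, can be illustrated with a minimal sketch. The data, threshold, and function names below are hypothetical, not from the paper; the abstract's actual recognition architecture involves multiple sensors and fusion components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for background sensor readings (e.g., particle counts).
normal_train = rng.normal(loc=100.0, scale=5.0, size=500)

# "Learn" the normal range from background data alone.
mu, sigma = normal_train.mean(), normal_train.std()

def is_anomalous(signal, threshold=4.0):
    """Flag readings that deviate strongly from the learned background."""
    z = abs(signal - mu) / sigma
    return z > threshold

background_reading = 103.0   # within the learned normal range
attack_like_reading = 180.0  # large excursion, as a release might cause

print(is_anomalous(background_reading))   # False
print(is_anomalous(attack_like_reading))  # True
```

A real detector would replace this one-dimensional threshold with a classifier trained on multisensor feature vectors, but the learn-from-background principle is the same.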
Hyperspectral imaging (HSI) sensors provide imagery with hundreds of spectral bands, typically covering visible/near-infrared (VNIR) and/or shortwave-infrared (SWIR) wavelengths.
This high spectral resolution aids applications such as terrain classification and material identification, but it can also produce imagery that occupies well over 100 MB, which creates problems for storage and transmission. This paper investigates the effects of lossy compression on a representative HSI cube, with background classification serving as an example application. The compression scheme first applies principal component analysis (PCA) spectrally, then discards many of the lower-importance principal-component (PC) images, and then applies JPEG2000 spatial compression to each of the retained PC images. The assessment of compression effects considers both general-purpose distortion measures, such as root-mean-square difference, and statistical tests for deciding whether compression causes significant degradation in classification. Experimental results demonstrate the effectiveness of proper PC-image rate allocation, which enabled compression ratios of 100:1 to 340:1 without producing significant classification differences. Results also indicate that distortion may serve as a predictor of compression-induced changes in application performance.
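The spectral-PCA-plus-truncation stage of the scheme can be sketched as below. All array sizes are illustrative (a real cube has hundreds of bands), and the JPEG2000 coding of each retained PC image is omitted to keep the sketch dependency-free; only the PC-discarding stage and a root-mean-square distortion measure are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hyperspectral cube: 32x32 pixels, 64 spectral bands.
rows, cols, bands = 32, 32, 64
cube = rng.normal(size=(rows, cols, bands))
# Inject correlated structure into the low-order bands so a few PCs dominate.
cube[..., :8] += 5.0 * rng.normal(size=(rows, cols, 1))

# Spectral PCA: treat each pixel's spectrum as one sample vector.
X = cube.reshape(-1, bands)
mean = X.mean(axis=0)
Xc = X - mean
cov = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # reorder to descending importance
eigvecs = eigvecs[:, order]

# Keep only the top-k PC images; discarding the rest is the first lossy
# stage of the scheme. JPEG2000 coding of each kept PC image would follow.
k = 8
pcs = Xc @ eigvecs[:, :k]
recon = (pcs @ eigvecs[:, :k].T + mean).reshape(rows, cols, bands)

# General-purpose distortion measure: root-mean-square difference.
rmse = np.sqrt(np.mean((cube - recon) ** 2))
print(f"bands kept: {k}/{bands}, RMSE: {rmse:.3f}")
```

In the paper's scheme, rate allocation across the retained PC images (how many bits JPEG2000 spends on each) is what drives the reported 100:1 to 340:1 ratios; the truncation above only accounts for the spectral part of the savings.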