Recently, a new image compression algorithm was developed which employs a wavelet transform and a simple binary linear quantization scheme with an embedded coding technique to perform data compaction. This new family of coders, Embedded Zerotree Wavelet (EZW), provides better compression performance than the current JPEG coding standard at low bit rates. Since the EZW coding algorithm emerged, all published coding results for this technique have been for monochrome images. In this paper the author enhances the original coding algorithm to yield a better compression ratio, and extends wavelet-based zerotree coding to color images. Color imagery is often represented by several components, such as RGB, and each component is generally processed separately. Each component could be compressed individually in the same manner as a monochrome image, but this requires a threefold increase in processing time. Most image coding standards instead employ de-correlated components, such as YIQ or Y, C<SUB>B</SUB>, C<SUB>R</SUB>, with subsampling of the 'chroma' components; that coding technique is employed here. Results of the coding, including reconstructed images and coding performance, are presented.
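The component decorrelation and chroma subsampling described above can be sketched as follows. This is a minimal illustration using the standard ITU-R BT.601 RGB-to-YCbCr conversion and 2x2 chroma averaging (4:2:0); the coefficients are the standard ones, not values taken from the paper:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one RGB pixel (0-255 range) to YCbCr using ITU-R BT.601 coefficients."""
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def subsample_420(chroma):
    """4:2:0 subsampling of a chroma plane (2-D list with even dimensions)
    by averaging each 2x2 block into a single sample."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[i][j] + chroma[i][j + 1] +
              chroma[i + 1][j] + chroma[i + 1][j + 1]) / 4.0
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]
```

After this step, the Y plane and the two quarter-size chroma planes can each be wavelet-transformed and zerotree-coded as in the monochrome case.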
In this paper, a novel approach to feature extraction for rotationally invariant object classification is proposed, based directly on a discrete wavelet transform. This form of feature extraction retains informative features while eliminating redundant ones, a critical property when analyzing large, high-dimensional images. Researchers have usually resorted to a data pre-processing step to reduce the size of the feature space prior to classification. The proposed method employs statistical features extracted directly from the wavelet coefficients generated by a three-level subband decomposition system using a set of orthogonal and regular Quadrature Mirror Filters. The algorithm has two desirable properties: (1) it reduces the number of dimensions of the feature space needed to achieve the same classification accuracy as the original space for a given pattern recognition problem; (2) it classifies with low error rates regardless of target orientation. Furthermore, the filters used have performed well in image compression, but they have not previously been applied to target classification, an application demonstrated in this paper. Results of several classification experiments on variously oriented samples of visible-wavelength targets are presented.
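The general idea of extracting compact statistics from wavelet subbands can be sketched as below. This is an illustrative stand-in only: it uses a one-level 2-D Haar decomposition rather than the paper's three-level QMF bank, and `subband_features` is a hypothetical name for the kind of per-subband statistics (mean absolute coefficient, energy) commonly used as features:

```python
def haar2d(img):
    """One level of an orthonormal 2-D Haar decomposition of an even-sized
    2-D list. Returns the (LL, LH, HL, HH) subbands."""
    h, w = len(img), len(img[0])
    LL, LH, HL, HH = [], [], [], []
    for i in range(0, h, 2):
        ll, lh, hl, hh = [], [], [], []
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            ll.append((a + b + c + d) / 2.0)  # approximation
            lh.append((a + b - c - d) / 2.0)  # horizontal detail
            hl.append((a - b + c - d) / 2.0)  # vertical detail
            hh.append((a - b - c + d) / 2.0)  # diagonal detail
        LL.append(ll); LH.append(lh); HL.append(hl); HH.append(hh)
    return LL, LH, HL, HH

def subband_features(band):
    """Per-subband statistics (mean |coefficient|, mean energy) used as a
    low-dimensional feature vector for classification."""
    coeffs = [c for row in band for c in row]
    n = len(coeffs)
    mean_abs = sum(abs(c) for c in coeffs) / n
    energy = sum(c * c for c in coeffs) / n
    return mean_abs, energy
```

Concatenating such statistics from every subband at every decomposition level yields a feature vector whose length is fixed by the number of subbands, independent of image size.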
This paper presents a wavelet-based image coding method achieving high levels of compression. A multi-resolution subband decomposition system is constructed using Quadrature Mirror Filters. Symmetric extension and windowing of the multi-scaled subbands are incorporated to minimize boundary effects. The Embedded Zerotree Wavelet (EZW) coding algorithm is then used as the data compression method. Eliminating the isolated-zero symbol for certain subbands leads to an improved EZW algorithm. Further compression is obtained with an adaptive arithmetic coder. For the aerospace image Refuel, we achieve a PSNR of 26.91 dB at 0.018 bits/pixel, 35.59 dB at 0.149 bits/pixel, and 43.05 dB at 0.892 bits/pixel.
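The PSNR figures quoted above follow the standard definition for 8-bit imagery, which can be computed as in this minimal sketch (the function name and flat-sequence interface are illustrative, not from the paper):

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel
    sequences: 10 * log10(peak^2 / MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float('inf')  # identical images
    return 10.0 * math.log10(peak * peak / mse)
```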
Simulation methods offer a time- and cost-effective approach to the evaluation and testing of guided missiles. These include hardware-in-the-loop as well as all-digital simulations, which provide information about how a particular existing or proposed missile system might perform in hypothetical situations that may not be practically duplicated in reality. This paper describes an all-digital simulation developed using available components. The paper describes the functional flow of the simulation and identifies the information passed between the independent modules. Several applications are described, and results such as intercept miss distances, line-of-sight pointing error statistics, and sensitivity to certain system parameters are demonstrated.
Hardware-in-the-loop (HWIL) testing can be used as an efficient and effective means of analyzing the performance of guided missile systems. Due to the limits of current technologies, components of the simulation are limited in their capability to simulate real-world conditions for certain test articles. One component which is critical in an HWIL system for strategic guided missiles is the scene projection or delivery device. To stimulate imaging IR sensors, this scene projector (SP) typically consists of a pixelized in-band source which can be modulated both spatially and temporally to simulate the radiance scene that would be observed during an actual engagement. The SP is driven by a scene generator which provides scene radiance information to the SP under control of a simulation computer, which determines the field-of-view (FOV) composition based on a simulated engagement. In using such a system, a primary concern is that the SP is able to create a scene which produces the proper response in the observing sensor. Another effect which bears examination is the SP's projection method, such as scanning an in-band source to cover the projection FOV. The detailed interaction between the modulated source and the timing of the sensor's detection, integration, and readout processes may cause unrealistic or unexpected sensor behavior. In order to assess the compatibility of a specific sensor viewing a specific SP, a detailed simulation has been developed by Nichols Research Corporation under the direction of the Guided Interceptor Technology Branch (WL/MNSI) of the USAF Wright Laboratory Armament Directorate. This simulation was designed primarily to address issues related to scene projector usage in the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, Florida. The simulation allows the user to define: the spatial response of the sensor; the spatial properties of the SP (i.e. the radiance distribution arising from a commanded impulse); the illumination timing of the SP, such as scan format, persistence, etc.; and the integration and readout timing of the sensor. Given sampled values of these response functions, and sampled values of the desired radiance scene, the SP simulation computes the detector outputs in the form of a sensed image. This output image can help assess the suitability of using the modeled SP to test the modeled sensor by illustrating potential mismatches. It also provides a means of predicting the performance to be expected from this module of the HWIL simulation for a particular test scenario. This paper derives equations which express the sensor output as a function of the input scene, the spatial and temporal response functions of the sensor and the SP, and the spectral response functions of the sensor and SP. Assumptions which affect the implementation and the generality of application are stated and discussed. Results and conclusions are presented for a specific application which illustrate the utility of the simulation.
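A highly simplified discrete version of this computation might look like the following. This is a toy 1-D sketch under strong assumptions (separable responses, a single spectral band, and a scalar temporal-overlap factor); the actual simulation derives and evaluates the full 2-D spatio-temporal-spectral equations, and all names here are hypothetical:

```python
def sensed_signal(scene, sp_psf, sensor_psf, illum_timing, integ_window):
    """Toy 1-D model of SP-to-sensor coupling: the scene is blurred by the
    projector and sensor point-spread functions, then scaled by the overlap
    of the projector illumination timing with the sensor integration window."""
    def convolve(signal, kernel):
        half = len(kernel) // 2
        out = []
        for i in range(len(signal)):
            acc = 0.0
            for k, w in enumerate(kernel):
                j = i + k - half
                if 0 <= j < len(signal):
                    acc += w * signal[j]
            out.append(acc)
        return out

    blurred = convolve(convolve(scene, sp_psf), sensor_psf)
    # Temporal coupling: fraction of projected energy captured while the
    # sensor is integrating.
    overlap = sum(a * b for a, b in zip(illum_timing, integ_window))
    return [overlap * v for v in blurred]
```

Mismatches of the kind the abstract warns about show up in this framework when the illumination timing and the integration window overlap poorly, or when the cascaded point-spread functions smear the commanded scene beyond what the sensor model predicts.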
This paper describes the capabilities, models, and implementation of the SSW sensor model software, and illustrates its utility in processing computer-generated signatures. Sample images illustrate the results of processing computed images with different components of the SSW sensor model. Synthetic scene modeling and signature generation have become important tools in the development of complex sensor systems for smart weapons. Simulated signatures have proven useful by providing realistic data to support system development, performance prediction, validation, and trade studies for signal processing applications and entire systems. In addition, comparisons between computed signatures and measured data can provide insight into signature phenomenology and modeling. Standard signature prediction codes do not account for effects caused by the sensor. These effects can cause measured signatures to differ significantly from predictions, and may critically affect the performance of applications which use the data. The Strategic Scene Workstation (SSW), developed by Nichols Research Corporation for the USAF Wright Laboratory, Armament Directorate, includes a computer model designed to simulate sensor effects in computed signatures. The SSW sensor model simulates spatial effects, noise, and detector characteristics typical of passive sensors used in strategic applications. This function is necessary to customize predicted signatures, and has been used effectively to enhance the realism and accuracy of simulated signatures for applications including hardware-in-the-loop simulation at the USAF KHILS facility at Eglin AFB, FL.
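The classes of effects the abstract attributes to the sensor model (spatial effects, noise, and detector characteristics) can be illustrated with a toy processing chain like the one below. This is not the SSW implementation; it is a 1-D sketch with hypothetical names, applying a point-spread-function blur, per-detector gain and offset, and additive Gaussian noise:

```python
import random

def apply_sensor_effects(signature, psf, gain=1.0, offset=0.0,
                         noise_sigma=0.0, seed=0):
    """Toy sensor-effects chain applied to a 1-D predicted signature:
    spatial blur (PSF), detector gain/offset, and additive Gaussian noise."""
    rng = random.Random(seed)
    half = len(psf) // 2
    out = []
    for i in range(len(signature)):
        acc = 0.0
        for k, w in enumerate(psf):
            j = i + k - half
            if 0 <= j < len(signature):
                acc += w * signature[j]
        out.append(gain * acc + offset + rng.gauss(0.0, noise_sigma))
    return out
```

Running a clean predicted signature through such a chain shows why measured data can differ significantly from raw signature-code output: the blur redistributes energy across detectors before noise and detector nonuniformities are added.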