Many applications require knowledge of the two-dimensional ground spot profile or its Fourier transform, the transfer function. The two-dimensional ground spot profile is needed to analyze image data, to sharpen images, and to perform data fusion. In the field of imaging spectrometry, the ground spot profiles in each spectral band are needed to analyze the results of spectral unmixing. The two-dimensional ground spot profile and transfer function carry more information about an imaging system than the modulation transfer function (MTF) in two orthogonal directions, for instance, the crosstrack and downtrack directions of a pushbroom or whiskbroom imaging system. This fact is apparent from the Nyquist sampling theorem and the projection-slice theorem (Mersereau and Oppenheim, 1974), which imply that N projections (in this instance, line spread functions) at angular increments of π/N are needed to reconstruct a two-dimensional function over an N by N grid. Therefore, unless the two-dimensional ground spot profile is a separable function in the crosstrack and downtrack directions, these two one-dimensional transfer functions carry less information than the two-dimensional ground spot profile or two-dimensional transfer function. A method of estimating the two-dimensional ground spot profile from flight data is desired because the flight performance of a sensor is difficult to assess from laboratory measurements. A zeroth-order approximation to the ground spot profile can be obtained from images of the same scene collected at sufficiently different spatial scales. The idea is to use the image of the scene collected at low altitude as a reference image, and to determine the blurring of this reference needed to match the high altitude image. The resultant blur function provides only a zeroth-order approximation to the ground spot profile because the low altitude image is itself blurred.
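The separability caveat can be made concrete with a toy numerical example (the profile values below are illustrative assumptions, not HYDICE measurements): two different 2-D profiles can share identical crosstrack and downtrack line spread functions, so the two axis projections alone cannot distinguish them.

```python
import numpy as np

# Two toy 2-D "ground spot profiles" on a 2x2 grid (hypothetical values,
# chosen only to illustrate non-uniqueness). p2 is separable; p1 is not.
p1 = np.array([[0.30, 0.20],
               [0.20, 0.30]])
p2 = np.array([[0.25, 0.25],
               [0.25, 0.25]])

# Line spread functions = projections onto the crosstrack / downtrack axes.
lsf_cross_1, lsf_down_1 = p1.sum(axis=0), p1.sum(axis=1)
lsf_cross_2, lsf_down_2 = p2.sum(axis=0), p2.sum(axis=1)

# Both profiles have identical axis projections ...
assert np.allclose(lsf_cross_1, lsf_cross_2)
assert np.allclose(lsf_down_1, lsf_down_2)
# ... yet the 2-D profiles themselves differ.
assert not np.allclose(p1, p2)
```

This is the discrete analogue of the projection-slice argument: two orthogonal projections pin down a 2-D function only when it is separable.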
If certain conditions are satisfied by the data, this zeroth-order approximation can be improved by correcting for the use of the low altitude image as the reference image. The ground spot profile is defined by the angular distribution of radiation collected by a single detector and the blurring from downtrack motion during the integration period. The ground spot profile is a function of the crosstrack and downtrack coordinates c and d, the sensor height h above the ground, and the distance m the sensor moves along the downtrack direction during the detector integration period. Using v to denote the sensor surface velocity along the downtrack direction and t to denote the detector integration time, m = vt. The contribution to the ground spot profile that originates from angular spreading depends on the geometric parameters of the collection and is represented by g(c/h, d/h). The crosstrack and downtrack coordinates are divided by h because the width of g( ) should scale linearly with the height of the sensor from the ground if the sensor is a pushbroom imager with field-of-view centered at the nadir. The list of contributions to g( ) includes the detector element dimensions, diffraction, optical aberrations, and high frequency pointing jitter. The contribution from downtrack motion does not scale with height and is represented by a rect function, rect(d/m). The rect(x) function equals 1 for |x| < 1/2, and 0 otherwise. The combined spreading from the angular spread and downtrack motion is expressed as a convolution:

p(c, d, h, m) = ∫ g(c/h, ν/h) rect((d − ν)/m) dν    (1)

where p( ) denotes the ground spot profile and ν is the downtrack variable of integration. If the sensor is operated too close to the ground, defocusing will change the shape of the ground spot profile, and our assumption of linear scaling with h would be invalid.
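Equation (1) can be sketched numerically as a discrete convolution along the downtrack axis. The Gaussian model for g( ) and all parameter values below are assumptions for illustration only; they are not HYDICE values.

```python
import numpy as np

# Discrete sketch of Eq. (1): convolve the angular-spread term g(c/h, d/h)
# with a unit-area rect of width m along the downtrack axis.
h, m = 1500.0, 0.8           # sensor height [m], downtrack smear length [m] (assumed)
step = 0.1                   # ground grid spacing [m]
c = np.arange(-3, 3 + step, step)    # crosstrack coordinates
d = np.arange(-3, 3 + step, step)    # downtrack coordinates

# Model the angular spread as a Gaussian whose width scales with h
# (hypothetical angular blur of 0.4 mrad, projected to the ground).
sigma = 0.0004 * h
C, D = np.meshgrid(c, d, indexing="ij")
g = np.exp(-(C**2 + D**2) / (2 * sigma**2))
g /= g.sum() * step**2       # normalize to unit area

# rect((d - nu)/m): unit-area boxcar of width m along the downtrack axis.
rect = (np.abs(d) < m / 2).astype(float)
rect /= rect.sum() * step

# The integral over nu becomes a 1-D convolution of each crosstrack row.
p = np.array([np.convolve(g[i, :], rect, mode="same") * step
              for i in range(len(c))])

# p still integrates to ~1, and the downtrack smear lowers the peak.
assert abs(p.sum() * step**2 - 1.0) < 0.01
assert p.max() < g.max()
```

The convolution widens the profile only along the downtrack direction, which is exactly the asymmetry Eq. (1) encodes: angular spreading scales with h, downtrack smear with m = vt.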
If the image data is corrected to reflectance (Section 2.1), the reflectance samples in a given spectral band are related to the unknown scene reflectance by a convolution with the ground spot profile:

r(c_i, d_j, h, m) = ∫∫ u(ξ, ν) p(c_i − ξ, d_j − ν, h, m) dξ dν + noise(c_i, d_j)    (2)

The scene is denoted by u( ) to emphasize that the scene reflectance is unknown, except for the reflectance samples r(c_i, d_j, h, m) derived from the sensor data. The spatially discrete sample coordinates c_i and d_j are defined in terms of the sample intervals Δc and Δd in the crosstrack and downtrack directions: c_i = iΔc and d_j = jΔd, with i and j running over the pixel counts in the crosstrack and downtrack directions. Our goal is to estimate p(c, d, h, m) from an extensive set of reflectance estimates r(c_i, d_j, h, m) taken at heights h1 and h2, with h1 < h2. The data consist of HYDICE collections over mowed fields and old growth woods from altitudes of 1,500 and 6,000 m. The basic idea outlined above for obtaining a zeroth-order approximation of the ground spot profile can be carried out three different ways; one approach is iterative, the other two, non-iterative. The iterative approach starts with a trial ground spot profile and uses it to blur the imagery collected at low altitude. The blurred imagery is compared to the imagery collected at the higher altitude. Different trial ground spot profiles are tried, and the ground spot profile is estimated by the trial profile that achieves the closest match between the low and high altitude images. The direct approaches apply either in the spatial domain or in the Fourier domain. The spatial domain calculation uses least-squares to estimate parameters of the ground spot profile. The Fourier domain approach estimates the Fourier transform of the ground spot profile as the ratio of the Fourier transform of the high altitude image to the Fourier transform of the low altitude image.
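The Fourier-domain approach can be sketched in a few lines. The simulated blur and the Wiener-style regularization constant eps below are assumptions added for numerical safety in the illustration, not details of the HYDICE processing:

```python
import numpy as np

# Sketch: estimate the transfer function as the (regularized) ratio
# FFT(high-altitude image) / FFT(low-altitude reference image).
rng = np.random.default_rng(0)
low = rng.random((64, 64))            # stand-in for the low-altitude reference

# Simulate the high-altitude image as a circular blur of the reference
# with a hypothetical 3x3 boxcar kernel of unit sum.
kernel = np.zeros((64, 64))
kernel[:3, :3] = 1.0 / 9.0
high = np.real(np.fft.ifft2(np.fft.fft2(low) * np.fft.fft2(kernel)))

L, H = np.fft.fft2(low), np.fft.fft2(high)
eps = 1e-3 * np.abs(L).max()          # guards against near-zero spectral values
T_est = H * np.conj(L) / (np.abs(L) ** 2 + eps ** 2)

# At zero frequency the estimate recovers the kernel's unit gain.
print(round(T_est[0, 0].real, 3))
```

Because the reference image is itself blurred, this ratio is only the zeroth-order estimate discussed above; the regularization also biases the estimate wherever the reference spectrum is weak.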
These approaches to estimating the ground spot profile of HYDICE do not use overpasses of test targets or linear features such as bridges, and only require repeated collections of a scene at low and high altitude. On the other hand, the analysis of aerial imagery to find the two-dimensional ground spot profile involves solving new data processing problems that are not encountered in MTF estimates from images of high contrast linear features such as bridges. For example, it is necessary to cope with any geometric distortions in the imagery. HYDICE imagery contains geometric distortions from several sources:

1. pointing instability of the stabilized platform,
2. wander of the airplane ground track,
3. imperfect alignment of the pushbroom with the ground track,
4. curvature of the spectrometer slit as projected on the earth surface (the spectrometer slit is slightly curved to reduce an optical aberration in the spectrometer known as "smile"), and
5. different perspectives of the surface relief at the two different heights above the surface.

To cope with these distortions, the imagery cannot be treated as a monolithic array, but must be processed in small blocks that are nearly distortion free. The residual distortion internal to these image blocks will introduce errors, but unless there is a systematic bias in the residual distortion, the errors will average out. The HYDICE ground spot profile is a slowly varying function of the field angle and the spectral channel. The dependence on field angle is preserved by processing the images in small blocks, and the dependence on spectral channel is preserved by processing each spectral channel separately.
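Block-wise processing of this kind can be sketched as follows; the 32-pixel block size is a hypothetical choice for illustration, not a value taken from the text:

```python
import numpy as np

def iter_blocks(image, block=32):
    """Yield (row, col, tile) for non-overlapping block x block tiles,
    dropping partial tiles at the image edges."""
    rows, cols = image.shape
    for r in range(0, rows - block + 1, block):
        for c in range(0, cols - block + 1, block):
            yield r, c, image[r:r + block, c:c + block]

# Each tile would be matched between the low- and high-altitude images
# independently, and the per-tile profile estimates averaged so that
# unbiased residual-distortion errors cancel.
img = np.zeros((100, 130))
tiles = list(iter_blocks(img))
print(len(tiles))   # 3 row-bands x 4 col-bands = 12 tiles
```

Processing per tile also preserves the slow dependence of the profile on field angle, since each tile spans only a small range of field angles.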
An optoelectronic neural network based upon the Neocognitron paradigm has been implemented at JPL and successfully demonstrated for automatic target recognition with both focal plane array imagery and range-Doppler radar signatures. A novel feature of this neural network architectural design is the use of a shift-invariant multichannel Fourier optical correlation as a building block for iterative multilayer processing. An innovative bipolar neural weights holographic synthesis technique was utilized to implement both the excitatory and inhibitory neural functions and dramatically increase the network's discrimination capability. To further increase the optoelectronic Neocognitron's self-organizing processing ability, a wavelet preprocessor has been developed for feature extraction (orientation, size, location, etc.). The addition of this wavelet preprocessor enables the Neocognitron to dynamically focus on incoming targets based on their known features, resulting in higher discrimination and a lower false alarm rate. A theoretical analysis of an orientation and scale selective wavelet is provided. A multichannel optoelectronic wavelet processor using an e-beam complex-valued wavelet filter is also presented. Experimental demonstrations of wavelet preprocessing for feature extraction are also provided.
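The orientation and scale selectivity of such a wavelet can be illustrated digitally with a Gabor-type kernel (a standard choice assumed here for the sketch; the hardware realizes a complex-valued wavelet filter optically, and all sizes and wavelengths below are arbitrary):

```python
import numpy as np

def gabor_kernel(size=31, wavelength=8.0, theta=0.0, sigma=4.0):
    """Complex 2-D Gabor kernel: Gaussian envelope times a plane wave
    oriented at angle theta; wavelength sets the scale selectivity."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along theta
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return env * np.exp(1j * 2 * np.pi * xr / wavelength)

# A vertical grating at the matched wavelength responds strongly to the
# 0-rad kernel and only weakly to the 90-degree kernel.
y, x = np.mgrid[0:64, 0:64]
grating = np.cos(2 * np.pi * x / 8.0)
patch = grating[16:47, 16:47]                    # 31x31 patch
resp0 = np.abs(np.sum(patch * gabor_kernel(theta=0.0)))
resp90 = np.abs(np.sum(patch * gabor_kernel(theta=np.pi / 2)))
print(resp0 > 10 * resp90)
```

The same kernel bank, swept over scales (wavelengths) and orientations, yields the orientation/size feature maps the preprocessor uses to focus the Neocognitron on candidate targets.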
An optical neural network based upon the Neocognitron paradigm is introduced. A novel aspect of the architectural design is shift-invariant multichannel Fourier optical correlation within each processing layer. An innovative bipolar neural weights holographic synthesis technique is introduced to implement both the excitatory and inhibitory neural functions. Multilayer processing is achieved by iteratively feeding back the output of the feature correlator to the input spatial light modulator and updating the Fourier filters. By designing the neural net with characteristic features extracted from the target images, successful pattern recognition with intra-class fault tolerance and inter-class discrimination is achieved. A detailed system description is provided. An experimental demonstration of a two-layer neural network for space object discrimination is also presented.
A feature-extraction-based optoelectronic neural network is introduced. The system implementation applies the principle of the neocognitron paradigm first introduced by Fukushima et al. (1983). A multichannel correlator is used as a building block of a generic single layer of the neocognitron for shift-invariant feature correlation. Multilayer processing is achieved by iteratively feeding back the output of the feature correlator to the input spatial light modulator. Successful pattern recognition with intraclass fault tolerance and interclass discrimination is achieved using this optoelectronic neocognitron. A detailed system analysis is described. An experimental demonstration of radar signature processing is also provided.
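The shift-invariant Fourier-plane correlation at the heart of these correlators has a simple digital analogue; the scene, feature, and array sizes below are arbitrary illustrative choices:

```python
import numpy as np

# Digital analogue of a Fourier-plane correlator: multiply the scene
# spectrum by the conjugate feature spectrum and inverse transform.
# The correlation peak moves with the target, so detection is
# independent of the target's position (shift invariance).
rng = np.random.default_rng(1)
feature = rng.random((8, 8))

scene = np.zeros((64, 64))
scene[20:28, 40:48] = feature        # embed the feature at offset (20, 40)

F = np.fft.fft2(feature, s=scene.shape)
corr = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.conj(F)))
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(tuple(int(i) for i in peak))   # correlation peak at (20, 40)
```

In the optical system the multiplication happens physically in the Fourier plane of a lens, and multiple holographic filters allow many feature channels to be correlated in parallel within a single layer.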