Some of the greatest mathematicians and scientists in history have made their most important contributions by unsystematically applying one of four general patterns, or templates, for invention. Here, for the first time to my knowledge, these templates are stated explicitly and illustrated with examples from optics. I call them the Do-Nothing Machine, the Continuous Extension, the Up-Down Paradigm, and the Reversal of Fortune.
The McCutchen transform is the combination of a Fourier transform with a square-root geometrical distortion. For a type of rotational-deblurring problem, it would be desirable to extend this transform in a 'natural' fashion. This talk summarizes some of the rewards and many of the frustrations arising from attempting this extension.
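The transform itself is described above only as a Fourier transform combined with a square-root geometrical distortion. A minimal numerical sketch of that combination, for a one-dimensional radial profile, might look as follows; the function name, sampling grid, and use of a plain FFT are illustrative assumptions, not the formulation used in the talk.

```python
import numpy as np

def mccutchen_transform(f, r_max, n=1024):
    """Sketch of a McCutchen-style transform: apply a square-root
    geometrical distortion to the radial coordinate (r = sqrt(s)),
    then take a discrete Fourier transform of the resampled profile.
    Illustrative only; conventions and normalization are assumed."""
    s = np.linspace(0.0, r_max**2, n)   # distorted coordinate s = r^2
    g = f(np.sqrt(s))                   # resample the profile at r = sqrt(s)
    return np.fft.fft(g)                # 1-D Fourier transform of g

# Example with a Gaussian radial profile f(r) = exp(-r^2):
spectrum = mccutchen_transform(lambda r: np.exp(-r**2), r_max=4.0)
```

Under the substitution s = r^2 a Gaussian profile becomes a simple exponential in s, which is one reason such a distortion can simplify rotationally symmetric problems.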
Nonlinear features that represent higher-order correlations in the input data are considered for improved recognition. These features optimize new performance measures that make no Gaussian or similar assumptions about the data distribution and that are intended for improved discrimination. The new features are produced in closed form and are thus preferable to iterative solutions. An efficient two-step feature-extraction algorithm is presented for the high-dimensional (iconic) input-data case of most interest. The feature generation can be realized as a new neural network with adaptive activation functions. Test results on pose-invariant face recognition are emphasized; results on standard feature inputs for a product-inspection application are briefly noted as a low-dimensional input-data case.
We describe a number of methods for imaging into and through highly scattering media, all based on optical image-processing methods. These methods describe image formation in scattering media in new ways that complement transport theory and other traditional approaches.
A method of cross-correlating wavefronts by means of a second-harmonic-generated hologram (SHG hologram) is considered. In this method, the interference pattern of an object wave and a reference wave is recorded in a nonlinear light-sensitive material through its second-order nonlinearity. The SHG hologram generates a wave that forms the reconstructed image of the object, with the frequency of the reconstructed wave being doubled. An expression describing the electric field of the reconstructed wave is derived. It is suggested that the transforming properties of the SHG hologram be used to construct a network of reconfigurable interconnection lines operating on the principle 'light is controlled by light'. Experiment has confirmed the ability of the SHG hologram to form high-quality images of arbitrary objects. Ways of overcoming the frequency doubling that accompanies each signal transformation are considered. Theory shows that, by using the effect of 'down-conversion', it is possible either to return the frequency of the signal to its initial value or to hold the frequency constant.
The Fourier family comprises a wide variety of mathematical transforms, some of them well established in the image-science community, some lesser known but deserving of more recognition. The goal of this paper is to survey the genealogy of this family and to show some possibly non-obvious applications of each member. Three central premises run through the discussion: (1) There can be no science of imaging without a scientific approach to the evaluation of image quality; (2) Image quality must be defined in terms of the information that is desired from the image and the method of extracting that information; (3) Digital images are discrete data obtained from a continuous object. These considerations will lead us to rely on rather different members of the Fourier family than the ones most often encountered in polite imaging society.
Optical signal processing has its roots in the experiments of Lord Rayleigh, Abbe, and Porter, the first to deal with the spectrum of an image. This path of revolution was followed years later by the extraordinary work of A. W. Lohmann in optical data processing over the last 40 years. The new innovations, and the future possibilities to be opened up in the new millennium, bear his dominant signature. Some recent projects that illuminate the future of the optical-signal-processing field are described in this presentation. The invention of the computer-generated hologram (CGH) was a giant leap for the optical-signal-processing field. Filters and holograms that previously were generated by direct holographic recording were all of a sudden replaced by synthetic functions designed and realized by digital computers. It was the first interface between digital computers and optical systems. This approach led to the design of opto-electronic systems that operate in perfect synergy, with each element utilized for what it does best. Over the years, the field of computer-generated holography has expanded into what is now known as diffractive optical elements. Techniques such as kinoforms, binary optics, and on-axis CGHs have been developed to address the growing application list of such elements. CGHs strongly affected the optical-signal-processing field: for example, various new processing techniques were created and applied to invariant pattern recognition (circular harmonics (CH), synthetic discriminant function (SDF) filters, etc.). The next wave of innovation reached the shores of the optical-data-processing community in the 1980s, when A. W. Lohmann presented optical interconnections as the next challenge of optical data processing. Many configurations were discussed, investigated, and applied to optical processing (the perfect shuffle, the omega net, the crossover, etc.). In the 1990s A. W. Lohmann was a key player in a new revolution in optical processing in which optics was used as a transformation tool. New transformations were invented and realized by optical means, for instance the fractional Fourier transform, the Wigner distribution, and the fractional Hilbert and Hartley transforms. These were applied to various signal-processing tasks and were also used in digital processing. In the new millennium, optics is adapting itself to the binary mode of operation common in computer systems. This trend has become feasible thanks to the impressive progress in opto-electronic interface devices such as spatial light modulators, light sources such as VCSELs, and detectors such as photodiodes. These achievements also permit the operation of opto-electronic systems at extremely high rates. It is evident that in the coming years the optical-data-processing field will continue to grow, develop, and replace additional processing modules in the digital-computation world. Without much doubt, these advances will be accompanied, inspired, and guided by the scientific foundation laid by A. W. Lohmann. In this paper we focus on the optical processing of partially coherent light. This field is especially interesting and relevant since it combines the aspects of data processing with the optical-design skills that ensure its promising industrial future.
Gabor's signal expansion and the Gabor transform are formulated on a non-orthogonal time-frequency lattice instead of on the traditional rectangular lattice. The reason for doing so is that a non-orthogonal sampling geometry may be better adapted to the form of the window functions (in the time-frequency domain) than an orthogonal one: the set of shifted and modulated versions of the usual Gaussian synthesis window, for instance, corresponding to circular contour lines in the time-frequency domain, can be packed more tightly in a hexagonal geometry than in a rectangular one. Oversampling in the Gabor scheme, which is required for the analysis window to have mathematically more attractive properties, then yields better results at a lower oversampling rate. The procedure presented in this paper is based on considering the non-orthogonal lattice as a sub-lattice of a denser orthogonal lattice that is oversampled by a rational factor. In this way, Gabor's signal expansion on a non-orthogonal lattice can be related to the expansion on an orthogonal lattice (restricted, of course, to only those sampling points that belong to the non-orthogonal sub-lattice), and all the techniques that have been derived for rectangular sampling - including an optical means of generating Gabor's expansion coefficients via the Zak transform in the case of integer oversampling - can be used, albeit in slightly modified form.
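The sub-lattice construction described above can be sketched in a few lines: halve the time step of a rectangular lattice, then keep only the points with even index sum, which yields a quincunx (hexagonal-type) geometry. The step sizes and lattice extent below are illustrative assumptions, not the paper's parameters.

```python
# Illustrative sketch of a non-orthogonal (quincunx/hexagonal-type)
# time-frequency lattice viewed as a sub-lattice of a denser
# rectangular lattice.  T, F, M, K are arbitrary example values.
T, F = 1.0, 1.0   # time and frequency steps of the reference lattice
M, K = 8, 8       # lattice extent (number of points per axis)

# Denser rectangular lattice with time step T/2 (oversampled by 2):
dense = [(m * T / 2, k * F) for m in range(M) for k in range(K)]

# Keep only points with even index sum: a quincunx sub-lattice whose
# rows are offset by half a time step, approximating hexagonal packing.
hexagonal = [(m * T / 2, k * F) for m in range(M) for k in range(K)
             if (m + k) % 2 == 0]
```

Relative to the dense lattice the sub-lattice keeps exactly half the points, which is how a rational oversampling factor arises when the reference lattice is itself oversampled.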
Based on Lohmann's seminal work on gratings and zone plates, we present three nonconventional moiré techniques for signal processing. First, we describe an optical setup implementing Eratosthenes' sieve of prime numbers. Second, we discuss the use of moiré tiles for local parallel sensing. Finally, we propose a Babinet technique for image storage. Experimental verifications are discussed.
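The optical setup is only outlined above; as a point of reference, the sieve itself can be phrased in the same spirit as superposed binary "gratings", one per prime period, with the positions left open by every grating being the primes. This is a standard sieve of Eratosthenes, not the authors' optical implementation.

```python
def sieve_by_masks(n):
    """Sieve of Eratosthenes phrased as superposed binary masks:
    each 'grating' of period p blocks the multiples of p, and the
    positions left open by every grating are the primes up to n."""
    open_slots = [True] * (n + 1)
    open_slots[0] = open_slots[1] = False   # 0 and 1 are not prime
    p = 2
    while p * p <= n:
        if open_slots[p]:
            for q in range(p * p, n + 1, p):  # mask with period p
                open_slots[q] = False
        p += 1
    return [i for i, is_open in enumerate(open_slots) if is_open]

primes = sieve_by_masks(30)  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

In the optical analogue, each mask would be a physical grating and the logical AND of the masks is performed by superposition.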
The history of diffractive optics technology is traced from the first hand-drawn computer-generated holograms, created by Adolf Lohmann and Byron Brown, to elements fabricated by electron beam with features smaller than a human hair. The influence of Adolf Lohmann on the field is recounted, as are the major developments in fabrication and computation. In particular, we highlight the influence of technology on design techniques, showing how the availability of computer plotters in the 1960s led to the early encoding techniques. The transition in the 1970s to photolithographic fabrication changed the nature of diffractive design from cell-oriented to point-oriented encoding. At the same time, optimization routines were developed that incorporated these new fabrication constraints. The introduction of electron-beam writing in the fabrication of diffractive optics in the 1980s brought diffractive design in the 1990s full circle, back to techniques that are again cell-oriented. Shrinking features have also changed the applications for diffractive elements. First used primarily as filters for optical correlators, diffractive elements will play a critical role in the telecommunications systems now nearing deployment. However, the most visible impact of Adolf Lohmann's contributions is the pattern generators sold with laser pointers. This history is dedicated to Adolf Lohmann on the occasion of his seventy-fifth birthday.
We discuss a discrete matrix formulation of partially coherent light, in which the mutual intensity functions (autocorrelations) are represented by matrices. Based on this representation, we focus on the problem of obtaining light of a desired mutual intensity from a light source of given mutual intensity, by employing a linear optical system. This problem is quadratic. By singular value decomposition of the mutual intensity matrix, we obtain an equivalent representation which reduces this quadratic problem to a linear one. We then propose the use of fractional Fourier domain filtering circuits to efficiently implement this optical system.
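The reduction from a quadratic to a linear problem can be sketched numerically: a mutual intensity matrix is Hermitian and positive semi-definite, so its singular value decomposition coincides with its eigendecomposition, and a linear system built from the decomposition reproduces the desired matrix. The 4×4 size, the random test matrix, and the specific choice of system below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# A desired mutual intensity matrix J: Hermitian and positive
# semi-definite, like any valid autocorrelation matrix.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
J = A @ A.conj().T

# Singular value decomposition; for a Hermitian PSD matrix this
# coincides with the eigendecomposition, J = U diag(s) U^H.
U, s, _ = np.linalg.svd(J)

# If a linear system T_sys acts on mutually uncorrelated unit-power
# inputs, the output mutual intensity is T_sys T_sys^H.  Choosing
# T_sys = U diag(sqrt(s)) therefore reproduces J exactly: once the
# decomposition is known, the quadratic synthesis problem is linear.
T_sys = U @ np.diag(np.sqrt(s))
```

The columns of U scaled by sqrt(s) play the role of coherent modes: the desired partially coherent field is synthesized as an incoherent superposition of fully coherent fields, each of which the linear optical system can produce directly.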
The connection between the Wigner function and the generalized OTF, and between the ambiguity function and the generalized OTF, is investigated for non-paraxial scalar wavefields. The treatment is based on two-dimensional (2-D) wavefields for simplicity, but can be extended to the three-dimensional case.
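As background for the two connections investigated here, the Wigner function and the ambiguity function of the same field are themselves a double Fourier transform pair, which is the link both share with the OTF. In one common convention (the symbols and sign conventions below are assumptions; the paper's own definitions may differ):

```latex
% W(x,u): Wigner function; A(x',u'): ambiguity function of the same
% 2-D scalar field (one space variable x, one frequency variable u).
A(x', u') = \iint W(x, u)\, e^{-2\pi i\,(u' x - x' u)}\,\mathrm{d}x\,\mathrm{d}u
```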