Optical image processing can be considered an area with important technological potential, especially when combined with increasingly capable optoelectronic devices, such as spatial light modulators or CCD/CMOS cameras. From the educational point of view, it is an effective way of introducing the concept of parallelism in computing, which is inherent in optics. Furthermore, it is an efficient and elegant way to present linear systems theory, common to a large number of science and engineering fields, since the lab experiments provide a direct visualization of linearity, superposition, space-shift invariance, and the direct space and frequency (Fourier) domains.
The usual scheme in optical image processing relies on direct manipulation, in the Fourier plane, of the frequency content of the object: a filter placed in the Fourier plane multiplies the Fourier transform of the object point by point. This is done in Vander Lugt-type correlators. A variation of this scheme is used in the joint-transform correlator, but a Fourier plane is still visualized. A different strategy, proposed at the end of the 1970s, modifies the angular plane-wave spectrum of the object with no need for a Fourier plane. This image processing strategy, known as Bragg processing, takes advantage of the characteristics of the angular response in volume holography [4,5]. In the next section we present two different setups and the possibilities they offer to demonstrate, in a student lab, optical image processing with and without a Fourier plane.
Experimental realization of the two approaches
In Fig. 1(a) we show a modified version of a Vander Lugt converging correlator, as proposed in Ref. , with two arms that allow the Fourier plane and the final plane to be displayed simultaneously. This is possible because a beam splitter is inserted in the light path. Lens L3 projects the Fourier plane onto a screen or a CCD camera, while lens L2 projects the Fourier transform of the Fourier plane onto the CCD camera. In the Fourier plane we place the frequency filter H(u), which multiplies the frequency spectrum F(u) of the object; thus the frequency content G(u) of the final image is given by G(u) = F(u)H(u). In this type of setup we can directly specify, in a physical plane, the transfer function H(u) of the system.
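The point-by-point operation G(u) = F(u)H(u) can be sketched numerically. The following is a toy simulation, not the actual experiment: the grating object and the cutoff frequency are arbitrary illustrative choices.

```python
import numpy as np

def fourier_plane_filter(obj, transfer_function):
    """Multiply the object's spectrum F(u) by H(u) in the (shifted)
    Fourier plane and return the modulus of the filtered image."""
    spectrum = np.fft.fftshift(np.fft.fft2(obj))   # F(u), DC centred
    filtered = spectrum * transfer_function        # G(u) = F(u) H(u)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(filtered)))

# Object: a vertical grating of period N/8 (arbitrary test object).
N = 128
x = np.arange(N)
X, _ = np.meshgrid(x, x)
obj = 0.5 * (1 + np.cos(2 * np.pi * 8 * X / N))

# Filter: a circular low-pass H(u) whose cutoff (0.03 cycles/sample) lies
# below the grating frequency (8/128 = 0.0625), so only DC is transmitted.
u = np.fft.fftshift(np.fft.fftfreq(N))
U, V = np.meshgrid(u, u)
H = (np.hypot(U, V) < 0.03).astype(float)

img = fourier_plane_filter(obj, H)   # nearly uniform: the grating is removed
```

Since the grating's harmonics fall outside the pass band, the filtered image retains only the mean (DC) level of the object.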
In Fig. 1(b) we show a one-lens imaging setup in which we have introduced a volume grating. The volume grating generates a transmitted and a diffracted order, so we obtain two different images at the output plane. Spatial filtering of the object's frequency content is accomplished without a physical Fourier plane, through the modification of the angular plane-wave spectrum of the object by the volume grating. As given by Kogelnik's coupled wave theory, the diffraction efficiency of a volume grating varies with the angle of incidence, and this angular efficiency acts as the transfer function of the system.
Let us consider the angles θo and Ψ, measured with respect to the optical axis of the system (Fig. 1(b)), for the plane-wave spectrum of the input object and for the orientation of the grating, respectively. With a proper substitution we can rewrite the angular response of the grating as frequency transfer functions for the transmitted order, H0(u), and the diffracted order, H1(u), where u = sin θo/λ0. Thus, the frequency contents of the transmitted and diffracted images, G0(u) and G1(u), are given by G0(u) = F(u)H0(u) and G1(u) = F(u)H1(u), respectively. We note that these filtering operations do not depend on the distance between the object and the grating, so very compact processors can be built with the grating in close contact with the object. The lens simply images the filtered object onto the final plane. An interesting situation arises when the grating exhibits maximum diffraction efficiency and is oriented at the Bragg angle, Ψ = θBragg. Then the DC component of the object is redirected into the diffracted order, and H0(u) and H1(u) correspond to a high-pass and a low-pass filter, respectively.
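The angular selectivity behind H0(u) and H1(u) can be sketched with Kogelnik's efficiency formula for a lossless transmission phase grating, η = sin²(√(ν² + ξ²))/(1 + ξ²/ν²), where ν is the coupling strength and ξ the off-Bragg dephasing parameter. The mapping from ξ to the spatial frequency u depends on the grating geometry and is omitted here; the parameter range below is purely illustrative.

```python
import numpy as np

def kogelnik_efficiency(xi, nu=np.pi / 2):
    """Diffraction efficiency of a lossless transmission volume phase
    grating (Kogelnik): eta = sin^2(sqrt(nu^2 + xi^2)) / (1 + xi^2/nu^2).
    nu = pi/2 gives 100% efficiency at exact Bragg incidence (xi = 0)."""
    xi = np.asarray(xi, dtype=float)
    return np.sin(np.sqrt(nu**2 + xi**2))**2 / (1.0 + xi**2 / nu**2)

xi = np.linspace(-10.0, 10.0, 501)   # dephasing, grows with off-Bragg angle
eta = kogelnik_efficiency(xi)

# Amplitude transfer functions of the two orders (lossless grating):
# the diffracted order |H1| = sqrt(eta) is a low-pass around Bragg
# incidence, while the transmitted order |H0| = sqrt(1 - eta) is its
# high-pass complement (DC is routed into the diffracted beam).
H1 = np.sqrt(eta)
H0 = np.sqrt(1.0 - eta)
```

With ν = π/2 the efficiency peaks at 1 on Bragg and falls off rapidly with ξ, which is exactly the low-pass/high-pass pairing described above.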
Figures 2 and 3 show some experimental images obtained with the setups of Figs. 1(a) and 1(b), respectively. With the double-arm correlator, Fig. 2(a) shows the input object, composed of several gratings with the same period but different orientations. Fig. 2(b) shows its optical Fourier transform, obtained at the Fourier plane: the first harmonics of the scene form a circle around the zero frequency, and the second harmonics are also faintly visible. Fig. 2(c) shows the resulting image when a filter blocking the harmonics at some specific orientations is applied; the gratings removed by this filtering operation are clearly distinguishable. Finally, in Fig. 3 (setup in Fig. 1(b)) we show the image transmitted by a volume phase grating exhibiting maximum diffraction efficiency and oriented at the Bragg angle, Ψ = θBragg. We see that edge enhancement is obtained at the vertical edges.
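The orientation-selective filtering of the Fig. 2 experiment has a simple numerical analogue. In this hypothetical sketch the object is a sum of two gratings of equal period but different orientations, and a mask in the Fourier plane blocks the harmonics of one orientation only:

```python
import numpy as np

# Toy analogue of the Fig. 2 experiment (illustrative values).
N, f = 128, 8                       # frame size and grating frequency
y, x = np.mgrid[0:N, 0:N]
obj = (1.0
       + 0.5 * np.cos(2 * np.pi * f * x / N)    # grating oriented along x
       + 0.5 * np.cos(2 * np.pi * f * y / N))   # grating oriented along y

F = np.fft.fft2(obj)
mask = np.ones((N, N))
mask[0, f] = mask[0, N - f] = 0.0   # block the +/- harmonics of the x-grating
img = np.fft.ifft2(F * mask).real

# The x-oriented grating is filtered out: the image now varies only
# along y, just as the blocked orientations disappear in Fig. 2(c).
```

Blocking a different pair of spectral peaks would remove the other grating instead, mimicking filters at other orientations.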
We have proposed the use of two different setups to show students two complementary approaches to modifying the frequency content of an object in optical image processing, with and without a Fourier plane, i.e., two different ways to synthesize the transfer function of a system. Working through the two experiments leads to a deeper understanding of optical image processing, its linear-systems mathematical background, and some aspects of volume holography.
This work was supported by Generalitat Valenciana, Spain (projects GV01-130, GV04A/574 and GV04A/565).