Hands-on demonstrations and teaching tools for optics and adaptive optics (6 October 2003)
Proceedings Volume 9663, Eighth International Topical Meeting on Education and Training in Optics and Photonics; 96631W (2003)
Event: Eighth International Topical Meeting on Education and Training in Optics and Photonics, 2003, Tucson, Arizona, United States
A set of demonstrations, suitable for use in classrooms at the secondary, undergraduate, and graduate levels and for public use in museums and science centers, was developed to exhibit basic optics principles, vision science, and adaptive optics techniques. This paper presents these demonstrations and shows how they can be used to promote understanding of optics principles across a wide range of applications. The demonstrations include units showing image formation, orientation, and scale; a unit showing three-dimensional ray tracing through optical systems using a scattering medium and laser diodes; a unit allowing users to directly observe the color sensitivity of their own eyes; and a unit demonstrating wavefront errors, image distortion, and a Shack-Hartmann sensor for wavefront measurement. Together, these bring real-world optical principles into the hands-on regime and demystify optics principles that are difficult to visualize in three dimensions. The demonstrations are currently in use at the Keck Observatory in Hawaii, at Nauticus in Virginia, at the Yerkes Observatory and at Carthage College in Wisconsin, and at several middle and high schools in Illinois and Wisconsin. They have also been integrated into a unit of the Hands-On Universe program published by the Lawrence Hall of Science. This work was supported by the National Science Foundation through the Center for Adaptive Optics.



Optics education suffers from a lack of demonstration equipment that clearly and simply conveys basic optics principles. Standard tools, such as optical rails with lenses of different characteristics, two-dimensional ray-tracing models, and optics software, have their place in the curriculum, but none of them fully demonstrates optical effects in a way that makes three-dimensional phenomena clearly visible. To remedy this situation, a set of demonstrations was developed, constructed, and tested. The reader is directed to for a detailed description of each of these systems.


The F-Box

Optical benches with lenses and projection screens have been used for many years to demonstrate basic image characteristics. Rather than being constrained to on-axis situations, with only a single lens or optical train to study at a time, a setup was constructed that allows simultaneous comparison of images from several lenses (see Figure 1). The setup is referred to as an ‘F-box’, because the letter ‘F’ is used on a light box as the image source. This letter is advantageous because it possesses no symmetries; inversion or reversal is easy to detect. A board with three lenses of identical diameter (typically 5-10 cm) and disparate focal lengths (typically 50, 100, and 200 cm) casts images onto a set of projection screens, each covered with rectilinear graph paper of a different line density. By careful measurement of focal length and image size, students easily see with this setup that image scale is proportional to focal length; by applying masks of different aperture sizes, students discover that image scale depends only on focal length, and not on aperture. Indeed, aperture stops of different sizes, shapes, and locations on each lens show that each point on the lens creates a complete image. Brightness comparisons can be made as well. It is easy to extend these experiments: in an astronomy setting, for example, one can show how an image is under- or over-sampled by using sheets of graph paper with cells of different sizes on the projection screens. Students can color in the squares overlapping each image to see the effects of pixel size on image quality. The fastest lens in the array typically exhibits significant spherical aberration, which can be demonstrated by masking the lens into different zones and noting the best focal position for each. The lens array can also be tilted, so that off-axis aberrations such as coma and astigmatism can be demonstrated.
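The scale relationship students discover can be checked numerically with the thin-lens equation. A minimal sketch follows; the 4 m light-box distance is an assumed value for illustration, not a dimension from the actual apparatus:

```python
def image_distance(f_cm, d_obj_cm):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for d_i."""
    return f_cm * d_obj_cm / (d_obj_cm - f_cm)

def magnification(f_cm, d_obj_cm):
    """Image scale relative to the object: m = d_i / d_o."""
    return image_distance(f_cm, d_obj_cm) / d_obj_cm

D_OBJ = 400.0  # assumed light-box distance (cm)
for f in (50.0, 100.0, 200.0):  # the three F-box focal lengths
    print(f, image_distance(f, D_OBJ), magnification(f, D_OBJ))
```

For d_o much larger than f, the magnification reduces to m ≈ f/d_o, which is exactly the proportionality to focal length that the measurements reveal; the aperture never enters the formula, matching what the masks demonstrate.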
The F-box has been found to be the most useful demonstration in outreach activities, such as those conducted at Yerkes Observatory by the Hands-On Universe program and at Nauticus in Norfolk, VA. Compact, relatively easy to construct, and very easy to operate, it makes side-by-side comparisons of different optical effects easy to observe. A reflective version could be built as well, though the need to offset the mirrors to throw images onto suitable screens, and the substantial cost of first-surface paraboloids, favor the refractive version shown here.

Fig. 1.

The basic F-Box setup, showing the light box, lens array, and projection screens.



Vision Box

Light is defined as the range of wavelengths (or frequencies) of electromagnetic radiation to which the eye is sensitive. The canonical image shown in a textbook on light or optics suggests that the sensitivity of the eye is described by a bell-shaped spectral distribution. However, the spectral response of the eye is actually determined by the overlapping response functions of the red-, blue-, and green-sensitive cones in the retina, and so is more complex than textbook cartoons suggest. Further, each individual has a slightly different spectral response; at the extreme, those who are color blind have radically different color sensitivities from those of individuals with normal color vision. Thus, it is interesting to give users a chance to actually observe their own color sensitivity. This demonstration uses a simple slit-and-diffraction-grating spectrograph, built with inexpensive sheet-acetate grating material, to create a spectrum that is easy for a user to see. The light source is a halogen lamp; admittedly, a halogen lamp does not provide a completely flat spectrum, but its variation is far less than that of a simple incandescent source, and it does not exhibit the lines that a mercury vapor lamp shows. The apparatus has two slits, which can be used interchangeably. The first slit is of constant width, creating a light source of equal intensity along its length. A user employing this slit sees a spectrum ranging from blue to red, of constant height. The second slit becomes progressively narrower towards the top; its width decreases logarithmically, rather than linearly, as this maps better onto the eye’s overall logarithmic response to light. On top of this slit is placed a gradient-density filter, whose density also varies logarithmically along its length.
Together, these create a slit source that is brightest at the bottom and dims steadily towards the top. With this light spread out by the diffraction grating, the apparent height of the spectrum at each wavelength is proportional to the (logarithmic) sensitivity of the viewer. Each user therefore sees, in effect, a plot of his or her own color sensitivity. It is not possible, of course, to record this with a camera; it is the viewer who creates the viewed response. However, a camera positioned at the viewer’s location records an image that is itself a measure of the color sensitivity of its own detector. Shown in Figure 2 is the response of a Nikon 990 digital camera. It is clear from the lower image that there are pixels broadly sensitive to each of red, green, and blue. It would be equally easy to measure and exhibit the response of a CCD array, or the sensitivity of different color emulsions.
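The combined effect of the tapered slit and the gradient filter can be sketched numerically. The height, width, and attenuation values below are illustrative assumptions, not the dimensions of the actual apparatus; the point is that multiplying the two logarithmic transmissions makes the logarithm of the flux fall off linearly with height:

```python
import math

H = 10.0       # assumed slit height (cm)
DECADES = 2.0  # assumed attenuation, in factors of ten, over the full height

def slit_width(y, w_max=0.2):
    """Slit width at height y: tapers logarithmically toward the top."""
    return w_max * 10.0 ** (-DECADES * y / H)

def filter_transmission(y):
    """Gradient-density filter: optical density rises linearly with y."""
    return 10.0 ** (-DECADES * y / H)

def relative_flux(y):
    """Flux through slit plus filter, relative to the bottom (y = 0)."""
    return (slit_width(y) / slit_width(0.0)) * filter_transmission(y)

# log10(flux) drops by the same amount per step in y: a linear ramp,
# matching the eye's roughly logarithmic response to light
log_flux = [math.log10(relative_flux(y)) for y in (0.0, 2.5, 5.0, 7.5, 10.0)]
print(log_flux)
```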

Figure 2.

Color sensitivity recorded by a Nikon 990 digital camera through the Vision Box demonstration. Top image: using a linear slit; bottom image: using the graded slit. Note the responses of the three pixel types in the camera.



Ray Tracing in Three Dimensions

Optics books, and those used in astronomy and vision science, contain diagrams that purport to show the paths light rays take through optical trains. These are difficult for the novice to interpret in three dimensions, especially when studying aberrated systems or folded optical paths, both common situations of interest. Optical benches are sometimes used to show light paths through systems, with screens set up in various places to sample the beam shape, but the actual light path is difficult to determine. The setup developed here lets a user view the paths of light rays through any system, walk around the system to interrogate the light path from any angle, and immediately see the changes that result from different mirrors, lenses, and positions and tilts of optical components. There are two keys to this setup: a suitable light source array, and a suitable scattering medium. The former is achieved using a ring of laser diodes (such as those manufactured for laser pointers), collimated to provide a set of parallel beams. In our setups we have built rings of six diodes, and dual rings having sets of six diodes on two different diameters. The scattering medium found to work best is liquid hand soap diluted in de-ionized water; a ratio of 1 part soap to 7 parts water provides the longest useful light path with sufficient side scatter to make the beams easily visible. Figure 3 shows a representative example of an optical train, in this case a 4-inch Newtonian reflecting telescope, as demonstrated with this setup. It is easy to see the converging beams from the parabolic primary mirror, the turning of the beam by the diagonal mirror, and the formation of the focal plane. An additional lens can be added to serve as an eyepiece, rendering the light pencil parallel once again.
Pictures such as Figure 3 are often created with time exposures; this picture, however, was taken with a simple digital camera, and shows the system as it would appear to the naked eye. The eye has even better dynamic range, so the system viewed by eye produces an even clearer, easier-to-see set of rays.

Figure 3.

Top: Light path of a Newtonian reflecting telescope, as shown using the Ray Tracing demonstration system. Note that this is not a time exposure; the naked-eye image looks just like this (actually better!). Bottom: The setup used to produce the upper image. Note the primary and secondary mirrors, the ‘eyepiece’, and the laser diode array to the right.


This setup is also useful for demonstrating or interrogating any optical train. With a suitable tank, any collection of optics can be used, and with a suitable pattern in the diode array any geometrical aspect of the optical train can be seen. For example, it is easy to show the effects of using a paraboloid (or other conic section) off-axis; the rays no longer meet at a single point, and demonstrate coma and astigmatism. Spherical aberration can be shown using multiple, concentric rings of diodes. Using multiple elements, one could, for example, show the aberration created by an elliptical primary mirror, and see how it is corrected by a spherical convex secondary (in a Dall-Kirkham arrangement), or with a pair of hyperboloids (as in a Ritchey-Chrétien).
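The spherical aberration made visible by concentric diode rings can also be computed directly. For a ray parallel to the axis at height h, reflected by a spherical mirror of curvature radius R, the reflected ray crosses the axis a distance R/(2 cos θ) from the center of curvature, where θ is the angle of incidence. A short sketch, with an arbitrary illustrative R:

```python
import math

def axis_crossing(h, R):
    """Distance from the centre of curvature at which a ray, parallel to
    the axis at height h, crosses the axis after reflecting off a
    spherical mirror of curvature radius R. Paraxial limit: R/2."""
    cos_theta = math.sqrt(R * R - h * h) / R  # cosine of incidence angle
    return R / (2.0 * cos_theta)

R = 100.0  # illustrative radius of curvature (cm)
for h in (1.0, 10.0, 20.0, 30.0):  # heights mimicking concentric diode rings
    print(h, axis_crossing(h, R))
# Paraxial rays cross near R/2; marginal rays cross farther from the
# centre of curvature (closer to the mirror) -- spherical aberration.
```

A paraboloid, by contrast, sends all such rays through a single focus, which is why the on-axis Newtonian in Figure 3 shows a clean focal point.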


The Shack-Hartmann Sensor Demonstration

One of the most exciting advances in optics is the development of adaptive systems that reduce or virtually eliminate the effects of turbulent media in the light path. This has proven especially valuable in eye and astronomy research. The heart of these systems comprises two major components: a flexible mirror that corrects wavefront errors, and, usually, a Shack-Hartmann (S-H) sensor to detect them. While the former is relatively easy to demonstrate (e.g., using a sheet of aluminized mylar), the latter is more difficult. In essence, a S-H sensor works by breaking a wavefront into segments, each small enough to be locally flat but tilted. S-H sensors use arrays of extremely small lenses to sample each such segment, projecting a set of images, one from each lens, onto a detector that, at each moment in time, notes the displaced position of each small image. Thus, the overall wavefront error as a function of position can be quantified by measuring the lateral displacement of the image created by each of the S-H lenses. It is a fundamentally beautiful technique, but showing it in real time with an actual adaptive optics system is impractical: not only is the equipment extremely expensive, but one cannot actually observe the displacement of the images on the detector, which are too small and move far too rapidly to see. The system shown in Figure 4 was developed specifically to demonstrate the operation of the S-H sensor. A light source providing a simple point of light is placed at a far distance (say, 10 meters away). The first-surface mirror shown in the figure folds the light path down into the demonstration, though a light source held vertically over the system would work as well, obviating the mirror. A simple plano-convex lens focuses the light from the source onto a screen at the bottom of the device.
The beam is split by a beamsplitter, with half of the light reflected towards a second plano-convex lens that re-collimates the beam. The beam then enters an array of sixteen 2 cm diameter lenses of approximately 15 cm focal length, which project separate images onto a ground-glass screen. In operation, a transparent tray of either water or, preferably, mineral oil is held over the first lens, creating slow-moving turbulence similar to that created by the atmosphere over a telescope. When moved, the tray creates enough turbulence that the image projected at the bottom of the apparatus appears totally distorted, while the individual images projected on the ground-glass screen by the lens array remain neat little spots, each wandering in the direction corresponding to the local wavefront tilt. A discerning user will notice that the slopes of the waves in the liquid tray create tilts directly correlated with the image displacements on the ground-glass screen.
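The quantitative content of the demonstration reduces to one small-angle relation: the local wavefront tilt sampled by a lenslet is the spot displacement divided by the lenslet focal length. A minimal sketch, using the approximately 15 cm lenslet focal length from the text and made-up displacement values:

```python
F_LENSLET_CM = 15.0  # approximate lenslet focal length given in the text

def local_tilt_rad(displacement_cm):
    """Small-angle wavefront tilt seen by one lenslet (radians)."""
    return displacement_cm / F_LENSLET_CM

# hypothetical spot displacements (cm) for one row of the 4x4 lenslet array
displacements = [0.00, 0.09, 0.30, -0.15]
tilts = [local_tilt_rad(d) for d in displacements]
print(tilts)
```

Integrating these local slopes across the array recovers the wavefront shape, which is what a real adaptive optics system feeds to its deformable mirror.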

Figure 4.

Left: The apparatus used to demonstrate a Shack-Hartmann wavefront sensor. The upper mirror merely reflects a horizontal light path into the system. Right: The lens array that forms the heart of the sensor. Each lens samples a portion of the wavefront, projecting a displaced image onto the viewing screen.




Among the projects currently under way is the development of a functional eye model, using the dilute soap scattering solution as a vitreous humor replacement. A flexible lens will allow one to observe the ability of the eye to focus, and the introduction of various corneal shapes can show the effects of astigmatism and of long- and short-sightedness. With additional lens elements, the application of ‘corrective lenses’ to cure visual aberrations can easily be shown. For an advanced ray-tracing system, the same scattering liquid could be placed in small containers and used as interrogators to make visible the beams running through a system bolted to an optical bench. The S-H system could be designed around an actual telescope as the primary imager. There are many ways in which these demonstrations can be improved or modified to suit different uses and audiences.



The demonstrations shown here bring optical principles to light for audiences ranging from the general public in museums and science centers, to undergraduates in optics, astronomy, or biology courses covering vision. They can be used at the graduate level to make advanced optical principles clear, or in research environments as diagnostic tools. Each is easy to construct, and each is a valuable addition to the curricular tools at institutions from high schools through colleges, universities, and graduate schools. The authors would be pleased to provide additional design details, and to receive suggestions for improvements or ideas for other demonstrations that would make optical principles understandable.

© (2003) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Douglas N. Arion, Kevin M. Crosby, Daniel Lyons, Kendra Rand, and Ann Randolph "Hands-on demonstrations and teaching tools for optics and adaptive optics", Proc. SPIE 9663, Eighth International Topical Meeting on Education and Training in Optics and Photonics, 96631W (6 October 2003);

