This book addresses some of the issues in visual optics through a functional analysis of ocular aberrations, especially for the purpose of vision correction. The basis is the analytical representation of ocular aberrations with a set of orthonormal basis functions, such as Zernike polynomials or the Fourier series.
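As a concrete illustration of such a representation, the sketch below builds a wavefront over the unit pupil as a weighted sum of two Noll-normalized Zernike modes (defocus and astigmatism) sampled on a discrete grid. The grid size, mode selection, and coefficient values are illustrative choices, not taken from the book:

```python
import numpy as np

def zernike_defocus(rho, theta):
    """Noll-normalized defocus, Z_2^0 = sqrt(3) * (2*rho^2 - 1)."""
    return np.sqrt(3.0) * (2.0 * rho**2 - 1.0)

def zernike_astig(rho, theta):
    """Noll-normalized astigmatism, Z_2^2 = sqrt(6) * rho^2 * cos(2*theta)."""
    return np.sqrt(6.0) * rho**2 * np.cos(2.0 * theta)

# Sample the unit pupil on a square grid
n = 64
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
rho, theta = np.hypot(x, y), np.arctan2(y, x)
pupil = rho <= 1.0

# Wavefront as a weighted sum of modes (example coefficients, e.g. in microns)
coeffs = {zernike_defocus: 0.5, zernike_astig: -0.2}
wavefront = sum(c * f(rho, theta) for f, c in coeffs.items())

# RMS wavefront error evaluated over the pupil only
rms = np.sqrt(np.mean(wavefront[pupil] ** 2))
```

Because the modes are orthonormal over the pupil, the RMS wavefront error is approximately the root-sum-square of the coefficients, here about sqrt(0.5² + 0.2²) ≈ 0.54.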
Although the aim of this book is the application of wavefront optics to laser vision correction, most of the theories discussed are equally applicable to other methods of vision correction, such as contact lenses and intraocular lenses.
The traditional means of measuring visual acuity in human eyes relies on eye charts and the patient's subjective responses. With the advent of wavefront-based technologies, it is now feasible to determine optical resolution objectively. This paper proposes a technique that uses a resolution spoke target to predict visual acuity from wavefront measurements. Resolution rings are constructed using Rayleigh's criterion for the determination of optical acuity; subsequent cross-correlation of the blurred resolution spoke with the un-blurred spoke is used to estimate the decentration of the PSF. After laser refractive surgery, the visual acuity of 11 formerly myopic eyes was estimated using this technique. The predicted visual acuity was compared with the corresponding subjective measurements at 100% contrast. The correlation between predicted and measured acuity accounted for about 74% of the variance, which shows that the optical acuity of human eyes can be measured objectively.
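The cross-correlation step can be sketched as follows, assuming a synthetic spoke pattern and a known shift (both illustrative stand-ins, not the paper's data): the peak of the FFT-based circular cross-correlation between the blurred and un-blurred patterns recovers the decentration to the nearest pixel.

```python
import numpy as np

def estimate_shift(reference, observed):
    """Estimate the integer-pixel decentration of `observed` relative to
    `reference` from the peak of their circular cross-correlation (via FFT)."""
    xcorr = np.fft.ifft2(np.fft.fft2(observed) * np.conj(np.fft.fft2(reference))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Map peak indices to signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

# Synthetic 8-spoke test pattern; a shifted, mildly blurred copy stands in
# for the aberrated retinal image of the spoke target
n = 128
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
spoke = (np.cos(8 * np.arctan2(y, x)) > 0).astype(float)
shifted = np.roll(spoke, (3, -5), axis=(0, 1))

# Mild Gaussian blur applied as a circular convolution in the Fourier domain
idx = np.fft.fftfreq(n) * n                 # signed pixel indices, origin at [0, 0]
gauss = np.exp(-(idx[:, None]**2 + idx[None, :]**2) / (2 * 2.0**2))
gauss /= gauss.sum()
blurred = np.fft.ifft2(np.fft.fft2(shifted) * np.fft.fft2(gauss)).real

shift = estimate_shift(spoke, blurred)      # approximately (3, -5)
```

Since the blur kernel is symmetric, it broadens the correlation peak but does not move it, so the decentration estimate survives the blurring.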
We discuss a device for real-time compensation of the image-quality deterioration induced by atmospheric turbulence. The device will permit ground-based observations with very high image resolution. We propose an instrument with two channels. One is an ordinary image-detection channel, while the other uses a Hartmann-Shack wavefront sensor to measure image degradation. This information is obtained as a set of lenslet focus shifts, each corresponding to the local tilt of the wavefront. Through modeling, the entire wavefront is reconstructed. Consequently, we can estimate the optical transfer function and its corresponding point spread function. Through deconvolution techniques, the distorted image can subsequently be restored. Thus, image correction is performed in software, eliminating the need for expensive adaptive optics hardware. Due to the nature of atmospheric turbulence, detection and correction have to be performed at 50-100 frames per second, which implies a need for very high computing capacity. A study of the mathematical operations involved has been made with special emphasis on implementation in the hardware architecture known as the radar video image processor (RVIP), which exploits a high degree of parallelism. Available results show that the RVIP, together with complementary units, provides the necessary high-speed computing capacity. The detectors in both channels must meet very high demands: high quantum efficiency, fast readout at low noise levels, and a wide spectral range. A preliminary investigation evaluates suitable detectors; ICCDs are so far the most promising candidates.
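The reconstruction step, from lenslet tilts to a modal wavefront, can be sketched as a least-squares fit: each lenslet contributes the local x- and y-slopes of the wavefront, and the modal coefficients are recovered from the stacked slope measurements. The 8x8 lenslet geometry and the three low-order modes below are illustrative assumptions, not the instrument's actual design:

```python
import numpy as np

# Hypothetical lenslet centers across the unit pupil (8x8 grid, circular mask)
n = 8
y, x = [g.ravel() for g in np.mgrid[-1:1:1j * n, -1:1:1j * n]]
inside = x**2 + y**2 <= 1.0
x, y = x[inside], y[inside]

def mode_slopes(x, y):
    """Analytic (dW/dx, dW/dy) of three low-order modes at the lenslet centers."""
    tip   = (np.ones_like(x), np.zeros_like(x))   # W = x
    tilt  = (np.zeros_like(x), np.ones_like(x))   # W = y
    defoc = (4.0 * x, 4.0 * y)                    # W = 2(x^2 + y^2) - 1
    # Stack x-slopes over y-slopes, one column per mode
    return np.column_stack([np.concatenate(s) for s in (tip, tilt, defoc)])

D = mode_slopes(x, y)                     # slope-to-coefficient design matrix
true_coeffs = np.array([0.3, -0.1, 0.7])
slopes = D @ true_coeffs                  # ideal sensor measurement (local tilts;
                                          # measurement noise could be added here)

# Least-squares modal reconstruction from the measured slopes
recovered, *_ = np.linalg.lstsq(D, slopes, rcond=None)
```

With the coefficients in hand, the optical transfer function and PSF follow from the reconstructed wavefront, as the abstract describes.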
An estimation-based method for accommodating the nonuniform flat-field response of a focal-plane array is described. This method employs image data directly to perform the flat-field correction and does not rely on a separate flat-field calibration measurement. This is accomplished by dithering the camera so that the object's focal-plane images, acquired in a series of snapshots, appear in different positions against the fixed-pattern artifacts caused by the nonuniformity of the focal-plane array.
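A simplified sketch of the idea, assuming known dither offsets and a purely multiplicative fixed pattern (both assumptions made for illustration): aligning and averaging the dithered frames suppresses the fixed pattern in the scene estimate, which in turn yields a per-pixel gain estimate for the correction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
u = np.linspace(0, 4 * np.pi, n)
scene = 100.0 + 50.0 * np.sin(u)[None, :] + 30.0 * np.cos(u)[:, None]  # > 0 everywhere
gain = 1.0 + 0.1 * rng.standard_normal((n, n))   # fixed-pattern nonuniformity

# Known dither offsets in pixels; np.roll wraps at the borders, which is
# acceptable for this illustration
dithers = [(0, 0), (0, 3), (2, 0), (2, 3)]
frames = [gain * np.roll(scene, d, axis=(0, 1)) for d in dithers]

# Shift frames back into a common reference and average: each scene pixel is
# then seen through several different gain pixels, so the fixed pattern averages out
aligned = [np.roll(f, (-dy, -dx), axis=(0, 1)) for f, (dy, dx) in zip(frames, dithers)]
scene_est = np.mean(aligned, axis=0)

# Per-pixel gain estimate: average ratio of each raw frame to the scene
# estimate shifted into that frame's position
gain_est = np.mean(
    [f / np.roll(scene_est, d, axis=(0, 1)) for f, d in zip(frames, dithers)], axis=0)

corrected = frames[0] / gain_est                 # flat-field-corrected first frame
```

The corrected frame has markedly less fixed-pattern error than the raw frame; in practice the scene and gain estimates could be refined by iterating the two steps.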
In modal compensation of atmospheric turbulence using Zernike polynomials, aliasing has been found to be serious for large sub-aperture configurations. In order to reduce the influence of aliasing on the residual error after modal correction, we have trained a neural network (NN) using simulated array images from a modified Hartmann-Shack wavefront sensor. The array images are derived from simulated atmospheric wavefronts following Kolmogorov turbulence. We find that the Zernike coefficients predicted by the NN are more accurate than those obtained with conventional methods. Using the first 28 Zernike modes, the residual error after modal-NN correction is nearly halved compared with that obtained from a least-squares solution. In addition, the computation time of the NN is well suited to real-time application.
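The regression at the heart of this approach can be sketched with a tiny one-hidden-layer network trained by plain gradient descent on synthetic sensor-measurement/coefficient pairs. The network size, random sensor matrix, and training schedule below are illustrative stand-ins, not the configuration used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: random Zernike coefficient vectors and the sensor
# measurements they would produce through a hypothetical linear sensor matrix D
n_modes, n_meas, n_samples = 6, 40, 500
D = rng.standard_normal((n_meas, n_modes))
coeffs = rng.standard_normal((n_samples, n_modes))
meas = coeffs @ D.T + 0.05 * rng.standard_normal((n_samples, n_meas))

# One-hidden-layer network (tanh hidden units) trained by gradient descent
hidden, lr = 32, 1e-2
W1 = 0.1 * rng.standard_normal((n_meas, hidden))
W2 = 0.1 * rng.standard_normal((hidden, n_modes))
losses = []
for _ in range(300):
    h = np.tanh(meas @ W1)              # forward pass
    pred = h @ W2                       # predicted Zernike coefficients
    err = pred - coeffs
    losses.append(np.mean(err**2))
    gW2 = h.T @ err / n_samples         # backpropagation
    gh = (err @ W2.T) * (1 - h**2)
    gW1 = meas.T @ gh / n_samples
    W1 -= lr * gW1
    W2 -= lr * gW2
```

The training loss decreases steadily; once trained, prediction is a pair of matrix products, which is what makes the NN attractive for real-time use.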
Atmospheric wavefront residual errors for Zernike compensation are calculated using both the spatial-domain approach and the frequency-domain approach; the two approaches are found to give identical results. The results are used to examine numerical solutions of atmospheric Karhunen-Loeve functions through the spatial-domain approach (solving an integral equation) and the frequency-domain approach (diagonalization of the Noll matrix). The obtained Karhunen-Loeve eigenvalues and eigenfunctions are used to simulate atmospheric wavefronts, and the performance of the different simulation methods is discussed. We find that, from a statistical point of view, the method described here is the best one for atmospheric wavefront simulation. The structure functions calculated from the simulated wavefronts are used to obtain the Strehl ratio for partial correction, so that a validity range can be inferred for the Marechal approximation.
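The Marechal approximation relates the Strehl ratio S to the residual phase variance sigma^2 (in rad^2); in its extended form, S ≈ exp(-sigma^2). The sketch below checks this against the exact on-axis Strehl ratio for a smooth Gaussian random phase screen, which is an illustrative surrogate rather than a true Kolmogorov screen:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256

# Smooth Gaussian random phase screen: low-pass-filtered white noise
# (illustrative statistics only, not Kolmogorov turbulence)
white = rng.standard_normal((n, n))
k = np.fft.fftfreq(n)
kk = np.hypot(*np.meshgrid(k, k, indexing="ij"))
phase = np.fft.ifft2(np.fft.fft2(white) * np.exp(-(kk / 0.1) ** 2)).real
phase -= phase.mean()

results = {}
for sigma in (0.1, 0.3, 0.6):                      # residual RMS phase in radians
    p = phase * (sigma / phase.std())              # rescale to the target variance
    strehl = np.abs(np.mean(np.exp(1j * p))) ** 2  # exact on-axis Strehl ratio
    marechal = np.exp(-sigma**2)                   # extended Marechal estimate
    results[sigma] = (strehl, marechal)
```

For small residual variance the two agree closely; the agreement degrades as sigma grows, which is precisely the validity range the abstract refers to.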
Conventional modal compensation using Zernike polynomial expansion by means of Hartmann-Shack wavefront sensors is discussed. Cross-talk between Zernike coefficients has been found to be serious, especially for large sub-aperture configurations. A modified Hartmann-Shack wavefront sensor with iterative wavefront reconstruction is proposed, and a comparison is made between the conventional and proposed methods.
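Iterative wavefront reconstruction of this general kind can be illustrated with a Landweber iteration, i.e. repeated gradient steps on the slope-fitting residual. The random sensor matrix below is a stand-in for illustration, not the modified sensor's actual geometry:

```python
import numpy as np

rng = np.random.default_rng(3)
n_meas, n_modes = 60, 10
D = rng.standard_normal((n_meas, n_modes))   # hypothetical measurement-to-mode matrix
c_true = rng.standard_normal(n_modes)
s = D @ c_true                               # noiseless sensor measurements

# Landweber iteration: gradient descent on ||D c - s||^2, a simple
# representative of iterative wavefront reconstruction schemes
c = np.zeros(n_modes)
step = 1.0 / np.linalg.norm(D, 2) ** 2       # step size guaranteeing convergence
for _ in range(2000):
    c -= step * (D.T @ (D @ c - s))
```

For noiseless, consistent data the iterates converge to the least-squares solution; stopping early acts as a regularizer when the measurements are noisy.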