## 1. Introduction

The Shack-Hartmann wavefront sensor (SHWFS) has been widely applied in adaptive optics (AO) systems, optical testing systems, and aberration measurements of the human eye.^{1, 2, 3, 4} Its performance determines the detection capability and the wavefront measurement precision of the whole system.^{5} Unlike interferometers, it does not require a reference wave during the measurement, and thus it is more stable and easier to use.^{6, 7, 8} The measurement sensitivity of an SHWFS mainly depends on the accuracy of light spot centroid detection, which is affected by many factors, such as the sampling and truncation error of the wavefront, the read-out noise of the detector, the photon noise, and the background noise.^{9, 10}

There are several algorithms that can find light spot centroid positions in an SHWFS.^{11} The conventional centroid detection algorithm is based on the center-of-gravity calculation.^{12} Because of its strong sensitivity to noise, the conventional center-of-gravity algorithm yields relatively high systematic and random errors, especially when the centroid of the spot is not at the center of the detection area.^{13} In fact, the spot can overstep its subaperture window when the atmospheric turbulence is very strong or the aberration of the human eye is very large, which results in a larger detection error and thus makes the measurement invalid. Therefore, investigating a new centroid detection algorithm to expand the dynamic range of an SHWFS is significant for the measurement of human eye aberration and atmospheric turbulence, especially when the AO technique is introduced in retina imaging.^{14}

In this work, a new spot centroid detection algorithm based on the dynamic tracking principle is presented. Its advantage is that it uses only software to attribute each spot to the correct lenslet and to expand the dynamic range of an SHWFS, without any additional hardware such as a spatial light modulator array^{15} or astigmatic microlenses.^{16} The principle of the dynamic tracking centroid detection algorithm is described in Sec. 2. The experimental results of the dynamic range test are discussed in Sec. 3. The wavefront correction for retina imaging with the new SHWFS was conducted and is discussed in Sec. 4. Finally, conclusions are given in Sec. 5.

## 2. Operation Theory of the Shack-Hartmann Wavefront Sensor and the New Centroid Detection Algorithm

### 2.1. Principle of a Conventional Shack-Hartmann Wavefront Sensor

Figure 1 shows the principle of an SHWFS, which mainly comprises a lenslet array and a charge-coupled device (CCD) detector. The lenslet array is composed of a number of small convex lenses with the same focal length and aperture size, and the CCD sensor is located at its focal plane. The SHWFS spatially divides the incident wavefront into many small subapertures with the lenslet array.^{17} Each subaperture forms a spot on the CCD detector, so a focused light spot array pattern is obtained by the CCD. The relative displacement of the centroid of each spot represents the local tilt of the incident wavefront over the corresponding subaperture.^{18} The centroid position of each spot can be calculated by the formula:^{19}

$$x_c=\frac{\sum_{i}\sum_{j}x_{ij}I_{ij}}{\sum_{i}\sum_{j}I_{ij}},\qquad y_c=\frac{\sum_{i}\sum_{j}y_{ij}I_{ij}}{\sum_{i}\sum_{j}I_{ij}},\tag{1}$$

where $I_{ij}$ is the gray level of pixel $(i,j)$ within the subaperture and $(x_{ij},y_{ij})$ are the coordinates of that pixel.

With the centroid positions obtained according to Eq. 1, the corresponding wavefront slopes of every subaperture in the $X$ and $Y$ directions can be expressed as^{20}:

$$G_x=\frac{x_c-x_0}{f},\qquad G_y=\frac{y_c-y_0}{f},\tag{2}$$

where $(x_0,y_0)$ is the reference centroid position of the subaperture and $f$ is the focal length of the lenslet.

The conventional centroid detection algorithm is easy to understand and apply. Assume that there is a CCD with $M1\times M1\phantom{\rule{0.3em}{0ex}}\text{pixels}$ and a microlens array with $M2\times M2$ microlenses, so that $Q\times Q$ pixels are distributed in the subaperture of each microlens $(Q=M1/M2)$. In the detection area, the centroid position of the spot is calculated by Eq. 1. However, the centroid detection error will be very large in some cases, as shown in Fig. 2. Figure 2 shows a spot leaving its subaperture for a neighboring one, and Fig. 2 shows a spot overlapping its adjacent spot. Both cases result in a large centroid detection error when the conventional centroid detection algorithm is used. Therefore, investigating a centroid detection algorithm to expand the dynamic range of an SHWFS is very necessary.
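As a minimal sketch of Eqs. 1 and 2, the center-of-gravity centroid and the slope conversion can be written in a few lines; the function names and the pixel-to-length conversion are our own assumptions, not taken from the paper:

```python
import numpy as np

def centroid(window):
    """Center-of-gravity centroid of a spot image (Eq. 1).

    Returns (xc, yc) in pixel coordinates of `window`.
    """
    total = window.sum()
    if total == 0:
        raise ValueError("empty window: no signal")
    ys, xs = np.indices(window.shape)
    return (xs * window).sum() / total, (ys * window).sum() / total

def slopes(xc, yc, x0, y0, focal_length, pixel_size):
    """Wavefront slopes in X and Y for one subaperture (Eq. 2).

    (x0, y0) is the reference centroid; the pixel displacement is
    converted to length units before dividing by the lenslet focal length.
    """
    gx = (xc - x0) * pixel_size / focal_length
    gy = (yc - y0) * pixel_size / focal_length
    return gx, gy
```

With a $10\text{-}\mu\mathrm{m}$ pixel and a $10\text{-}\mathrm{mm}$ focal length, a two-pixel displacement corresponds to a slope of $2\times 10^{-3}\ \mathrm{rad}$.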

### 2.2. Dynamic Tracking Centroid Detection Algorithm

The new centroid detection algorithm proposed in this work is based on the idea that the sampling time interval of a WFS is shorter than the correlation time of the measured objects such as atmospheric disturbance and human eye aberration, so the wavefront slope and the centroid of the corresponding spot are strongly correlated between temporally adjacent SHWFS measurements. According to the statistical relationship, the movement of the corresponding spot centroid will usually fall within some range between the adjacent measurements.^{17, 21, 22} Therefore, the current centroid is a good prediction of the spot location in the subsequent measurement. The steps of the algorithm are listed next.

### Step 1

In the $(Q\times Q)$ square area of pixels distributed to each subaperture, centered on the centroid of the standard reference wavefront, find the position of the brightest point of the spot. Centered on the brightest point of each spot, the centroid is calculated by Eq. 1 in an $(N\times N)$ square centroid detection window. If more than one pixel outputs the maximum gray level (ADU) within a spot, center on one of the pixels with the maximum gray level and calculate the initial centroid position of the spot by Eq. 1 in an $(N/2\times N/2)$ square window; then, centered on the initial centroid position, calculate the centroid of the spot by Eq. 1 in an $(N\times N)$ square centroid detection window. By adaptively centering the window in which the centroid is calculated, the algorithm includes the pixels with the highest signal and signal-to-noise ratio (SNR) and excludes pixels with negligible signal that contribute only unwanted noise, thus improving accuracy.^{23, 24} The purpose of this step is to calculate the spot centroids of the first spot map; it is the prerequisite for tracking and calculating the spot centroids of the subsequent spot maps dynamically. $N$ is calculated by the following expressions:

Note that this step requires that the peak of each spot of the first spot map not leave its centroid detection window and creep into an adjacent window. If this situation occurs, the centroid precision of the first spot map will be relatively low, and the centroid tracking of the subsequent spot maps will fail.
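As a sketch of step 1 for one subaperture (the function names, array shapes, and window size `n` are our assumptions; the expressions for $N$ are not reproduced in the text):

```python
import numpy as np

def centroid_in_window(img, cy, cx, n):
    """Center of gravity (Eq. 1) in an (n x n) window centered at pixel (cy, cx),
    clipped to the image bounds."""
    half = n // 2
    y0, y1 = max(cy - half, 0), min(cy + half + 1, img.shape[0])
    x0, x1 = max(cx - half, 0), min(cx + half + 1, img.shape[1])
    win = img[y0:y1, x0:x1]
    ys, xs = np.indices(win.shape)
    total = win.sum()
    return (y0 + (ys * win).sum() / total,
            x0 + (xs * win).sum() / total)

def first_map_centroid(sub, n):
    """Step 1 for one (Q x Q) subaperture image `sub`: find the brightest
    pixel; if the maximum is shared, refine with an (n/2 x n/2) window first;
    then compute the centroid in an (n x n) window."""
    peaks = np.argwhere(sub == sub.max())
    cy, cx = peaks[0]
    if len(peaks) > 1:
        # several pixels share the maximum ADU: get an initial centroid
        cy, cx = [int(round(v)) for v in centroid_in_window(sub, cy, cx, n // 2)]
    return centroid_in_window(sub, cy, cx, n)
```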

### Step 2

According to the theory of temporal correlation for atmospheric turbulence and human eye aberration, there is a statistical relationship for the spot centroid of the same subaperture between two continuous spot images sampled within the correlation time. That is to say, the relative spot movement of each subaperture between two continuous images is within a finite range, which can be calculated by^{22}:

Based on each spot centroid of the first spot map, calculated in step 1, select an appropriately sized $(M\times M)$ square area and find the brightest point or calculate the initial centroid position in that area. $M$ is calculated with the following expression:

$M$ should be the ceiling of the result of Eq. 9. Centered on the position of the brightest point or the initial centroid, select an appropriately sized $(N\times N)$ square area as the centroid detection window; the centroid of each spot is then calculated by Eq. 1 in the corresponding window. $N$ is obtained by Eq. 7. The purpose of this step is to calculate the spot centroids of the second spot map and to prepare for tracking and calculating the spot centroids of the next spot map dynamically. The peak of each spot of the second spot map may leave its centroid detection window and creep into the adjacent window, and may even be located at the center of the adjacent window.
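A sketch of this step, with hypothetical names, and with `m` and `n` passed in directly since Eqs. 7 and 9 are not reproduced here:

```python
import numpy as np

def centroid_in_window(img, cy, cx, n):
    """Center of gravity (Eq. 1) in an (n x n) window centered at (cy, cx)."""
    half = n // 2
    y0, y1 = max(cy - half, 0), min(cy + half + 1, img.shape[0])
    x0, x1 = max(cx - half, 0), min(cx + half + 1, img.shape[1])
    win = img[y0:y1, x0:x1]
    ys, xs = np.indices(win.shape)
    s = win.sum()
    return y0 + (ys * win).sum() / s, x0 + (xs * win).sum() / s

def track_centroid(img, prev, m, n):
    """Step 2: search an (m x m) area around the previous frame's centroid
    for the brightest pixel, then apply Eq. 1 in an (n x n) window there."""
    py, px = int(round(prev[0])), int(round(prev[1]))
    half = m // 2
    y0, y1 = max(py - half, 0), min(py + half + 1, img.shape[0])
    x0, x1 = max(px - half, 0), min(px + half + 1, img.shape[1])
    search = img[y0:y1, x0:x1]
    dy, dx = np.unravel_index(np.argmax(search), search.shape)
    return centroid_in_window(img, y0 + dy, x0 + dx, n)
```

Because the search area is anchored on the previous centroid rather than on the fixed subaperture grid, the window follows the spot even after it crosses into a neighboring subaperture.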

### Step 3

Repeat step 2 to track and reconstruct the dynamic wavefront.
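The repeated step 2 can be exercised end-to-end on synthetic data; everything below (the Gaussian spot model, the window sizes `m` and `n`, the drift rate) is a hypothetical test scenario, not the authors' data:

```python
import numpy as np

def spot_image(shape, cy, cx, sigma=1.5):
    """Synthetic Gaussian spot for the demonstration."""
    yy, xx = np.indices(shape)
    return np.exp(-((xx - cx)**2 + (yy - cy)**2) / (2 * sigma**2))

def cog(img, cy, cx, n):
    """Eq. 1 in an (n x n) window centered at integer pixel (cy, cx)."""
    h = n // 2
    y0, y1 = max(cy - h, 0), min(cy + h + 1, img.shape[0])
    x0, x1 = max(cx - h, 0), min(cx + h + 1, img.shape[1])
    w = img[y0:y1, x0:x1]
    ys, xs = np.indices(w.shape)
    s = w.sum()
    return y0 + (ys * w).sum() / s, x0 + (xs * w).sum() / s

def track(frames, start, m=9, n=7):
    """Steps 2-3: carry each frame's centroid forward as the search
    center for the next frame (m and n are assumed window sizes)."""
    cy, cx = start
    path = []
    for img in frames:
        py, px = int(round(cy)), int(round(cx))
        h = m // 2
        y0, y1 = max(py - h, 0), min(py + h + 1, img.shape[0])
        x0, x1 = max(px - h, 0), min(px + h + 1, img.shape[1])
        dy, dx = np.unravel_index(np.argmax(img[y0:y1, x0:x1]),
                                  (y1 - y0, x1 - x0))
        cy, cx = cog(img, y0 + dy, x0 + dx, n)
        path.append((cy, cx))
    return path

# A spot drifting 2 pixels per frame stays inside each 9 x 9 search area,
# so the tracker follows it even after it leaves its original subaperture.
frames = [spot_image((40, 40), 15.0, 15.0 + 2.0 * k) for k in range(6)]
path = track(frames, start=(15.0, 15.0))
```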

Figure 3 shows the principle of the algorithm. The centroid detection principles for the first spot map and the subsequent spot maps are shown in Figs. 3 and 3, respectively. In Fig. 3, the cross-shaped star represents the centroid of the standard reference wavefront; the green solid pane represents the area of pixels distributed to each subaperture; the black solid pane represents the centroid detection window of each spot; and the black dashed pane represents the window in which the brightest point or the initial centroid position of each spot of the current spot map is found, based on the corresponding spot centroid of the previous spot map. The white-black disk represents the spot, and the red point in the white-black disk represents the brightest point or the initial centroid position of each spot. (Color online only.)

## 3. Experiments and Analysis of the Dynamic Range Test

To verify the dynamic range of this method, we developed, in Visual C++, new control software implementing the new centroid detection algorithm for an SHWFS made by our group, as well as control software implementing the conventional center-of-gravity method. This sensor has a $15\times 15$ microlens array and a $256\times 256\phantom{\rule{0.3em}{0ex}}\text{pixel}$ CCD. The dynamic range measurement results obtained with the new method are compared with those obtained with the conventional algorithm. Figure 4 shows the schematics of the experimental setup.

We translated the laser diode (LD) along the direction perpendicular to the optical axis to generate tilt aberration and thereby verify the dynamic range of our dynamic tracking centroid algorithm; the movement step was $1\phantom{\rule{0.3em}{0ex}}\mathrm{mm}$ each time. The distance between the sensor and the LD was $243\phantom{\rule{0.3em}{0ex}}\mathrm{mm}$, and the wavefronts were measured in relative mode. The theoretical peak-to-valley (PV) value of the distorted wavefront produced by this method was calculated by the following expression:

$$\mathrm{PV}=\frac{a\,l}{L},$$

where $a$ is the available width of $4.356\phantom{\rule{0.3em}{0ex}}\mathrm{mm}$ for the CCD panel, $l$ is the displacement of the LD, and $L$ is the distance between the sensor and the LD. The measurement results are shown in Fig. 5. Figure 5 shows the reference wavefront. The wavefronts shown in Fig. 5 are those measured by the dynamic tracking centroid algorithm, and the wavefronts shown in Fig. 5 are those measured by the conventional algorithm. The corresponding PV value of the wavefront is shown in Fig. 5. The wavefront measurement errors for the new centroid algorithm and the conventional algorithm are shown in Fig. 5. From Fig. 5, we can see that the wavefront measurement error for the new algorithm is $-0.063\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, while that for the conventional algorithm is $3.873\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, when the LD moves $2\phantom{\rule{0.3em}{0ex}}\mathrm{mm}$; when the LD moves $6\phantom{\rule{0.3em}{0ex}}\mathrm{mm}$, the error is $0.152\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the new algorithm and $64.496\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the conventional algorithm.

In addition, we translated the LD along the direction parallel to the optical axis and toward the SHWFS to generate defocus aberration and verify the dynamic range of the new centroid algorithm. The measurement results are shown in Fig. 6. The wavefronts shown in Fig. 6 are those measured by the new centroid algorithm, and the wavefronts shown in Fig. 6 are those measured by the conventional algorithm. The corresponding radius of the spherical wavefront is shown in Fig. 6. The wavefront measurement errors for the new centroid algorithm and the conventional algorithm are shown in Fig. 6. From Fig. 6, we can see that the wavefront measurement error for the new algorithm is $0.104\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, while that for the conventional algorithm is $0.930\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, when the LD moves $27.5\phantom{\rule{0.3em}{0ex}}\mathrm{cm}$; when the LD moves $32.5\phantom{\rule{0.3em}{0ex}}\mathrm{cm}$, the error is $0.147\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the new algorithm and $30.632\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the conventional algorithm.
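As a quick numeric check, assuming the small-angle relation $\mathrm{PV}=a\,l/L$ (tilt angle $l/L$ across the CCD width $a$) for the theoretical PV:

```python
def theoretical_pv(a_mm, l_mm, dist_mm):
    """Theoretical tilt PV across the CCD width, assuming PV = a*l/L.

    a_mm: available CCD width, l_mm: LD displacement, dist_mm: LD-to-sensor
    distance.  Result in micrometers.
    """
    return a_mm * l_mm / dist_mm * 1.0e3  # mm -> micrometers

# LD displaced 2 mm at 243 mm, with a 4.356-mm CCD width:
print(round(theoretical_pv(4.356, 2.0, 243.0), 1))  # → 35.9
```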

From the results shown in Figs. 5 and 6, we can see that the wavefronts measured by the conventional algorithm were generally distorted as the LD moved, while the ones measured by the dynamic tracking centroid algorithm are quite close to the theoretical ones. The large error of the wavefronts measured by the conventional algorithm is due to some spots leaving their subaperture and being located in neighboring ones.

To verify that this new algorithm is suitable for aspherical wavefront measurements, we used a liquid crystal spatial light modulator (LCSLM) to generate a mixture of spherical plus coma aberration and measured it with the SHWFS using both the new centroid algorithm and the conventional algorithm. The LCSLM used in the experiment, from the U.S. company BNS (Middletown, Rhode Island), had $512\times 512\phantom{\rule{0.3em}{0ex}}\text{pixels}$ with a $2\pi $ phase modulation depth, and the pixel size was $15\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$. The LCSLM had a response frequency of $200\phantom{\rule{0.3em}{0ex}}\mathrm{Hz}$. The experimental configuration is shown in Fig. 7.

The control of the LCSLM is based on Zernike polynomials.^{25} The wavefronts measured by the SHWFS using the new algorithm and the conventional algorithm are shown in Figs. 8 and 8, respectively, for the Zernike coefficients ${Z}_{2}=1$ and ${Z}_{5}=1$, ${Z}_{2}=2$ and ${Z}_{5}=2$, $\dots$, ${Z}_{2}=9$ and ${Z}_{5}=9$ (piston is omitted; ${Z}_{0}$, $x$-tilt; ${Z}_{1}$, $y$-tilt; ${Z}_{2}$, defocus; ${Z}_{3}$, $x$-astigmatism; ${Z}_{4}$, $y$-astigmatism; ${Z}_{5}$, $x$-coma; $\dots$; ${Z}_{35}$. ${Z}_{m}=i$ means that the $m$'th Zernike coefficient is set to $i$ and all other coefficients are set to 0). The corresponding PV value of the wavefront is shown in Fig. 8. The wavefront measurement errors for the new algorithm and the conventional algorithm are shown in Fig. 8. From Fig. 8, we can see that the wavefront measurement error for the new algorithm is $0.097\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, while that for the conventional algorithm is $0.481\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, when ${Z}_{2}=5$ and ${Z}_{5}=5$; when ${Z}_{2}=8$ and ${Z}_{5}=8$, the error is $0.165\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the new algorithm and $6.456\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the conventional algorithm; and when ${Z}_{2}=9$ and ${Z}_{5}=9$, the error is $4.736\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the new algorithm and $8.609\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ for the conventional algorithm. The large wavefront measurement error for the new algorithm in the last case occurs because two or more spots come together.

From these experimental results, we can see that, compared with the conventional algorithm, the new algorithm expands the dynamic range of an SHWFS by a factor of more than 3 in the measurement of tilt aberration, more than 1.3 in the measurement of defocus aberration, and more than 2 in the measurement of the mixture of spherical plus coma aberration. With the conventional algorithm, the centroid detection error becomes very large as soon as the tails of a spot creep into the adjacent window. The new algorithm still performs well when a spot is near the edge of the window of pixels associated with a lenslet, where the peak of the spot is within the window but its tails creep into the adjacent one. Therefore, the wavefront can be reconstructed by the new algorithm when some spots leave their subapertures and are located in neighboring ones, but the algorithm is not suitable for the situation where two or more spots come together when some kinds of aberrations are very strong, as shown in Fig. 2.

Finally, to verify whether the algorithm also performs well for dynamic aberration measurements, a series of gray maps with simulated and real eye aberrations was sent to the aforementioned LCSLM successively, and the wavefronts were measured by the SHWFS with the new centroid algorithm. The sending frequency was $10\phantom{\rule{0.3em}{0ex}}\mathrm{Hz}$. The simulated eye aberrations were a mixture of defocus, astigmatism, coma, and spherical aberration (the Zernike coefficients ${Z}_{2}={Z}_{3}={Z}_{4}={Z}_{5}={Z}_{6}={Z}_{7}=i$, $i=0,0.1,0.2,0.3\dots 1,0.9,0.8,0.7\dots 0\dots$). The real eye aberrations were those of subject NZ's eye. The theoretical wavefront maps and the wavefront maps measured by the SHWFS with the new centroid algorithm for the simulated dynamic eye aberrations are shown in Video 1, and the wavefront measurement error is shown in Fig. 9. Video 2 shows the theoretical wavefront maps and the measured wavefront maps for the real dynamic eye aberrations; the wavefront measurement error is shown in Fig. 10. From these experimental results, we can see that the new algorithm performs well for dynamic aberration measurements.

## 4. Human Eye Aberration Correction Experiments

To verify the feasibility of the new algorithm for highly aberrated eyes, an AO system for retina imaging correction was implemented. Figure 11 shows the layout of the AO system, which consists of a wavefront sensor, a wavefront corrector (an LCSLM), and a control computer. An LD of $808\text{-}\mathrm{nm}$ wavelength was used for illumination. The LCSLM and the SHWFS were the same as those used in the previous experiments, with the control software we developed using the new algorithm. The wavefront correction was performed by a modal closed-loop algorithm using a direct wavefront compensation method. Using this system, three subjects with different myopias and astigmatisms were measured in the laboratory.
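For intuition, a modal closed loop of the kind mentioned here can be sketched as follows; the slope-response matrix `Dz`, the loop gain, and the least-squares reconstruction are illustrative assumptions, not the authors' actual controller:

```python
import numpy as np

def closed_loop(Dz, aberration, gain=0.3, iters=30):
    """Illustrative modal closed loop for AO correction.

    Dz maps Zernike coefficients to measured subaperture slopes.  Each
    iteration estimates the residual modal coefficients by least squares
    and integrates them onto the corrector with a loop gain; the corrector
    state converges toward the negative of the aberration.
    """
    corrector = np.zeros(Dz.shape[1])
    for _ in range(iters):
        slopes = Dz @ (aberration + corrector)       # residual slopes seen by the SHWFS
        residual, *_ = np.linalg.lstsq(Dz, slopes, rcond=None)
        corrector -= gain * residual                 # integrate onto the LCSLM
    return corrector

# Hypothetical 60-slope, 10-mode system:
rng = np.random.default_rng(0)
Dz = rng.normal(size=(60, 10))
true_aberration = rng.normal(size=10)
correction = closed_loop(Dz, true_aberration)
# `correction` converges toward -true_aberration
```

With a gain of 0.3, the residual shrinks by a factor of 0.7 per iteration, so roughly 30 iterations reduce it by about four orders of magnitude.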

First, subject DL with 3-D myopia was tested. Figure 12 shows the wavefront maps and fundus image before and after correction. The wavefront distortion of $3.745\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ (PV value) was corrected to $0.085\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ (PV value). The fundus image after correction reached the diffraction-limited resolution. Then, subject CL with 5-D myopia was tested. The wavefront distortion in PV before and after correction was 5.444 and $0.134\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ , as shown in Figs. 13 and 13 , respectively. The fundus image is shown in Figs. 13 and 13. Finally, subject LMX with 5-D myopia and 2-D astigmatism was tested. The wavefront distortion in PV before and after correction was 9.509 and $0.216\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$ , as shown in Figs. 14 and 14 , respectively. Figures 14 and 14 show the fundus image before and after correction. From these results, we see that the algorithm is feasible for highly aberrated eyes.

## 5. Conclusions

A dynamic tracking spot centroid detection algorithm for an SHWFS is presented and investigated experimentally. It uses each spot centroid of the previous spot map to track and calculate the corresponding spot centroid of the current spot map, exploiting the strong correlation of the wavefront slope, and hence of the corresponding spot centroid, between temporally adjacent SHWFS measurements; i.e., between adjacent measurements, the spot centroid movement usually falls within some finite range. The dynamic range test results using the dynamic tracking algorithm are presented. These results indicate that, compared with the conventional algorithm, the dynamic range of an SHWFS can be expanded by a factor of more than 3 in the measurement of tilt aberration, more than 1.3 in the measurement of defocus aberration, and more than 2 in the measurement of the mixture of spherical plus coma aberration. The experimental results also show that the new algorithm performs well for dynamic aberration measurements. The wavefronts can be reconstructed by the new algorithm when some spots leave their subapertures for neighboring ones, but the algorithm is not suitable for the situation where two or more spots come together, as when some kinds of aberrations are very strong.

In addition, an AO system for retina imaging correction was implemented to prove the feasibility of the algorithm for highly aberrated eyes. Using this system, three subjects with differing myopias and astigmatisms were tested in the laboratory. For the subjects with 3-D myopia; 5-D myopia; and 5-D myopia with 2-D astigmatism, the wavefront distortions in PV before and after correction were 3.754 and $0.085\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, 5.444 and $0.134\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, and 9.509 and $0.216\phantom{\rule{0.3em}{0ex}}\mu \mathrm{m}$, respectively, and clear retina images were obtained.

## Acknowledgments

This work was supported by the National Natural Science Foundation of China (grant numbers 60578035, 50473040, and 60736042) and the Science Foundation of Jilin Province (numbers 20050520 and 20050321-2).