Adaptive optics (AO) systems deliver high-resolution images that are ideal for precisely measuring the positions of stars (i.e., astrometry), provided the system has stable and well-calibrated geometric optical distortions. A calibration unit equipped with a back-illuminated pinhole mask can be used to measure instrumental optical distortions. AO systems on the largest ground-based telescopes, such as those at the W. M. Keck Observatory and the Thirty Meter Telescope (TMT), require pinhole positions known to ∼20 nm to achieve an astrometric precision of 0.001 of a resolution element. In pursuit of that goal, we characterize a photolithographic pinhole mask and explore the systematic errors that result from different experimental setups. Using the mask, we characterized the nonlinear geometric distortion of a simple imaging system, measuring 857 nm root mean square (RMS) of optical distortion with a final residual of 39 nm (equivalent to 20 μas for TMT). We model the optical distortion with a sixth-order bivariate Legendre polynomial while allowing the reference positions of the individual pinholes to vary. The nonlinear deviations of the pinhole pattern from the manufactured design of a square grid are 47.2 nm ± 4.5 nm (random) ± 10.8 nm (systematic) over an area of 1788 mm². These deviations represent the additional error incurred when the pinhole mask is assumed to be manufactured perfectly square. We also find that ordered mask distortions are significantly more difficult to characterize than random mask distortions, as ordered distortions can alias into optical camera distortion; future design simulations for astrometric calibration units should therefore include ordered mask distortions. We conclude that photolithographic pinhole masks are >10 times better than the pinhole masks deployed in first-generation AO systems and are sufficient to meet the distortion calibration requirements of the upcoming 30-m-class telescopes.
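As a hedged sketch of the modeling approach described above (an illustration, not the authors' code), a bivariate Legendre distortion model of a given degree can be fit by least squares after normalizing coordinates to the polynomials' natural [-1, 1] domain; the grid size, degree, and injected distortion below are invented for the example.

```python
import numpy as np
from numpy.polynomial import legendre

def fit_legendre_distortion(x, y, dx, deg=6):
    """Least-squares fit of dx(x, y) with products L_i(x)*L_j(y), i + j <= deg."""
    # Normalize coordinates to [-1, 1], the natural Legendre domain.
    xn = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    yn = 2 * (y - y.min()) / (y.max() - y.min()) - 1
    cols = []
    for i in range(deg + 1):
        for j in range(deg + 1 - i):
            ci = np.zeros(i + 1); ci[i] = 1.0   # coefficient vector selecting L_i
            cj = np.zeros(j + 1); cj[j] = 1.0   # coefficient vector selecting L_j
            cols.append(legendre.legval(xn, ci) * legendre.legval(yn, cj))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, dx, rcond=None)
    return coeffs, A

# Toy example: a purely linear distortion dx = 0.001 * x over a 20x20 grid
# should be absorbed exactly by the low-order terms of the model.
x, y = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
x, y = x.ravel(), y.ravel()
coeffs, A = fit_legendre_distortion(x, y, 0.001 * x, deg=6)
resid = 0.001 * x - A @ coeffs
print(np.abs(resid).max() < 1e-9)
```

In practice each pinhole's measured displacement in both axes would be fit this way, with the pinhole reference positions treated as additional free parameters, as the abstract describes.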
The InfraRed Imaging Spectrograph (IRIS) is a first-light instrument for the Thirty Meter Telescope (TMT) that will sample the adaptive optics field corrected by NFIRAOS with a near-infrared (0.8 - 2.4 µm) imaging camera and integral field spectrograph (IFS). To understand the science-case specifications of the IRIS instrument, we use the IRIS data simulator to characterize the photometric precision and accuracy of the IRIS imager. We present the results of an investigation into the effects of potential ghosting in the IRIS optical design. Each source in the IRIS imager field of view produces ghost images on the detector from IRIS's wedge filters, entrance window, and Atmospheric Dispersion Corrector (ADC) prism. We incorporated each of these ghosts into the IRIS simulator by simulating a point source of appropriate magnitude at a specified pixel offset and, for the extended ghosts, by redistributing the flux evenly over the area specified by IRIS's optical design. We simulated the impact of ghosting on the photometric capabilities and found that ghosts generally have a negligible effect on the flux counts of point sources, except in extreme cases where a ghost coaligns with a star that is Δm > 2 fainter than the ghost's parent source. Lastly, we explore the photometric precision and accuracy for single-source and crowded-field photometry with the IRIS imager.
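The Δm > 2 criterion has a simple back-of-the-envelope reading (my illustration, not taken from the IRIS simulator): a ghost that coaligns with a star adds a flux fraction 10^(-0.4·Δm), which biases the measured magnitude accordingly.

```python
import math

def ghost_bias_mag(dm):
    """Magnitude bias on a star from a coaligned ghost dm magnitudes fainter.

    The ghost adds a flux fraction 10**(-0.4*dm), shifting the measured
    magnitude by -2.5*log10(1 + 10**(-0.4*dm)).
    """
    return -2.5 * math.log10(1.0 + 10.0 ** (-0.4 * dm))

for dm in (0.0, 2.0, 5.0):
    print(f"ghost {dm:.0f} mag fainter than the star -> bias {ghost_bias_mag(dm):+.3f} mag")
```

A ghost 5 mag fainter biases the photometry by only ~0.01 mag, while one of comparable brightness corrupts it at the ~0.75 mag level, which is why only the near-coincident, bright-ghost cases matter.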
TMT has defined the accuracy to be achieved for both absolute and differential astrometry in its top-level requirements documents. Because of the complexities of different types of astrometric observations, these requirements cannot be used to specify system design parameters directly. The TMT astrometry working group therefore developed detailed astrometry error budgets for a variety of science cases. These error budgets detail how astrometric errors propagate through the calibration, observing, and data reduction processes. The budgets need to be condensed into sets of specific requirements that can be used by each subsystem team for design purposes. We show how this flowdown from error budgets to design requirements is achieved for the case of TMT's first-light Infrared Imaging Spectrograph (IRIS) instrument.
The Thirty Meter Telescope (TMT) first light instrument IRIS (Infrared Imaging Spectrograph) will complete its preliminary design phase in 2016. The IRIS instrument design includes a near-infrared (0.85 - 2.4 micron) integral field spectrograph (IFS) and imager that are able to conduct simultaneous diffraction-limited observations behind the advanced adaptive optics system NFIRAOS. The IRIS science cases have continued to be developed and new science studies have been investigated to aid in technical performance and design requirements. In this development phase, the IRIS science team has paid particular attention to the selection of filters, gratings, sensitivities of the entire system, and science cases that will benefit from the parallel mode of the IFS and imaging camera. We present new science cases for IRIS using the latest end-to-end data simulator on the following topics: Solar System bodies, the Galactic center, active galactic nuclei (AGN), and distant gravitationally-lensed galaxies. We then briefly discuss the necessity of an advanced data management system and data reduction pipeline.
IRIS (InfraRed Imaging Spectrograph) is a first light near-infrared diffraction limited imager and integral field
spectrograph being designed for the future Thirty Meter Telescope (TMT). IRIS is optimized to perform astronomical
studies across a significant fraction of cosmic time, from our Solar System to distant newly formed galaxies (Barton et
al.). We present a selection of the innovative science cases that are unique to IRIS in the era of upcoming space and
ground-based telescopes. We focus on integral field spectroscopy of directly imaged exoplanet atmospheres, probing
fundamental physics in the Galactic Center, measuring supermassive black hole masses of 10⁴ to 10¹⁰ M☉, resolved
spectroscopy of young star-forming galaxies (1 < z < 5) and first-light galaxies (6 < z < 12), and resolved spectroscopy
of strongly gravitationally lensed sources to measure dark matter substructure. For each of these science cases we use the
IRIS simulator (Wright et al., Do et al.) to explore IRIS capabilities. To highlight the unique IRIS capabilities, we
also update the point and resolved source sensitivities for the integral field spectrograph (IFS) in all five broadband
filters (Z, Y, J, H, K) for the finest spatial scale of 0.004" per spaxel. We briefly discuss future development plans for the
data reduction pipeline and quicklook software for the IRIS instrument suite.
The TMT first light Adaptive Optics (AO) facility consists of the Narrow Field Infra-Red AO System (NFIRAOS) and the associated Laser Guide Star Facility (LGSF). NFIRAOS is an order 60 × 60 laser guide star (LGS) multi-conjugate AO (MCAO) system, which provides uniform, diffraction-limited performance in the J, H, and K bands over 17-30 arc sec diameter fields with 50 per cent sky coverage at the galactic pole, as required to support the TMT science cases. NFIRAOS includes two deformable mirrors, six laser guide star wavefront sensors, and three low-order, infrared, natural guide star wavefront sensors within each client instrument. The first light LGSF system includes six sodium lasers required to generate the NFIRAOS laser guide stars. In this paper, we provide an update on the progress in designing, modeling, and validating the TMT first light AO systems and their components over the last two years. This includes pre-final design and prototyping activities for NFIRAOS, preliminary design and prototyping activities for the LGSF, design and prototyping for the deformable mirrors, fabrication and tests for the visible detectors, benchmarking and comparison of different algorithms and processing architectures for the Real Time Controller (RTC), and development and tests of prototype candidate lasers. Comprehensive and detailed AO modeling is continuing to support the design and development of the first light AO facility. The main modeling topics studied during the last two years include further studies in the areas of the wavefront error budget, sky coverage, high-precision astrometry for the galactic center and other observations, high-contrast imaging with NFIRAOS and its first light instruments, Point Spread Function (PSF) reconstruction for LGS MCAO, LGS photon return, and sophisticated low-order mode temporal filtering.
The Thirty Meter Telescope (TMT) with its first-light multi-conjugate adaptive optics system, NFIRAOS, and high-resolution imager, IRIS, is expected to take differential astrometric measurements with an accuracy on the order of tens of micro arcsec. This requires the control, correction, characterization, and calibration of a large number of error sources and uncertainties, many of which have magnitudes much in excess of this level of accuracy. In addition to designing the observatory such that very high precision and accuracy astrometric observations are enabled, satisfying the TMT requirements can only be achieved by a careful calibration, observation, and data reduction strategy. In this paper, we present descriptions of the individual error sources, how and when they apply to different astrometry science cases, and the mitigation methods required for each of them, as well as example results for individual error terms and the overall error budgets for a variety of different science cases.
During the site testing campaign for the Thirty Meter Telescope (TMT), measurements of soil surface properties were obtained in addition to the optical conditions of the atmosphere. The dust concentration in the air was measured with dust sensors mounted underneath the mounts of the site monitoring telescopes. Ground heat fluxes and soil temperatures were measured several centimeters into the ground. On Cerro Armazones it was also possible to conduct an experiment to measure the heat conduction of the soil. In this paper, all of these measurements are described, and the results and their potential use are summarized.
We provide an update on the development of the first light adaptive optics systems for the Thirty Meter Telescope
(TMT) over the past two years. The first light AO facility for TMT consists of the Narrow Field Infra-Red AO
System (NFIRAOS) and the associated Laser Guide Star Facility (LGSF). This order 60 × 60 laser guide star
(LGS) multi-conjugate AO (MCAO) architecture will provide uniform, diffraction-limited performance in the
J, H, and K bands over 17-30 arc sec diameter fields with 50 per cent sky coverage at the galactic pole, as
is required to support TMT science cases. Both NFIRAOS and the LGSF have successfully completed design
reviews during the last twelve months. We also report on recent progress in AO component prototyping, control
algorithm development, and system performance analysis.
Between February and April 2009 a number of ultrasonic anemometers, temperature probes and dust sensors were
operated inside the CTIO Blanco telescope dome. These sensors were distributed such that temperature and
three-dimensional wind speed were monitored along the line of sight of the telescope. During telescope operations,
occasional seeing measurements were obtained using the Mosaic CCD imager and the CTIO site monitoring MASS-DIMM
system. In addition, a Lunar Scintillometer (LuSci) was operated over the course of a few nights inside the
dome. We describe the instrumental setup and first preliminary results on the linkage of the atmospheric conditions
inside the dome to the overall image quality.
As part of a program to measure and evaluate atmospheric turbulence on mountains at the most northerly tip of North
America, we have deployed two SODARs and a lunar scintillometer at the Polar Environment Atmospheric Research
Lab (PEARL) located on a 600m-high ridge near Eureka on Ellesmere Island, at 80° latitude. This paper discusses the
program and presents a summary of ground-layer turbulence and seeing measurements from the 2009-10 observing season.
Atmospheric optical turbulence is the main driver of wavefront distortions which affect optical telescope performance.
Therefore, many techniques have been developed to measure the optical turbulence strength along the line of sight.
Based on data collected with the MASS (Multi Aperture Scintillation Sensor), we show that a large sample of such
measurements can be used to assess the average three-dimensional turbulence distribution above ground.
The use of, and a more sophisticated instrumental setup for, such turbulence tomography will be discussed.
With the development of increasingly larger and more complex telescopes and instrumentation, site testing and
characterization efforts also increase in both magnitude and complexity. This happens because the investment
into larger observatories is higher and because new technologies, such as adaptive optics, require knowledge about
parameters that did not matter previously, such as the vertical distribution of turbulence. We present examples
of remaining questions which, to date, are not generally addressed by "standard" site characterization efforts,
either because they are technically not (yet) feasible or because they are impractical. We center our observations
around the experience gained during the Thirty Meter Telescope (TMT) site testing effort with an emphasis
on turbulence measurements, but our findings are applicable in general to other current and future projects as well.
Adaptive optics (AO) is essential for many elements of the science case for the Thirty Meter Telescope (TMT). The
initial requirements for the observatory's facility AO system include diffraction-limited performance in the near IR, with
50 per cent sky coverage at the galactic pole. Point spread function uniformity and stability over a 30 arc sec field-of-view
are also required for precision photometry and astrometry. These capabilities will be achieved via an order 60×60
multi-conjugate AO system (NFIRAOS) with two deformable mirrors, six laser guide star wavefront sensors, and three
low-order, IR, natural guide star wavefront sensors within each client instrument. The associated laser guide star facility
(LGSF) will employ 150 W of laser power at a wavelength of 589 nm to generate the six laser guide stars.
We provide an update on the progress in designing, modeling, and validating these systems and their components over
the last two years. This includes work on the layouts and detailed designs of NFIRAOS and the LGSF; fabrication and
test of a full-scale prototype tip/tilt stage (TTS); Conceptual Design Studies for the real time controller (RTC) hardware
and algorithms; fabrication and test of the detectors for the
laser- and natural-guide star wavefront sensors; AO system
modeling and performance optimization; lab tests of wavefront sensing algorithms for use with elongated laser guide
stars; and high resolution LIDAR measurements of the mesospheric sodium layer. Further details may be found in
specific papers on each of these topics.
Seeing stability is an important criterion of site characterization. Two sites, with the same seeing statistics, could in
principle differ in their temporal stability and hence have their observatories perform differently. Temporal variability
can, however, be defined in several ways, each of which may affect the performance of the observatories in a different
manner. In this paper, we propose three methods of measuring variability, each focusing on a different application: Selection
(maximization of observation time), Image quality (seeing variation within a given integration time), and finally
Scheduling (prediction of seeing fluctuation on a given time scale). We apply these methods to the seeing of the TMT
candidate sites to determine their stability properties.
Light pollution can create difficulties for astronomers attempting to observe faint objects in the night sky. Light
from a local small town can be just as intrusive as light from a large city in the distance. As the population
of the Earth increases, light pollution will become more of a problem, even in remote areas. The Thirty Meter
Telescope site testing program has measured light pollution at the candidate sites using all-sky cameras;
an analysis procedure enhances the all-sky camera images to determine the effects of the light
pollution. This paper summarizes the light pollution analysis procedure and current results, which indicate that light
pollution is not currently a significant factor in selecting the final TMT telescope location.
The Thirty Meter Telescope (TMT) project has been collecting data on five candidate sites since 2003. This paper
describes the site testing portion of the TMT site selection program and the process and standards employed
by it. This includes descriptions of the candidate sites, the process by which they were identified, the site
characterization instrument suite and its calibration and the available results, which will be published shortly.
All Sky Cameras were deployed at all Thirty Meter Telescope (TMT) candidate sites. The images gathered
by these cameras were used to assess the cloud statistics for each site. We describe two methods that were
developed to do this, a manual method based on inspection of blue and red movies, and an automated method
based on photometric analysis of the images.
One of the main tools used in the TMT site testing campaign is the turbulence profiler MASS. We describe
empirical investigations and a side by side comparison of two MASS systems which were performed in order to
identify the accuracy of MASS turbulence data and its dependence on the instrument calibration. The accuracy
of the total seeing delivered by the TMT MASS systems is found to be better than 0.05 arcsec. The combination of
MASS and DIMM makes it possible to observe the seeing within the first few hundred meters of the atmosphere and can be
used to investigate possible correlations with meteorological parameters measured close to the ground. We also
compare the detection of clouds and cirrus by means of MASS data (LOSSAM method) with measurements of
the thermal emission of clouds using a net radiation sensor. These methods are compared with the visual cloud
detection using all sky cameras.
Differential Image Motion Monitors (DIMMs) have become the industry standard for astronomical site characterization.
The calibration of DIMMs is generally considered routine, but we show that particular care
must be taken if high-accuracy measurements are to be achieved. In a side-by-side comparison of
several DIMMs, we demonstrate that with proper calibration we can characterize the seeing to better than ±0.02 arcsec.
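For context, the standard DIMM reduction (the textbook Sarazin & Roddier relation, not necessarily the exact pipeline used in this work) converts the variance of differential image motion between two sub-apertures directly into r0 and seeing; the aperture geometry and measured motion below are example values.

```python
import numpy as np

def r0_from_dimm(var_rad2, D=0.1, d=0.2, lam=500e-9):
    """Fried parameter from the longitudinal differential image-motion
    variance (rad^2) of two sub-apertures of diameter D at separation d."""
    K = 2.0 * lam**2 * (0.179 * D ** (-1.0 / 3.0) - 0.0968 * d ** (-1.0 / 3.0))
    return (K / var_rad2) ** (3.0 / 5.0)

# Example: 0.35 arcsec RMS longitudinal differential motion.
sigma = np.radians(0.35 / 3600.0)
r0 = r0_from_dimm(sigma**2)
seeing = np.degrees(0.98 * 500e-9 / r0) * 3600.0
print(f"r0 = {r0 * 100:.1f} cm, seeing = {seeing:.2f} arcsec")
```

A pixel-scale calibration error enters the variance quadratically and hence r0 as the -6/5 power, which is one reason DIMM calibration propagates so strongly into the reported seeing.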
The Thirty Meter Telescope (TMT) project is currently testing six remote sites as candidates for the final location
of the telescope. Each site has several instruments, including seeing monitors, weather stations, and turbulence
profile measuring systems, each of which is computer controlled. As the sites are remote (usually hours from
the nearest town), they require a system that can control the operations of all the varied subsystems, keep the
systems safe from damage and recover from errors during operation. The robotic system must also be robust
enough to operate without human intervention and when internet connections are lost. It is also critical that a
data archiving system diligently records all data as gathered. This paper is a discussion of the TMT site testing
robotic computer system as implemented.
The Thirty Meter Telescope (TMT) site testing team is developing a suite of instruments to measure the atmospheric and optical characteristics of candidate TMT sites. Identical sets of robotically operated instruments will be placed at each candidate site. The fully developed system will comprise a combined MASS/DIMM, a SODAR, tower-mounted thermal probes, and a portable DIMM. These instruments have overlapping altitude coverage and provide a measure of the Cn² profile from the ground up with sufficient resolution to draw conclusions about the ground-layer and high-altitude turbulence characteristics. The overlapping altitude coverage is essential to ensure consistency between these very different instruments. In addition to checking for consistency in the overlap regions, procedures are being used to cross-check between instruments, e.g., calculating the isoplanatic angle from both the MASS and the DIMM, and verifying that the integrals of the Cn² profiles from the MASS, SODAR, and 30 m tower give the same r0 value as measured by the DIMM.
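The r0 cross-check mentioned above can be sketched as follows (a hedged illustration with an invented Cn² profile, not site data): integrating the profile gives r0 through the standard Fried relation, which can then be compared with the DIMM value.

```python
import numpy as np

def r0_from_cn2(h_m, cn2, lam=500e-9, zenith_deg=0.0):
    """r0 = [0.423 * (2*pi/lam)**2 * sec(z) * integral(Cn2 dh)]**(-3/5)."""
    k = 2.0 * np.pi / lam
    # Trapezoidal integral of the profile over altitude.
    J = np.sum(0.5 * (cn2[1:] + cn2[:-1]) * np.diff(h_m))
    sec_z = 1.0 / np.cos(np.radians(zenith_deg))
    return (0.423 * k**2 * sec_z * J) ** (-3.0 / 5.0)

# Invented profile: a strong ground layer plus weaker free-atmosphere layers.
h = np.array([0.0, 50.0, 200.0, 1e3, 4e3, 8e3, 16e3])             # altitude [m]
cn2 = np.array([1e-15, 5e-16, 2e-16, 5e-17, 2e-17, 3e-17, 5e-18])  # [m^(-2/3)]
r0 = r0_from_cn2(h, cn2)
seeing = np.degrees(0.98 * 500e-9 / r0) * 3600.0
print(f"r0 = {r0 * 100:.1f} cm, seeing = {seeing:.2f} arcsec")
```

Because the MASS, SODAR, and tower probes each cover a different altitude range, evaluating the same integral over each instrument's slice is what makes the overlap-region consistency check possible.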
We discuss a variation of the traditional DIMM system in which we employ a continuous drift mode readout technique giving a maximum of nearly 300 samples per second.
Findings of our major equipment testing campaigns and first field deployment are presented that demonstrate our progress in developing a rigorous approach to site testing.
I describe the procedures used during the different phases of site testing for astronomical telescopes, that is, during the pre-selection of candidate sites, on-site testing (site selection), the site decision itself and characterization of the telescope site once the decision has been made. Many of the important parameters for astronomical night-time telescopes and some of the methods that can be used to determine these parameters are described. There appear to be no fundamental differences between the methods used for telescopes in the 5 to 10 meter range and telescopes with diameters larger than 10 meters, although specific differences certainly exist.
We briefly recall the principle of the polychromatic laser guide star, which aims at providing measurements of the tilt of incoming wavefronts with 100% sky coverage. We describe the main results of the feasibility study of this concept undertaken within the ELP-OA programme. Finally, we summarize our plans for a full demonstrator at Observatoire de Haute-Provence.
The wavefront sensors of adaptive optics systems of astronomical telescopes collect an abundance of high temporal resolution information about the distortions that are introduced to the incoming wavefront by atmospheric turbulence. Although this information can theoretically be used to analyze the turbulence conditions above the telescope at the given time, it is often discarded. The reason for this dismissal of seemingly useful information is usually the difficulty of separating atmospheric and instrumental contributions to the wavefront sensor measurements and thus of obtaining reliable estimates of the atmospheric turbulence conditions. In this paper we describe an effort to overcome these problems for wavefront sensor measurements taken by the Keck telescopes on Mauna Kea. We discuss different methods of deriving turbulence parameters, such as coherence length and time and the outer scale of turbulence, and present first results.
The California Extremely Large Telescope, CELT, is a proposed 30-m telescope. Choosing the best possible site for CELT is essential in order to extract the best science from the observations and to reduce the complexity of the telescope. Site selection is therefore currently one of the most critical pacing items of the CELT project. In this paper, we first present selected results from a survey of the atmospheric transparency at optical and infrared wavelengths over the southwestern USA and northern Mexico using satellite data. Results of a similar study of South America have been reported elsewhere. These studies will serve as the pre-selection criterion of the sites at which we will perform on-site testing. We then describe the current status of on-site turbulence evaluation efforts and the future plans of the CELT site testing program.
We describe the current status of the ELP-OA project in which we try to demonstrate in practice that it is possible to measure the tilt of a wave front using only a polychromatic laser guide star and no natural guide star. The first phase of ELP-OA, consisting of feasibility experiments, has recently been completed successfully. This paper provides an overview over the results of this first phase and over the continuation of the ELP-OA project.
Adaptive optics at astronomical telescopes aims at correcting in real time the phase corrugations of incoming wavefronts caused by the turbulent atmosphere, as proposed early on by Babcock. Measuring the phase errors requires a bright source located within the isoplanatic patch of the program source. The probability that such a reference source exists is a function of the wavelength, of the required image quality (Strehl ratio), of the turbulence optical properties, and of the direction of the observation. It turns out that the sky coverage is disastrously low, in particular in the visible wavelength range where, unfortunately, the gain in spatial resolution brought by adaptive optics is the largest. Foy and Labeyrie have proposed to overcome this difficulty by creating an artificial point source in the sky in the direction of the observation, relying on the backscattered light due to a laser beam. This laser guide star (hereinafter referred to as LGS) can be bright enough to allow us to accurately measure the wavefront phase errors, except for two modes: the piston (not relevant in this case) and the tilt. Pilkington has emphasized that the round trip time of the laser beam to the mesosphere, where the LGS is most often formed, is significantly shorter than the typical tilt coherence time; the inverse-return-of-light principle then causes the deflections of the outgoing and ingoing beams to cancel. The apparent direction of the LGS is independent of the tilt. Therefore the tilt cannot be measured from the LGS alone. Until now, the way to overcome this difficulty has been to use a natural guide star to sense the tilt. Although the tilt is sensed through the entire telescope pupil, one cannot use a faint source because ≈90% of the variance of the phase error is in the tilt. Therefore, correcting the tilt requires a higher accuracy of the measurements than for higher orders of the wavefront.
Hence current adaptive optics devices coupled with an LGS face low sky coverage. Several methods have been proposed to achieve partial sky coverage for the tilt. The only one providing full sky coverage is the polychromatic LGS (hereafter referred to as PLGS). We present here a progress report on the R&D program Etoile Laser Polychromatique et Optique Adaptative (ELP-OA), carried out in France to develop the PLGS concept. After a short review of the principles of the PLGS, we discuss the goals of ELP-OA and the steps required to bring it into operation. Finally, we briefly describe the efforts in Europe to develop the LGS.
We present results from measurements of the return flux from a polychromatic sodium laser guide star produced in Pierrelatte, France during the PASS-2 experiment. In the experiment, photometry of light at 330, 569, 589, and 589.6 nm emitted by mesospheric sodium under two-color laser excitation (569 and 589 nm) was performed. The variation of oscillator and laser configurations as well as simultaneous measurements of the atmospheric coherence length and the mesospheric sodium density permit a comparison of the results with atomic physics models. Using the results, we can determine the setup that produces the maximum return flux from the polychromatic laser guide star. The knowledge gained will be used to aid the ELP-OA project, which has as its goal the design, testing, and implementation of an adaptive optics system that uses a polychromatic laser guide star for wave front tilt measurements.
Adaptive optics at astronomical telescopes aims at correcting in real time the phase corrugations of incoming wavefronts caused by the turbulent atmosphere, as proposed early on by Babcock. Measuring the phase errors requires a bright source, which is located within the isoplanatic patch of the program source. The probability that such a reference source exists is a function of the wavelength of the observation, of the required image quality (Strehl ratio), of the turbulence optical properties, and of the direction of the observation. Several papers have addressed the problem of sky coverage as a function of these parameters (see e.g. Le Louarn et al). It turns out that the sky coverage is disastrously low, in particular in the short (visible) wavelength range where, unfortunately, the gain in spatial resolution brought by adaptive optics is the largest. Foy and Labeyrie have proposed to overcome this difficulty by creating an artificial point source in the sky in the direction of the observation, relying on the backscattered light due to a laser beam. This laser guide star (hereafter referred to as LGS) can be bright enough to allow us to accurately measure the wavefront phase errors, except for two modes: the piston (which is not relevant in this case) and the tilt. Pilkington has emphasized that the round trip time of the laser beam to the mesosphere, where the LGS is most often formed, is significantly shorter than the typical tilt coherence time; the inverse-return-of-light principle then causes the deflections of the outgoing and ingoing beams to cancel. The apparent direction of the LGS is independent of the tilt. Therefore the tilt cannot be measured from the LGS alone. Until now, the way to overcome this difficulty has been to use a natural guide star to sense the tilt. Although the tilt is sensed through the entire telescope pupil, one cannot use a faint source because approximately 90% of the variance of the phase error is in the tilt.
Therefore, correcting the tilt requires a higher accuracy of the measurements than for higher orders of the wavefront. Hence current adaptive optics devices coupled with an LGS face low sky coverage. Several methods have been proposed to achieve partial or total sky coverage for the tilt, such as the dual adaptive optics concept, the elongation perspective method, or the polychromatic LGS (hereafter referred to as PLGS). We present here a progress report on the R&D program Etoile Laser Polychromatique et Optique Adaptative (ELP-OA), carried out in France to develop the PLGS concept. After a short review of the principles of the PLGS, we discuss the goals of ELP-OA and the steps required to bring it into operation.
We present a technique that can be used to quantify the frozen flow hypothesis with data from wavefront sensors such as those found in adaptive optics systems. The method is first tested with simulated data. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we then find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales on the order of 1-10 milliseconds, but that significant deviations from the frozen flow behavior are found for longer time scales.
PASS-2 is an experiment designed to perform photometry of the polychromatic laser guide star. The tilt of an atmospherically distorted wave front coming from an astronomical object cannot be determined with a monochromatic laser guide star. If it is possible to produce a laser guide star that emits light at different wavelengths, however, the tilt can be determined from the measurable differences between the tilts at the different wavelengths. This is the concept of the polychromatic laser guide star. The PASS-2 experiment is a step towards an implementation of an adaptive optics system that uses a polychromatic laser guide star for the wave front tilt measurement. The goal of the experiment is to validate the feasibility of a polychromatic laser guide star adaptive optics system and to determine the laser parameters that produce the optimal return flux from the polychromatic laser guide star. To this end, the return flux from the polychromatic laser guide star at 330 and 589.6 nm will be measured as a function of laser parameters, atmospheric conditions, and the density of the mesospheric sodium layer.
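The size of the effect PASS-2 must measure can be estimated from air dispersion. The sketch below (my illustration; the coefficients are the Edlén 1966 approximation for standard air, not values from the experiment) shows that the differential tilt between 330 nm and 589.6 nm is only about 1/25 of the full tilt, which is why maximizing the return flux matters.

```python
def refractivity(lam_um):
    """(n - 1) of standard air, Edlen (1966) approximation; lam_um in microns."""
    s2 = (1.0 / lam_um) ** 2
    return (8342.13 + 2406030.0 / (130.0 - s2) + 15997.0 / (38.9 - s2)) * 1e-8

# Tilt scales with n(lam) - 1, so the full tilt is recovered from the
# measured differential tilt via theta = dtheta * (n - 1) / dn.
dn = refractivity(0.330) - refractivity(0.5896)
gain = refractivity(0.5896) / dn
print(f"full tilt is ~{gain:.0f}x the 330/589.6 nm differential tilt")
```

This amplification factor is what sets the photon-flux requirement: the differential tilt must be measured ~25 times more precisely than the tilt correction ultimately needed.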
We show how to resolve the autocorrelation of wavefront-sensor measurements, as they are commonly taken by adaptive-optics systems, into terms due to individual atmospheric layers. We exploit this factorization to measure the strength and wind speed of each turbulent layer. We show that the Kolmogorov theory provides a good description of the data and find values of Fried's parameter, r0, that are in good agreement with other measurements. Finally, a quantitative estimate of the validity of Taylor's frozen flow hypothesis is obtained.
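Under frozen flow, a single layer simply translates across the aperture, so the cross-correlation of two wavefront-sensor slope maps taken Δt apart peaks at an offset v·Δt. The toy sketch below (an illustration with synthetic data, not the paper's analysis) recovers an imposed translation from the correlation peak.

```python
import numpy as np

rng = np.random.default_rng(1)
n, shift = 32, 3                              # grid size, true shift [pixels]
frame0 = rng.standard_normal((n, n))          # slope map at time t
frame1 = np.roll(frame0, shift, axis=1)       # frozen-flow translation at t + dt

# Circular cross-correlation via FFT; the peak sits at the translation.
F0, F1 = np.fft.fft2(frame0), np.fft.fft2(frame1)
xcorr = np.real(np.fft.ifft2(F1 * np.conj(F0)))
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
dy = peak[0] - n if peak[0] > n // 2 else peak[0]
dx = peak[1] - n if peak[1] > n // 2 else peak[1]
print(dx, dy)   # recovers the imposed (3, 0) wind-driven shift
```

With real slope data the correlation is evaluated at many time lags; how quickly the peak decays away from a pure translation quantifies the departures from frozen flow.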