We present an astronomical adaptive optics test-bench, DRAGON-Next Generation (DRAGON-NG), designed to replicate the conditions of a 4 m-class telescope. It is constructed primarily from commercial off-the-shelf components with minimal customization (approximately a 90:10 ratio), which permits a modular and reconfigurable optical design. DRAGON-NG has been designed to operate in the following modes: high-order SCAO, twin-DM MOAO, and twin-DM MCAO. It is capable of open-loop or closed-loop operation, with emulation of three natural and three laser guide stars at loop rates of up to 200 Hz. Field angles of up to 2.4 arcmin (for a 4 m emulated pupil) can pass through the system. The design is dioptric and permits long cable runs to a compact, on-sky-compatible real-time control system. Experimental validation can therefore be carried out on DRAGON-NG before transfer to an on-sky system, providing significant risk mitigation.
We have implemented the full AO data-processing pipeline on Graphics Processing Units (GPUs) within the framework of the Durham AO Real-time Controller (DARC). Wavefront sensor images are copied from CPU memory to GPU memory; the GPU processes the data and the DM commands are copied back to the CPU. For a SCAO system of 80x80 subapertures, the rate achieved on a single GPU is about 700 frames per second (fps), increasing to 1100 fps with two GPUs and 1565 fps with four. Jitter exhibits a distribution with a root-mean-square value of 20–30 μs and a negligible number of outliers. The latency added by copying pixel data from the CPU to the GPU has been minimized by overlapping the copies with processing. An alternative solution, in which data would be moved from the camera directly to the GPU without CPU involvement, could be about 10%–20% faster. We have also implemented the correlation centroiding algorithm, which, when used, reduces the frame rate by about a factor of 2–3.
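The abstract does not give the correlation centroiding implementation itself; the following is a minimal NumPy sketch of the general FFT-based correlation centroiding technique, not DARC code. The function name, grid size and Gaussian test spot are illustrative assumptions.

```python
import numpy as np

def correlation_centroid(img, ref):
    """Estimate the (y, x) shift of `img` relative to `ref` by locating
    the peak of their cross-correlation (FFT-based, periodic boundaries)."""
    # Cross-correlation via the Fourier transform: C = IFFT(FFT(img) * conj(FFT(ref)))
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for axis, p in enumerate(peak):
        # Parabolic interpolation through the peak and its two neighbours
        # gives sub-pixel precision (indices wrap at the array edges).
        n = corr.shape[axis]
        idx_m = list(peak); idx_m[axis] = (p - 1) % n
        idx_p = list(peak); idx_p[axis] = (p + 1) % n
        c0, cm, cp = corr[peak], corr[tuple(idx_m)], corr[tuple(idx_p)]
        denom = cm - 2.0 * c0 + cp
        frac = 0.5 * (cm - cp) / denom if denom != 0 else 0.0
        # Map the peak index to a signed shift in [-n/2, n/2)
        s = p + frac
        if s > n / 2:
            s -= n
        shift.append(s)
    return tuple(shift)  # (dy, dx) in pixels

# Example: a Gaussian spot shifted by (1, 2) pixels relative to the reference
y, x = np.mgrid[0:16, 0:16]
ref = np.exp(-((y - 8.0)**2 + (x - 8.0)**2) / 4.0)
img = np.exp(-((y - 9.0)**2 + (x - 10.0)**2) / 4.0)
dy, dx = correlation_centroid(img, ref)
```

The extra FFTs per subaperture are consistent with the reported factor of 2–3 reduction in frame rate compared with a simple centre-of-gravity centroid.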
The Durham AO Real-time Controller has been used on-sky with the CANARY AO demonstrator instrument since 2010, and is also used to provide control for several AO test-benches, including DRAGON. Over this period, many new real-time algorithms have been developed, implemented and demonstrated, leading to performance improvements for CANARY. Additionally, the computational performance of this real-time system has continued to improve. Here, we provide details about recent updates and changes made to DARC, and the relevance of these updates, including new algorithms, to forthcoming AO systems. We present the computational performance of DARC when used on different hardware platforms, including hardware accelerators, and determine the relevance and potential for ELT scale systems.
Recent updates to DARC have included algorithms to handle elongated laser guide star images, including correlation wavefront sensing, with options to automatically update references during AO loop operation. Additionally, sub-aperture masking options have been developed to increase the signal-to-noise ratio when operating with non-symmetrical wavefront sensor images. The development of end-user tools has progressed, with new options for configuration and control of the system. New wavefront sensor camera models and DM models have been integrated with the system, increasing the number of possible hardware configurations available, and a fully open-source AO system is now a reality, including the drivers necessary for commercial cameras and DMs.
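To illustrate the sub-aperture masking idea described above, here is a hedged sketch of a masked centre-of-gravity centroid. The function, mask geometry and threshold handling are assumptions for illustration, not the DARC implementation.

```python
import numpy as np

def masked_cog(subap, mask, threshold=0.0):
    """Centre-of-gravity centroid over a masked sub-aperture.
    Pixels outside `mask` (and below `threshold`) are excluded, which
    raises SNR when only part of the sub-aperture is usefully illuminated."""
    img = np.where(mask, subap, 0.0)
    img = np.maximum(img - threshold, 0.0)
    total = img.sum()
    if total <= 0:
        return 0.0, 0.0  # no flux: report a zero slope
    y, x = np.indices(img.shape)
    # Centroid relative to the geometric centre of the sub-aperture
    cy = (y * img).sum() / total - (img.shape[0] - 1) / 2.0
    cx = (x * img).sum() / total - (img.shape[1] - 1) / 2.0
    return cy, cx

# Example: a centred spot plus noise; the mask excludes the noisy periphery
rng = np.random.default_rng(0)
spot = np.zeros((8, 8)); spot[3:5, 3:5] = 100.0
noisy = spot + rng.normal(0.0, 1.0, spot.shape)
mask = np.zeros((8, 8), bool); mask[2:6, 2:6] = True
cy, cx = masked_cog(noisy, mask)
```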
The computational performance of DARC makes it suitable for ELT scale systems when implemented on suitable hardware. We present tests made on different hardware platforms, along with the strategies taken to optimise DARC for these systems.
The main goal of Green Flash is to design and build a prototype Real-Time Controller (RTC) targeting the European Extremely Large Telescope (E-ELT) Adaptive Optics (AO) instrumentation. The E-ELT is a 39 m diameter telescope due to see first light in the early 2020s. To build this critical component of the telescope operations, the astronomical community faces technical challenges emerging from the combination of high data-transfer bandwidth, low latency and high throughput requirements, similar to the identified critical barriers on the road to Exascale. With Green Flash, we will propose technical solutions, assess these enabling technologies through prototyping and assemble a full-scale demonstrator to be validated with a simulator and tested on sky. With this R&D program we aim to feed the E-ELT AO systems' preliminary design studies, led by the selected first-light instrument consortia, with technological validations supporting the designs of their RTC modules. Our strategy is based on a strong interaction between academic and industrial partners. Component specifications and system requirements are derived from the AO application. Industrial partners lead the development of enabling technologies, aiming at innovative, tailored solutions with a potentially wide application range. The academic partners provide the missing links in the ecosystem, targeting their application with mainstream solutions. This increases both the value and the market opportunities of the developed products. A prototype harboring all the features is used to assess the performance. It also provides the proof of concept for a resilient, modular solution to equip a large-scale European scientific facility, while containing the development cost by providing opportunities for return on investment.
CANARY is an on-sky Laser Guide Star (LGS) tomographic AO demonstrator in operation at the 4.2m William Herschel Telescope (WHT) in La Palma. From the early demonstration of open-loop tomography on a single deformable mirror using natural guide stars in 2010, CANARY has been progressively upgraded each year, reaching its final goal in July 2015. It is now a two-stage system that mimics the future E-ELT: a GLAO-driven woofer based on four laser guide stars delivers a ground-layer-compensated field to a figure-sensor-locked tweeter DM, which achieves the final on-axis tomographic compensation. We present the overall system, the control strategy and an overview of its on-sky performance.
The performance of high-resolution magnetic deformable mirrors has recently been improved: the mechanical bandwidth has been increased to 2 kHz, and a fast stroboscopic Shack-Hartmann wavefront sensor was used to measure a settling time as low as 400 μs. Recent improvements in substrate-thinning processes have made available large, high-quality membranes compatible with deformable mirrors. Prototype testing and simulations show that devices with up to 60x60 actuators are now possible. For open-loop operation, a novel feed-forward algorithm was developed to compensate for residual creep and improve DM stability to below 10 nm RMS over 6 hours.
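The feed-forward creep-compensation algorithm is not detailed in the abstract; the sketch below assumes a simple logarithmic creep model purely for illustration. The `gamma` and `tau` values and the pre-distortion scheme are hypothetical, not measured DM parameters.

```python
import math

def creep_fraction(t, gamma=0.01, tau=10.0):
    """Illustrative logarithmic creep model: the surface drifts by a
    fraction gamma * log(1 + t/tau) of the commanded stroke after t
    seconds. (gamma and tau are assumed, not measured DM parameters.)"""
    return gamma * math.log1p(t / tau)

def feedforward_command(target, t_applied, gamma=0.01, tau=10.0):
    """Pre-distort the command so that command + creep ≈ target:
    send target / (1 + creep_fraction) to offset the expected drift."""
    return target / (1.0 + creep_fraction(t_applied, gamma, tau))

# Example: a command that, after 1 hour of modelled creep, still yields
# the intended 1.0 (arbitrary units) of stroke
cmd = feedforward_command(1.0, 3600.0)
achieved = cmd * (1.0 + creep_fraction(3600.0))
```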
CANARY is an on-sky Laser Guide Star (LGS) tomographic AO demonstrator that has been in operation at the 4.2m William Herschel Telescope (WHT) in La Palma since 2010. In 2013, CANARY was upgraded from its initial configuration, which used three off-axis Natural Guide Stars (NGS), through the inclusion of four off-axis Rayleigh LGS and an associated wavefront sensing system. Here we present the system and an analysis of the on-sky results obtained at the WHT between May and September 2014. Finally, we present results from the final ‘Phase C’ CANARY system, which aims to emulate the expected tomographic AO configurations of both the AOF at the VLT and the E-ELT.
DRAGON is a high-order, wide-field AO test-bench at Durham. A key feature of DRAGON is the ability to be operated at real-time rates, i.e. frame rates of up to 1 kHz, with low latency to maintain AO performance. Here, we present the real-time control architecture for DRAGON, which includes two deformable mirrors, eight wavefront sensors and thousands of Shack-Hartmann sub-apertures. A novel approach has been taken to allow access to the wavefront sensor pixel stream, reducing latency and peak computational load, and this technique can be implemented for other similar wavefront sensor cameras with no hardware costs. We report on experience with an ELT-suitable wavefront sensor camera. DRAGON will form the basis for investigations into hardware acceleration architectures for AO real-time control, and recent work on GPU and many-core systems (including the Xeon Phi) will be reported. Additionally, the modular structure of DRAGON, its remote control capabilities, the distribution of AO telemetry data, and the software concepts and architecture will be reported. Techniques used in DRAGON for pixel processing, slope calculation and wavefront reconstruction will be presented. This will include methods to handle changes in the Cn2 profile and sodium layer profile, both of which can be modelled in DRAGON. DRAGON software simulation techniques linking hardware-in-the-loop computer models to the DRAGON real-time system and control software will also be discussed. This tool allows testing of the DRAGON system without requiring physical hardware and serves as a test-bed for ELT integration and verification techniques.
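As background to the slope calculation and wavefront reconstruction techniques mentioned above, here is a minimal matrix-vector reconstruction sketch. The dimensions and the random interaction matrix are illustrative assumptions, not DRAGON's real geometry or control matrices.

```python
import numpy as np

# Matrix-vector wavefront reconstruction in its simplest form:
# DM commands = control matrix @ measured slopes.
rng = np.random.default_rng(1)
n_slopes, n_acts = 32, 8  # illustrative sizes, far below DRAGON scale

# Hypothetical interaction matrix: the slope response to each actuator poke,
# normally measured by poking actuators one at a time on the bench.
imat = rng.normal(size=(n_slopes, n_acts))

# Control matrix via a (regularised) pseudo-inverse of the interaction matrix.
cmat = np.linalg.pinv(imat, rcond=1e-3)

# Sanity check: slopes produced by a known actuator pattern are
# reconstructed back to that pattern.
true_cmds = rng.normal(size=n_acts)
slopes = imat @ true_cmds
recovered = cmat @ slopes
```

At ELT scale this single matrix-vector multiply dominates the real-time computational load, which is why the GPU and many-core acceleration work described above matters.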