The NOAO Data Lab aims to provide infrastructure that maximizes community use of the high-value survey datasets now being collected with NOAO telescopes and instruments. As a science exploration framework, the Data Lab allows users to access and search databases containing large (i.e., terabyte-scale) catalogs; visualize, analyze, and store the results of those searches; combine search results with data from other archives or facilities; and share the results with collaborators through a shared workspace and/or a data publication service. In implementing the needed tools and services, specific science cases are used to guide development of the system framework and tools. The result is a Year-1 capability demonstration that fully or partially implements each of the major architecture components in the context of a real-world science use case. In this paper, we discuss how this model of science-driven development helped us build a fully functional system capable of executing the chosen science case, and how we plan to scale the system to support general use in the next phase of the project.
Collaborative research and computing environments are essential for working with the next generation of large astronomical data sets. A key component of such environments is a distributed storage system that enables data hosting, sharing, and publication. VOSpace<sup>1</sup> is a lightweight interface, endorsed by the International Virtual Observatory Alliance (IVOA), that provides network access to arbitrary backend storage solutions. Although similar APIs exist, such as Amazon S3, WebDAV, and Dropbox, VOSpace is designed to be protocol agnostic, focusing on data control operations, and supports asynchronous and third-party data transfers, thereby minimizing unnecessary data movement. It also allows arbitrary computations to be triggered by a transfer operation: for example, a file can be automatically ingested into a database when put into an active directory, or a data reduction task, such as SExtractor, can be run on it. In this paper, we describe the VOSpace implementations that we have developed for the NOAO Data Lab. These offer both dedicated remote storage, accessible as a local file system via FUSE, and a local VOSpace service to easily enable data synchronization.
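The "active directory" behavior described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the Data Lab implementation: a real VOSpace service would attach the trigger to the PUT operation itself, and the `ingest` callback here is invented for the example.

```python
import os
import shutil
import tempfile

def process_new_files(active_dir, seen, on_new_file):
    """Poll an 'active' directory and fire a callback for each new file.

    Polling is used only to keep the sketch self-contained; a VOSpace
    service would run the computation as part of the transfer itself.
    """
    triggered = []
    for name in sorted(os.listdir(active_dir)):
        if name not in seen:
            seen.add(name)
            on_new_file(os.path.join(active_dir, name))
            triggered.append(name)
    return triggered

# Demonstration: "ingesting" a file simply records its line count,
# standing in for a database load or a reduction task.
ingested = {}

def ingest(path):
    with open(path) as f:
        ingested[os.path.basename(path)] = sum(1 for _ in f)

active = tempfile.mkdtemp()
with open(os.path.join(active, "catalog.csv"), "w") as f:
    f.write("ra,dec\n10.0,-2.5\n")

seen = set()
process_new_files(active, seen, ingest)
shutil.rmtree(active)
```

Because `seen` persists between calls, repeated polls only trigger the computation once per file, mirroring the one-shot semantics of a transfer-triggered task.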
The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint effort of NOAO and the Department of Computer Science at the University of Arizona to build prototype software to process alerts from time-domain surveys, especially LSST, to identify those alerts that must be followed up immediately. Value is added by annotating incoming alerts with existing information from previous surveys and compilations across the electromagnetic spectrum and from the history of past alerts. Comparison against a knowledge repository of properties and features of known or predicted kinds of variable phenomena is used for categorization. The architecture and algorithms being employed are described.
<p> The NOAO Data Lab will allow users to efficiently utilize catalogs containing billions of objects, augment traditional telescope imaging and spectral data with external archive holdings, publish the high-level data products of their research, share custom results with collaborators, and experiment with analysis toolkits. The goal of the Data Lab is to provide a common framework and workspace for science collaborations and individuals to use and disseminate data from large surveys. </p><p> In this paper we describe the motivations behind the NOAO Data Lab and present a conceptual overview of the activities we plan to support. Specific science cases will be used to develop a prototype framework and tools, allowing us to work directly with scientists from survey teams and to ensure that development remains focused on scientifically productive tasks. This will additionally develop a pool of both scientific and technical experts who can provide ongoing advice and support for community users as the scope and capabilities of the Data Lab expand. </p>
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints imposed by the design-specific opto-mechanical performance of the telescope facility, site-specific conditions, and scheduled and unscheduled downtime. It includes a detailed model of external conditions based on real weather history data from the site and a fully parameterized kinematic model of the telescope, camera, and dome, and it serves as a prototype for an automated scheduler for real-time survey operations with LSST. The Simulator has been a critical tool since very early in the project for validating the design parameters of the observatory against the science requirements and the goals of specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are produced with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies across a wide variety of science applications, using this growing set of metrics, is under development.
A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
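A simulation run's observation history lives in a relational database that can be queried for any metric of interest. The sketch below uses an in-memory SQLite table as a stand-in for the MySQL database; the table layout and column names are illustrative only, not the actual OpSim schema.

```python
import sqlite3
from statistics import median

# Stand-in for the observation-history table written by a simulation run.
# Columns are invented for illustration (not the real OpSim schema).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    mjd REAL, ra REAL, dec REAL, filter TEXT,
    seeing REAL, sky_brightness REAL)""")
rows = [
    (59853.1, 10.0, -30.0, "r", 0.85, 21.2),
    (59853.2, 10.0, -30.0, "r", 0.95, 21.0),
    (59853.3, 55.0, -45.0, "g", 1.10, 22.1),
    (59856.1, 10.0, -30.0, "r", 0.75, 21.3),
]
conn.executemany("INSERT INTO observations VALUES (?,?,?,?,?,?)", rows)

# A minimal 'merit function': median delivered seeing per filter.
seeing_by_filter = {}
for band, in conn.execute("SELECT DISTINCT filter FROM observations"):
    vals = [s for s, in conn.execute(
        "SELECT seeing FROM observations WHERE filter = ?", (band,))]
    seeing_by_filter[band] = median(vals)
```

Real merit functions operate on millions of simulated visits, but the pattern is the same: query the observation history, then reduce it to a per-filter or per-field statistic.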
The LSST will, over a 10-year period, produce a multi-color, multi-epoch survey of more than
18000 square degrees of the southern sky. It will generate a multi-petabyte archive of images and
catalogs of astrophysical sources from which a wide variety of high-precision statistical studies can
be undertaken. To accomplish these goals, the LSST project has developed a suite of modeling and
simulation tools for use in validating that the design and the as-delivered components of the LSST
system will yield data products with the required statistical properties. In this paper we describe the
development and use of the LSST simulation framework, including the generation of simulated
catalogs and images for targeted trade studies, simulations of the observing cadence of the LSST, the
creation of large-scale simulations that test the procedures for data calibration, and use of end-to-end
image simulations to evaluate the performance of the system as a whole.
The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. The goal is to build the software infrastructure necessary to process and filter alerts produced by time-domain surveys, with the ultimate source of such alerts being the Large Synoptic Survey Telescope (LSST). The ANTARES broker will add value to alerts by annotating them with information from external sources such as previous surveys from across the electromagnetic spectrum. In addition, the temporal history of annotated alerts will provide further annotation for analysis. These alerts will go through a cascade of filters to select interesting candidates. For the prototype, ‘interesting’ is defined as the rarest or most unusual alert, but future systems will accommodate multiple filtering goals. The system is designed to be flexible, allowing users to access the stream at multiple points throughout the process, and to insert custom filters where necessary. We describe the basic architecture of ANTARES and the principles that will guide development and implementation.
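The annotate-then-filter cascade described above can be sketched as a chain of predicates over a stream of alerts. The field names, catalog lookup, and thresholds below are invented for illustration; they are not the ANTARES schema or its actual filters.

```python
# Toy alert-broker cascade: annotate incoming alerts with archival
# information, then pass them through a chain of filters. Each stage
# winnows the stream; what survives is the "interesting" residue.

def annotate(alert, catalog):
    """Attach any known counterpart from a previous survey (hypothetical)."""
    alert = dict(alert)
    alert["counterpart"] = catalog.get((alert["ra"], alert["dec"]))
    return alert

def has_no_counterpart(alert):
    # Alerts matching a known variable are routine, not rare.
    return alert["counterpart"] is None

def is_bright_enough(alert):
    # Illustrative threshold for follow-up feasibility.
    return alert["mag"] < 20.0

def run_cascade(alerts, catalog, filters):
    survivors = [annotate(a, catalog) for a in alerts]
    for f in filters:
        survivors = [a for a in survivors if f(a)]
    return survivors

catalog = {(150.1, 2.2): "known RR Lyrae"}
alerts = [
    {"id": 1, "ra": 150.1, "dec": 2.2, "mag": 18.5},   # known variable
    {"id": 2, "ra": 210.4, "dec": -5.0, "mag": 19.1},  # no counterpart
    {"id": 3, "ra": 33.7, "dec": 12.9, "mag": 21.4},   # too faint
]
rare = run_cascade(alerts, catalog, [has_no_counterpart, is_bright_enough])
```

The list-of-predicates design is what makes the system flexible: a user-supplied filter is just another function appended to the chain, and the stream can be tapped between any two stages.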
A survey program with multiple science goals will be driven by multiple technical requirements. On a ground-based
telescope, the variability of conditions introduces yet greater complexity. For a program that must be largely autonomous
with minimal dwell time for efficiency, it may be quite difficult to foresee the achievable performance. Furthermore,
scheduling will likely involve self-referential constraints and appropriate optimization tools may not be available. The
LSST project faces these issues, and has designed and implemented an approach to performance analysis in its
Operations Simulator and associated post-processing packages. The Simulator has allowed the project to present detailed
performance predictions with a strong basis from the engineering design and measured site conditions. At present, the
Simulator is in regular use for engineering studies and science evaluation, and planning is underway for evolution to an
operations scheduling tool. We will describe the LSST experience, emphasizing the objectives, the accomplishments and
the lessons learned.
We present an innovative method for photometric calibration of massive survey data that will be applied to the
Large Synoptic Survey Telescope (LSST). LSST will be a wide-field ground-based system designed to obtain
imaging data in six broad photometric bands (ugrizy, 320-1050 nm). Each sky position will be observed multiple
times, with about a hundred or more observations per band collected over the main survey area (20,000 sq.deg.)
during the anticipated 10 years of operations. Photometric zeropoints are required to be stable in time to 0.5%
(rms), and uniform across the survey area to better than 1% (rms). The large number of measurements of
each object taken during the survey allows identification of isolated non-variable sources, and forms the basis
for LSST's global self-calibration method. Inspired by SDSS's uber-calibration procedure, the self-calibration
determines zeropoints by requiring that repeated measurements of non-variable stars must be self-consistent when
corrected for variations in atmospheric and instrumental bandpass shapes. This requirement constrains both the
instrument throughput and atmospheric extinction. The atmospheric and instrumental bandpass shapes will
be explicitly measured using auxiliary instrumentation. We describe the algorithm used, with special emphasis
both on the challenges of controlling systematic errors and on how such an approach interacts with the design of the survey, and we discuss ongoing simulations of its performance.
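The self-calibration constraint (repeated measurements of a non-variable star must agree once zeropoints are applied) can be written as m_obs = m_star + zp and solved by alternating least squares. The sketch below uses tiny synthetic data and simple per-group averages; it illustrates the idea only and is not the LSST pipeline, which also corrects for bandpass-shape variations.

```python
# Ubercal-style self-calibration sketch: observed magnitudes obey
#   m_obs[exposure, star] = m_star[star] + zp[exposure],
# so alternately solving for star magnitudes and exposure zeropoints
# recovers both, up to one overall additive constant.

def mean(xs):
    return sum(xs) / len(xs)

def self_calibrate(observations, n_iter=50):
    """observations: list of (exposure_id, star_id, observed_mag)."""
    exposures = sorted({e for e, _, _ in observations})
    stars = sorted({s for _, s, _ in observations})
    zp = {e: 0.0 for e in exposures}
    for _ in range(n_iter):
        # Best-fit star magnitudes given the current zeropoints...
        mstar = {s: mean([m - zp[e] for e, t, m in observations if t == s])
                 for s in stars}
        # ...then best-fit zeropoints given the star magnitudes.
        zp = {e: mean([m - mstar[s] for t, s, m in observations if t == e])
              for e in exposures}
    # Remove the floating constant by anchoring the first exposure.
    offset = zp[exposures[0]]
    return {e: z - offset for e, z in zp.items()}

# Two non-variable stars (true mags 17.0 and 18.5) seen on three
# exposures whose true zeropoint offsets relative to exposure 0
# are 0.0, +0.3, and -0.1.
obs = [(0, "a", 17.0), (0, "b", 18.5),
       (1, "a", 17.3), (1, "b", 18.8),
       (2, "a", 16.9), (2, "b", 18.4)]
zp = self_calibrate(obs)
```

With noiseless, fully overlapping data the iteration converges immediately; the real problem couples millions of stars and exposures through a sparse least-squares system, and the added bandpass terms are what make the LSST version novel.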
The Large Synoptic Survey Telescope (LSST) will continuously image the entire sky visible from Cerro Pachon
in northern Chile every 3-4 nights throughout the year. The LSST will provide data for a broad range of science
investigations that require better than 1% photometric precision across the sky (repeatability and uniformity)
and a similar accuracy of measured broadband color. The fast and persistent cadence of the LSST survey
will significantly improve the temporal sampling rate with which celestial events and motions are tracked. To
achieve these goals, and to optimally utilize the observing calendar, it will be necessary to obtain excellent
photometric calibration of data taken over a wide range of observing conditions - even those not normally
considered "photometric". To achieve this, it will be necessary to routinely and accurately measure the full
optical passband that includes the atmosphere as well as the instrumental telescope and camera system. The
LSST mountain facility will include a new monochromatic dome illumination projector system to measure the
detailed wavelength dependence of the instrumental passband for each channel in the system. The facility will
also include an auxiliary spectroscopic telescope dedicated to measurement of atmospheric transparency at all
locations in the sky during LSST observing. In this paper, we describe these systems and present laboratory
and observational data that illustrate their performance.
The Large Synoptic Survey Telescope (LSST) flat-fields must repeatedly trace not only the spatial response variations,
but also the chromatic response through the entire optical system, with an accuracy driven by the photometric
requirements for the LSST survey data. This places challenging requirements on the LSST Calibration Dome Screen,
which must uniformly illuminate the 8.4-meter diameter telescope pupil over its 3.5-degree field of view at desired
monochromatic wavelengths in a way that allows the measurement of the total system throughput from entrance pupil to
the digitization of charge in the camera electronics. This includes the reflectivity of the mirrors, transmission of the
refractive optics and filters, the quantum efficiency of the sensors in the camera, and the gain and linearity of the sensor
read-out electronics. The baseline design uses a single tunable laser and includes an array of discrete projectors. The
projected flux of light produced by the screen must fill the entire telescope pupil and provide uniform illumination to 1%
at the focal plane and to within 0.25% over any optical trajectory within 0.5 degrees of each other. The wavelength of
light is tunable across the LSST bandpass from 320 nm to 1080 nm. The screen also includes a broad-band ("white")
light source with known Spectral Energy Density (SED) that spans the same range of wavelengths.
Science studies made by the Large Synoptic Survey Telescope will reach systematic limits in nearly all cases. Requirements for accurate photometric measurements are particularly challenging. Advantage will be taken of the rapid cadence and pace of the LSST survey to use celestial sources to monitor stability and uniformity of photometric data. A new technique using a tunable laser is being developed to calibrate the wavelength dependence of the total telescope and camera system throughput. Spectroscopic measurements of atmospheric extinction and emission will be made continuously to allow the broad-band optical flux observed in the instrument to be corrected to flux at the top of the atmosphere. Calibrations with celestial sources will be compared to instrumental and atmospheric calibrations.
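Correcting the observed broad-band flux to flux at the top of the atmosphere amounts to removing the airmass-dependent extinction term, m_top = m_obs − k·X. A minimal worked example follows; the extinction coefficient used is an illustrative value, not an LSST measurement.

```python
import math

def airmass(zenith_angle_deg):
    """Plane-parallel approximation: X = sec(z)."""
    return 1.0 / math.cos(math.radians(zenith_angle_deg))

def top_of_atmosphere_mag(m_obs, k_band, zenith_angle_deg):
    """Remove the extinction term k*X from an observed magnitude.

    k_band is the per-airmass extinction coefficient (mag/airmass),
    the kind of quantity the continuous spectroscopic monitoring of
    the atmosphere is designed to pin down.
    """
    return m_obs - k_band * airmass(zenith_angle_deg)

# Example: a star observed at m = 18.30, 60 degrees from zenith
# (X = 2), with an illustrative extinction coefficient k = 0.10.
m_top = top_of_atmosphere_mag(18.30, 0.10, 60.0)
```

In practice the spectroscopic atmospheric measurements replace the single coefficient k with a wavelength-dependent transmission curve, integrated against each source's spectral energy distribution over the band.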
The project for the proposed Large Synoptic Survey Telescope (LSST) performed more than two years of data
collection, site evaluation, and analysis to support the selection of its prime site. The assessment was based on using an existing site with existing infrastructure and historical performance information. A large and diverse set of
comparative information was compiled for potential sites using results from other site campaigns, measurements
from existing large telescopes, new astro-climate measurements, logistical and feasibility information, and from
existing satellite and climate databases. Several analyses were performed on these data including the assessment of
survey performance using the LSST operation simulator. An independent site selection committee of experts
provided recommendations to the Project, leading to three finalist sites: one in Mexico and two in northern Chile.
The finalist sites were assessed thoroughly with additional data collection from all-sky cameras and site proposals.
Cerro Pachon in Chile was selected to be the site for LSST after a difficult decision between the high quality final
candidates. This paper describes the data, analysis and approach used to support the site evaluation.
The WIYN One Degree Imager (ODI) will be a well-sampled (0.11” per pixel) imager that provides a full one degree square field of view (32K×32K pixels). ODI will utilize high resistivity, red sensitive, orthogonal transfer (OT) CCDs to provide rapid correction for image motion arising from telescope shake, guider errors, and atmospheric effects. ODI will correct the full field of view by deploying 64 array packages having a total of 4096 independently controllable OTCCDs that can correct individually for local (2 arcmin) image motion. Each array package is an orthogonal transfer array (OTA) of 64 CCDs arranged in an 8×8 grid. Each CCD has 512×512 pixels. We expect the median image quality at the WIYN 3.5m telescope in RIZ to be 0.52”, 0.43”, and 0.35” FWHM. ODI makes optimal use of the WIYN telescope, which has superb optics, excellent seeing characteristics, a natural 1.4 degree field of view (with a new corrector), and can serve as a pathfinder for LSST in terms of detectors, data pipelines, operations strategies, and scientific motivation.
The Large-aperture Synoptic Survey Telescope will repeatedly image a large fraction of the visible sky in multiple optical passbands in a way that will sample temporal phenomena over a large range of time scales. This will enable a suite of synoptic investigations that range in temporal sampling requirements from the detection of near Earth asteroids (minutes), through discovery and followup of supernovae to long period monitoring of QSOs, AGN and LPVs (years). Additionally, the data must be obtained in a way to support programs aimed at building up deep static images of part or all of the sky.
Here we examine some of the issues involved in crafting an observing scheme that serves these goals. The problem has several parts: a) what is the optimal time sampling strategy that best serves the desired temporal range? b) how can a chosen time sampling sequence be packed into an observing scheme that accommodates all pointings and 'whiteout' windows (daytime, lunation period)? c) how vulnerable is such an observing plan to realistic models of disruption by poor observing conditions and weather? d) how does one build in the most economical contingency/redundancy to i) mitigate against such disruption and ii) reserve time for recovery and followup of transient phenomena (e.g. gamma-ray bursts, supernovae)?
In this article we touch upon several of these issues, and come to an understanding of some of the limitations, as well as areas in which scientific priorities and trade-offs will have to be made.
The goal of many recent engineering improvements at the 3.5m WIYN telescope has been to improve imaging performance by exploiting the good intrinsic seeing at Kitt Peak. This direction complements the efforts of high-order adaptive optics by maximizing the usable field. The new 'mini-mosaic' camera, a mosaic of two 4K by 2K SITe CCDs, is in the final stages of commissioning. With its 0.14 arc-sec per pixel scale at the Nasmyth f/6.3 focus, it is capable of adequately sampling the best delivered images from the telescope while maintaining a relatively large field of view. We present some early performance results from this new instrument and demonstrate the excellent image quality over the entire 9.6-arcminute field.
We discuss the many enhancements and refinements to the WIYN Telescope that have demonstrably improved its imaging performance over the past few years. Currently, WIYN yields a median delivered image quality of 0.8 arc-seconds for 10-second exposures in the R band, and images better than 0.6 arc-seconds 20% of the time, when Kitt Peak yields excellent atmospheric seeing conditions. The telescope and enclosure were designed to exploit the good seeing conditions with features such as superb optical components, a primary mirror that is actively controlled for low-order aberrations, a thermally controlled primary mirror, active and passive enclosure ventilation, real-time focus and collimation control, and precise tracking and guiding control. The WIYN organization is committed to continuously enhancing the scientific performance of the observatory, and through a long-term commitment of increased technical support we have made significant progress on many technical aspects of imaging performance. These improvements, which are described in this paper, include optimization of the wavefront curvature measurement process, better control of secondary tilt and piston (focus), feedback mechanisms for structural thermo-mechanical effects, careful timing of the active thermal control of the primary mirror, damping of structural vibrations, and the implementation of a closed-loop focus sensor. WIYN is also developing an adaptive optics tip-tilt system, which is described elsewhere in this conference.
In June 1997, NASA made the decision to extend the end of the Hubble Space Telescope (HST) mission from 2005 until 2010. As a result, the age of the instruments on board the HST became a consideration. After careful study, NASA decided to ensure the imaging capabilities of the HST by replacing the Wide Field Planetary Camera 2 with a low-cost facility instrument, the Wide Field Camera 3. This paper provides an overview of the scientific goals and capabilities of the instrument.
For the last 3 years, most of NOAO's 40 percent observing share on the WIYN 3.5 m telescope has been used for queued observing, with the goal of facilitating highly ranked science proposals that require rare observing conditions and/or synoptic or 'target of opportunity' observations. The ease of switching between imaging on one Nasmyth focus and multi- object fiber fed bench spectroscopy on the other Nasmyth port offers the choice of making the best use of the extant observing conditions. We assess the results of this experiment and highlight some of the forefront observing programs that have been executed. We discuss algorithms that facilitate making decisions on both long and short time scales so that we can provide the best match of program requirements and observing conditions. We suggest a way of quantifying the prioritization of programs beyond simple ranking that will greatly aid decision making, and evolve the procedures to where queued observations better serve the emphases placed by the time allocation process, without compromising the intent of the scientific investigators.
In July of 1998 the National Optical Astronomy Observatories (NOAO) successfully upgraded MOSAIC I, an 8192 by 8192 pixel array using eight Scientific Imaging Technologies, Inc. (SITe) ST-002A thinned backside 2k by 4k charge coupled devices (CCDs). In July of 1999 MOSAIC II, a clone of MOSAIC I, was commissioned, also using eight SITe ST-002A CCDs. Additionally, in December of 1998 NOAO implemented Mini-MOSAIC, a 4096 by 4096 pixel array using two SITe ST-002A thinned CCDs. This report discusses the performance, characterization, and capabilities of the three wide-field imagers now in operation at NOAO's Kitt Peak National Observatory, Cerro Tololo Inter-American Observatory, and the WIYN Consortium 3.5-meter telescope on Kitt Peak.
During the past two years NOAO has conducted a queue observing experiment with the 3.5m WIYN telescope on Kitt Peak, Arizona. The WIYN telescope is ideally suited to queue-scheduled operation in terms of its performance and its instrument complement. The queue scheduling experiment on WIYN was designed to test a number of beliefs and hypotheses about gains in efficiency and scientific effectiveness due to queue scheduling. In addition, the experiment was a test of our implementation strategy and management of community expectations. The queue is run according to a set of rules that guide decisions about which observation to do next. In practice, scientific rank, suitability of current conditions, and the desire to complete programs all enter into these decisions. As predicted by Monte Carlo simulations, the queue increases the overall efficiency of the telescope, particularly for observations requiring rare conditions. Together with this improvement for typical programs, the queue enables synoptic, target-of-opportunity, and short programs that could not be scheduled classically. Despite this success, a number of sociological issues determine the community's perception of the WIYN queue.