The Large Synoptic Survey Telescope (LSST) is an 8.4m optical survey telescope being constructed on Cerro Pachón in Chile. The data management system being developed must be able to process the nightly alert data (an expected 20,000 transient alerts per minute) in near real time, and construct annual data releases at the petabyte scale. The development team consists of more than 90 people working at six different sites across the US, developing an integrated set of software to realize the LSST science goals. In this paper we discuss our agile software development methodology and our API and developer decision-making process. We also discuss the software tools that we use for continuous integration and deployment.
Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replanning and reprioritization of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned Value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution, and reporting framework used by the LSST Data Management team, which navigates these opposing demands.
The James Clerk Maxwell Telescope (JCMT) is the largest single-dish submillimetre telescope in the world, and throughout its lifetime the volume and impact of its science output have steadily increased. A key factor in this continuing productivity is an ever-evolving approach to optimising operations, data acquisition, and science product pipelines and archives. The JCMT was one of the first common-user telescopes to adopt flexible scheduling in 2003, and its impact over a decade of observing will be presented. The introduction of an advanced data-reduction pipeline played an integral role, both for fast real-time reduction during observing, and for science-grade reduction in support of individual projects, legacy surveys, and the JCMT Science Archive. More recently, these foundations have facilitated the commencement of remote observing in addition to traditional on-site operations, further increasing on-sky science time. The contribution of highly trained and engaged operators, support, and technical staff to efficient operations will be described. The long-term returns of this evolution are presented here, noting that they were achieved in the face of external pressures for leaner operating budgets and reduced staffing levels. In an era when visiting observers are being phased out of many observatories, we argue that maintaining a critical level of observer participation is vital to improving and maintaining scientific productivity and facility longevity.
The NOAO Data Lab will allow users to efficiently utilize catalogs of billions of objects, augment traditional telescope imaging and spectral data with external archive holdings, publish high level data products of their research, share custom results with collaborators and experiment with analysis toolkits. The goal of the Data Lab is to provide a common framework and workspace for science collaborations and individuals to use and disseminate data from large surveys.
In this paper we describe the motivations behind the NOAO Data Lab and present a conceptual overview of the activities we plan to support. Specific science cases will be used to develop a prototype framework and tools, allowing us to work directly with scientists from survey teams to ensure development will remain focused on scientifically productive tasks. This will additionally develop a pool of both scientific and technical experts who can provide ongoing advice and support for community users as the scope and capabilities of the Data Lab expand.
The JCMT Science Archive is a collaboration between the James Clerk Maxwell Telescope and the Canadian Astronomy Data Centre to provide access to raw and reduced data from SCUBA-2 and the telescope's heterodyne instruments. It was designed to include a range of advanced data products, created either by external groups, such as the JCMT Legacy Survey teams, or by the JCMT staff at the Joint Astronomy Centre. We are currently developing the archive to include a set of advanced data products which combine all of the publicly available data. We have developed a sky tiling scheme based on HEALPix tiles to allow us to construct co-added maps and data cubes on a well-defined grid. There will also be source catalogs, both of regions of extended emission and of the compact sources detected within them.
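As a sketch of the tiled co-addition described above: observations falling on the same sky tile are combined into a weighted mean. The `tile_of` function below is a deliberately crude rectangular stand-in for a real HEALPix index (an actual archive would use the nested HEALPix scheme), and all names are invented for illustration.

```python
from collections import defaultdict

def tile_of(ra, dec, tiles_per_side=64):
    """Stand-in for a HEALPix pixel index: map (ra, dec) in degrees
    onto a coarse rectangular grid. A real archive would use nested
    HEALPix tiles so co-adds land on a well-defined grid."""
    i = int((ra % 360.0) / 360.0 * tiles_per_side)
    j = int((dec + 90.0) / 180.0 * tiles_per_side)
    return j * tiles_per_side + i

def coadd(observations):
    """Accumulate flux and weight per tile, then return the
    weighted-mean value for each tile: a crude co-added product."""
    flux = defaultdict(float)
    weight = defaultdict(float)
    for ra, dec, value, w in observations:
        t = tile_of(ra, dec)
        flux[t] += value * w
        weight[t] += w
    return {t: flux[t] / weight[t] for t in flux}

# Two observations of the same field plus one elsewhere:
obs = [(10.0, -5.0, 1.2, 1.0), (10.0, -5.0, 0.8, 1.0), (200.0, 30.0, 2.0, 2.0)]
coadded = coadd(obs)
```

The essential design point is that every contributing data set is resampled onto the same fixed grid before combination, so products from different epochs and instruments align exactly.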
SCUBA-2 is an innovative 10,000 pixel submillimeter camera due to be delivered to the James Clerk Maxwell Telescope in late 2006. The camera is expected to revolutionize submillimeter astronomy in terms of the ability to carry out wide-field surveys to unprecedented depths addressing key questions relating to the origins of galaxies, stars and planets. This paper presents an update on the project with particular emphasis on the laboratory commissioning of the instrument. The assembly and integration will be described as well as the measured thermal performance of the instrument. A summary of the performance results will be presented from the TES bolometer arrays, which come complete with in-focal plane SQUID amplifiers and multiplexed readouts, and are cooled to 100mK by a liquid cryogen-free dilution refrigerator. Considerable emphasis has also been placed on the operating modes of the instrument and the "common-user" aspect of the user interface and data reduction pipeline. These areas will also be described in the paper.
In the last few years the ubiquitous availability of high-bandwidth networks has changed the way both robotic and non-robotic telescopes operate, with single isolated telescopes being integrated into expanding "smart" telescope networks that can span continents and respond to transient events in seconds. The Heterogeneous Telescope Networks (HTN)* Consortium represents a number of major research groups in the field of robotic telescopes, and together we are proposing a standards-based approach to providing interoperability between the existing proprietary telescope networks. We further propose standards for interoperability with, and integration into, the emerging Virtual Observatory.
We present the results of the first interoperability meeting held last year and discuss the protocol and transport standards agreed at the meeting, which deal with the complex issue of how to optimally schedule observations on geographically distributed resources. We discuss a free-market approach to this scheduling problem, which must initially be based on ad-hoc agreements between the participants in the network, but which may eventually expand into an electronic market for the exchange of telescope time.
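The free-market approach can be caricatured in a few lines: each node quotes a price for carrying out an observation, and the request is awarded to the cheapest node that satisfies its constraints. The node names, fields, and pricing rule below are all invented for illustration, not part of the HTN protocol.

```python
def award(request, nodes):
    """Award an observation request to the cheapest capable node.
    Each node quotes a price (e.g. in units of telescope time);
    a node is capable if it meets the request's aperture and
    hemisphere constraints."""
    capable = [n for n in nodes
               if n["aperture_m"] >= request["min_aperture_m"]
               and request["hemisphere"] in n["hemispheres"]]
    if not capable:
        return None  # no agreement in the network covers this request
    return min(capable, key=lambda n: n["price"])

nodes = [
    {"name": "FTN", "aperture_m": 2.0, "hemispheres": {"N"}, "price": 5.0},
    {"name": "FTS", "aperture_m": 2.0, "hemispheres": {"S"}, "price": 3.0},
    {"name": "LT",  "aperture_m": 2.0, "hemispheres": {"N"}, "price": 4.0},
]
req = {"min_aperture_m": 1.5, "hemisphere": "N"}
winner = award(req, nodes)
```

In a real electronic market the prices would be set dynamically by supply and demand rather than quoted as fixed values, but the award rule itself need not change.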
Linking ground-based telescopes with astronomical satellites, and using the emerging field of intelligent agent architectures to provide crucial autonomous decision making in software, we have combined data archives and research-class robotic telescopes along with distributed computing nodes to build an ad-hoc peer-to-peer heterogeneous network of resources. The eSTAR Project* uses intelligent agent technologies to carry out resource discovery, submit observation requests and analyze the reduced data returned from a meta-network of robotic telescopes. We present the current operations paradigm of the eSTAR network and describe the direction in which the project intends to develop over the next several years. We also discuss the challenges facing the project, including the very real sociological one of user acceptance.
The Joint Astronomy Centre operates two telescopes at the Mauna Kea Observatory: the James Clerk Maxwell Telescope, operating in the submillimetre, and the United Kingdom Infrared Telescope, operating in the near and thermal infrared. Both wavelength regimes benefit from the ability to schedule observations flexibly according to observing conditions, albeit via somewhat different "site quality" criteria. Both UKIRT and JCMT now operate completely flexible schedules. These operations are based on telescope hardware which can quickly switch between observing modes, and on a comprehensive suite of software (ORAC/OMP) which handles observing preparation by remote PIs, observation submission into the summit database, conditions-based programme selection at the summit, pipeline data reduction for all observing modes, and instant data quality feedback to the PI who may or may not be remote from the telescope. This paper describes the flexible scheduling model and presents science statistics for the first complete year of UKIRT and JCMT observing under the combined system.
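Conditions-based programme selection of the kind performed at the summit can be sketched as a filter over the science queue: only programmes whose PI-specified weather limits are satisfied by the current conditions are eligible, and the highest-priority eligible programme is observed next. The field names, tau and seeing thresholds, and project IDs below are illustrative, not the actual OMP schema.

```python
def selectable(programme, current_tau, current_seeing):
    """A programme may be observed only if current conditions are
    at least as good as the limits the PI requested."""
    return (current_tau <= programme["max_tau"]
            and current_seeing <= programme["max_seeing"])

def pick_next(queue, current_tau, current_seeing):
    """Choose the highest-priority programme whose weather
    constraints are met (lower number = higher priority)."""
    candidates = [p for p in queue
                  if selectable(p, current_tau, current_seeing)]
    return min(candidates, key=lambda p: p["priority"], default=None)

queue = [
    {"id": "M07AU01", "max_tau": 0.05, "max_seeing": 0.6, "priority": 1},
    {"id": "M07AC12", "max_tau": 0.12, "max_seeing": 1.0, "priority": 2},
]
best = pick_next(queue, current_tau=0.08, current_seeing=0.8)
```

The point of flexible scheduling is visible even in this toy: when conditions are mediocre, the top-priority programme is passed over in favour of one that can actually use the night.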
The eSTAR Project uses intelligent agent technologies to carry out resource discovery, submit observation requests and analyze the reduced data returned from a network of robotic telescopes in an observational grid. The agents are capable of data mining and cross-correlation tasks using on-line catalogues and databases and, if necessary, requesting additional data and follow-up observations from the telescopes on the network. We discuss how the maturing agent technologies can be used both to provide rapid followup to time critical events, and for long term monitoring of known sources, utilising the available resources in an intelligent manner.
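A crude version of the agents' cross-correlation step is a positional match of a new detection against an on-line catalogue: sources within some angular radius are candidate counterparts, and an empty result would trigger a follow-up observation request. The 1-arcminute radius and catalogue contents below are invented for illustration.

```python
import math

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (spherical law of
    cosines; adequate at the arcminute scales used here)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

def cross_match(detection, catalogue, radius_deg=1.0 / 60.0):
    """Return catalogue sources within radius of the detection;
    an empty list would prompt the agent to request follow-up."""
    ra, dec = detection
    return [s for s in catalogue
            if angular_separation_deg(ra, dec, s["ra"], s["dec"]) <= radius_deg]

catalogue = [{"name": "SRC-1", "ra": 150.001, "dec": 2.200},
             {"name": "SRC-2", "ra": 151.500, "dec": 2.300}]
matches = cross_match((150.0, 2.2), catalogue)
```

Real agents would of course query remote catalogues and databases over the network rather than an in-memory list, but the decision logic, match found versus follow-up needed, is the same.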
The JCMT, the world's largest sub-mm telescope, has had essentially the same VAX/VMS-based control system since it was commissioned. For the next generation of instrumentation we are implementing a new Unix/VxWorks-based system, based on the successful ORAC system that was recently released on UKIRT.
The system is now entering the integration and testing phase. This paper gives a broad overview of the system architecture and includes some discussion on the choices made. (Other papers in this conference cover some areas in more detail). The basic philosophy is to control the sub-systems with a small and simple set of commands, but passing detailed XML configuration descriptions along with the commands to give the flexibility required. The XML files can be passed between various layers in the system without interpretation, and so simplify the design enormously. This has all been made possible by the adoption of an Observation Preparation Tool, which essentially serves as an intelligent XML editor.
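The command-plus-configuration pattern described above can be sketched with the standard library: a small command verb travels together with an XML configuration that intermediate layers forward verbatim, and only the terminal sub-system parses it. The tag names, command set, and layer functions here are invented for illustration, not the actual JCMT interfaces.

```python
import xml.etree.ElementTree as ET

CONFIG = """<obs_config>
  <instrument name="HARP"/>
  <map height_arcsec="120" width_arcsec="120"/>
</obs_config>"""

def middle_layer(command, config_xml):
    """Intermediate layers validate only the small command verb and
    pass the XML payload through without interpreting it."""
    assert command in {"configure", "observe", "abort"}
    return subsystem(command, config_xml)

def subsystem(command, config_xml):
    """Only the terminal sub-system parses the configuration and
    acts on the detail it contains."""
    root = ET.fromstring(config_xml)
    instrument = root.find("instrument").get("name")
    return f"{command}: {instrument}"

result = middle_layer("configure", CONFIG)
```

Because the intermediate layers never interpret the payload, a new instrument mode needs changes only at the two ends, the preparation tool that writes the XML and the sub-system that reads it, which is the simplification the design aims for.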
UKIRT and JCMT, two very different telescopes, have embarked on several joint software projects covering all areas of observatory operations, such as observation preparation and scheduling, telescope control, and data reduction. In this paper we briefly explain the processes by which we have arrived at such a large body of shared code and discuss our experience with developing telescope-portable software and code re-use.
The steady improvement in telescope performance at UKIRT and the increase in data acquisition rates led to a strong desire for an integrated observing framework that would meet the needs of future instrumentation, as well as providing some support for existing instrumentation. Thus the Observatory Reduction and Acquisition Control (ORAC) project was created in 1997 with the goals of improving the scientific productivity of the telescope, reducing the overall ongoing support requirements, and eventually supporting the use of more flexibly scheduled observing. The project was also expected to achieve this within a tight resource allocation. In October 1999 the ORAC system was commissioned at the United Kingdom Infrared Telescope.
For the past seven years observing with the major instruments at the United Kingdom IR Telescope (UKIRT) has been semi-automated, using ASCII files to configure the instruments and then sequence a series of exposures and telescope movements to acquire the data. For one instrument, automatic data reduction completes the cycle. The emergence of recent software technologies has suggested an evolution of this successful system to provide a friendlier and more powerful interface to observing at UKIRT. The Observatory Reduction and Acquisition Control (ORAC) project is now underway to construct this system. A key aim of ORAC is to allow a more complete description of the observing program, including the target sources and the recipe that will be used to provide on-line data reduction. Remote observation preparation and submission will also be supported. In parallel, the observatory control system will be upgraded to use these descriptions for more automatic observing, while retaining the 'classical' interactive observing mode. The final component of the project is an improved automatic data reduction system, allowing on-line reduction of data at the telescope while retaining the flexibility to cope with changing observing techniques and instruments. The user will also automatically be provided with the scripts used for the real-time reduction to help provide post-observing data reduction support. The overall project goal is to improve the scientific productivity of the telescope, but it should also reduce the overall ongoing support requirements, and has the eventual goal of supporting the use of queue-scheduled observing.