The Virtual Observatory (VO) is realizing global electronic integration of astronomy data. One of the long-term goals of
the U.S. VO project, the Virtual Astronomical Observatory (VAO), is development of services and protocols that
respond to the growing size and complexity of astronomy data sets. This paper describes how VAO staff are active in
such development efforts, especially innovative strategies and techniques that recognize the limited operating budgets likely to be available to astronomers even as demand increases. The project has a program of professional outreach whereby
new services and protocols are evaluated.
The U.S. Virtual Astronomical Observatory (VAO; http://www.us-vao.org/) has been in operation since May 2010. Its goal is to enable new science through efficient integration of distributed multi-wavelength data. This paper describes the management and organization of the VAO, and emphasizes the techniques used to ensure efficiency in a distributed organization. Management methods include using an annual program plan as the basis for establishing contracts with member organizations, regular communication, and monitoring of processes.
Operation of the US Virtual Astronomical Observatory shares some issues with modern physical observatories, e.g.,
intimidating data volumes and rapid technological change, and must also address unique concerns like the lack of direct
control of the underlying and scattered data resources, and the distributed nature of the observatory itself. In this paper
we discuss how the VAO has addressed these challenges to provide the astronomical community with a coherent set of
science-enabling tools and services. The distributed nature of our virtual observatory, with data and personnel spanning geographic, institutional, and regime boundaries, is simultaneously a major operational headache and the primary science motivation for the VAO. Most astronomy today uses data from many resources. Facilitating the matching of heterogeneous datasets is a fundamental reason for the virtual observatory. Key aspects of our approach
include continuous monitoring and validation of VAO and VO services and the datasets provided by the community,
monitoring of user requests to optimize access, caching for large datasets, and providing distributed storage services that allow users to collect results near large data repositories. Some elements are now fully implemented, while others are
planned for subsequent years. The distributed nature of the VAO requires careful attention to what can be a
straightforward operation at a conventional observatory, e.g., the organization of the web site or the collection and
combined analysis of logs. Many of these strategies use and extend protocols developed by the international virtual
observatory community. Our long-term challenge is working with the underlying data providers to ensure high-quality implementations of VO data access protocols (new and better 'telescopes'), assisting astronomical developers to build robust integrating tools (new 'instruments'), and coordinating with the research community to maximize the science return.
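The 'new and better telescopes' here are standard data-access services such as the IVOA Simple Cone Search, in which a positional query is an HTTP GET carrying RA, DEC, and SR (search radius) parameters in decimal degrees, and the response is a VOTable of matching sources. A minimal sketch of building such a request (the endpoint URL below is a placeholder, not a real service):

```python
from urllib.parse import urlencode

def cone_search_url(base_url, ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search request URL.

    The service is expected to return a VOTable of sources within
    radius_deg of (ra_deg, dec_deg). All values are decimal degrees.
    """
    params = {
        "RA": f"{ra_deg:.6f}",
        "DEC": f"{dec_deg:.6f}",
        "SR": f"{radius_deg:.6f}",
    }
    # Append to any query string the base URL already carries.
    sep = "&" if "?" in base_url else "?"
    return base_url + sep + urlencode(params)

# Hypothetical endpoint: a 0.1-degree search around (187.25, 2.05).
url = cone_search_url("https://example.org/scs", 187.25, 2.05, 0.1)
```

Because the protocol is just parameterized HTTP, the same request can be issued by a browser, a script, or a service validator, which is what makes the continuous monitoring described above practical.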
Broad support for Virtual Observatory (VO) standards by astronomical archives is critical for the success of the
VO as a research platform. Indeed, a number of effective data discovery, visualization, and integration tools
have been created which rely on this broad support. Thus, to an archive, the motivation for supporting VO
standards is strong. However, we are now seeing a growing trend among archive developers towards leveraging
VO standards and technologies not just to provide interoperability with the VO, but also to support an archive's
internal needs and the needs of the archive's primary user base. We examine the motivation for choosing VO
technologies for implementing an archive's functionality and list several current examples, drawn from the
Hubble Legacy Archive, the NASA HEASARC, NOAO, and NRAO. We also speculate on the effect that the VO
will have on some of the ambitious observatory projects planned for the near future.
The International Virtual Observatory Alliance (IVOA: http://www.ivoa.net) represents 14 international projects working in coordination to realize the essential technologies and interoperability standards necessary to create a new research infrastructure for 21st century astronomy. This international Virtual Observatory will allow astronomers to interrogate multiple data centres in a seamless and transparent way, will provide new powerful analysis and visualisation tools within that system, and will give data centres a standard framework for publishing and delivering services using their data. The first step for the IVOA projects is to develop the standardised framework that will allow such creative diversity. Since its inception in June 2002, the IVOA has already fostered the creation of a new international and widely accepted, astronomical data format (VOTable) and has set up technical working groups devoted to defining essential standards for service registries, content description, data access, data models and query languages following developments in the grid community. These new standards and technologies are being used to build science prototypes, demonstrations, and applications, many of which have been shown in international meetings in the past two years. This paper reviews the current status of IVOA projects, the priority areas for technical development, the science prototypes and planned developments.
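To illustrate the kind of interoperability VOTable enables, the sketch below parses a hand-written, deliberately minimal VOTable using only the Python standard library. The field names and values are invented for the example, and real responses carry XML namespaces and much richer metadata that this toy reader ignores:

```python
import xml.etree.ElementTree as ET

# A minimal VOTable with one table of two sources (illustrative only;
# real documents declare a namespace and additional metadata).
VOTABLE = """<?xml version="1.0"?>
<VOTABLE version="1.3">
 <RESOURCE>
  <TABLE>
   <FIELD name="ra" datatype="double" unit="deg"/>
   <FIELD name="dec" datatype="double" unit="deg"/>
   <DATA><TABLEDATA>
    <TR><TD>187.25</TD><TD>2.05</TD></TR>
    <TR><TD>187.30</TD><TD>2.10</TD></TR>
   </TABLEDATA></DATA>
  </TABLE>
 </RESOURCE>
</VOTABLE>"""

def read_votable(text):
    """Return (field_names, rows) from a simple un-namespaced VOTable."""
    root = ET.fromstring(text)
    names = [field.get("name") for field in root.iter("FIELD")]
    rows = [[td.text for td in tr] for tr in root.iter("TR")]
    return names, rows

names, rows = read_votable(VOTABLE)
```

The point of the standard is exactly this uniformity: any compliant service's table can be consumed by the same small amount of client code, regardless of which data centre produced it.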
The U.S. National Science Foundation is sponsoring the development of the infrastructure for the National Virtual Observatory via its Information Technology Research Program. This initiative combines expertise from astronomical observatories, data archive centers, university astronomy departments, and computer science and information technology groups at seventeen different organizations. This paper describes the nature of the project, our approach to managing and coordinating work across such a large collaboration, and the progress made thus far in the initial development activities (metadata standards, systems architecture, and science requirements definition).
Building an automated classifier for high-energy sources provides an opportunity to prototype approaches to building the Virtual Observatory with a substantial immediate scientific return. The ClassX collaboration is combining existing data resources with trainable classifiers to build a tool that classifies lists of objects presented to it. In our first year the collaboration has concentrated on developing pipeline software that finds and combines information of interest and on exploring the issues that must be addressed for successful classification.
ClassX must deal with many key VO issues: automating access to remote data resources, combining heterogeneous data and dealing with large data volumes. While the VO must attempt to deal with these problems in a generic way, the clear science goals of ClassX allow us to act as a pathfinder exploring particular approaches to addressing these issues.
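A trainable classifier in this setting can be sketched very simply. The toy nearest-centroid rule below learns one prototype per source class from labeled feature vectors (for instance, flux ratios across bands) and assigns each new object to the nearest prototype. The class names and feature values are invented for illustration, and the actual ClassX classifiers are more sophisticated than this:

```python
from collections import defaultdict
from math import dist

class NearestCentroid:
    """Toy trainable classifier: one mean feature vector per class."""

    def fit(self, X, y):
        # Group the training vectors by label, then average each group.
        groups = defaultdict(list)
        for features, label in zip(X, y):
            groups[label].append(features)
        self.centroids = {
            label: [sum(col) / len(col) for col in zip(*vecs)]
            for label, vecs in groups.items()
        }
        return self

    def predict(self, X):
        # Assign each object to the class with the closest centroid.
        return [
            min(self.centroids, key=lambda c: dist(self.centroids[c], x))
            for x in X
        ]

# Hypothetical training set: two feature dimensions, two classes.
clf = NearestCentroid().fit(
    [[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]],
    ["star", "star", "agn", "agn"],
)
```

The pipeline aspect ClassX emphasizes, automatically fetching and merging the heterogeneous inputs that feed such a classifier, is the genuinely hard VO problem; the classification rule itself can start this simple.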
The goal of the virtual observatory is to provide seamless, efficient access to astronomical data archives, catalogs, bibliographic services, and computational resources worldwide. This goal can be realized only through the development of a sophisticated information technology infrastructure. The infrastructure must accommodate the integration of diverse data types from potentially thousands of sites and services, capitalizing on emerging computational grid technologies for resource allocation and management. This paper describes the major IT challenges facing the virtual observatory and suggests a middleware architecture capable of supporting its scientific objectives.
The Hubble Data Archive at the Space Telescope Science Institute contains over 4.3 TB of data, primarily from the Hubble Space Telescope, but also from complementary space-based and ground-based facilities. We are in the process of upgrading and generalizing many of the HDA's component systems, developing tools to provide more integrated access to the HDA holdings, and working with other major data-providing organizations to implement global data-location services for astronomy and other space science disciplines. This paper describes the key elements of our archiving and data distribution systems, including a planned transition to DVD media, data compression, data segregation, on-the-fly calibration, an engineering data warehouse, and distributed search and retrieval facilities.
The Hubble Space Telescope image quality was degraded significantly as a result of the spherical aberration in the primary mirror. Repairs made during the first servicing mission have restored HST's optical performance to the original design specifications. This paper examines not only how effective image restoration has been for the data from the first three and a half years of the mission, but also demonstrates the applicability of the same deconvolution algorithms to post-servicing mission images. A variety of well-known image restoration techniques, such as Wiener filters, Richardson-Lucy, Jansson-van Cittert, CLEAN, and maximum entropy, have been applied to HST imaging with reasonable success. However, all techniques have been hampered by incomplete knowledge of the point spread function (PSF) and the space variance of the PSF in the Wide Field/Planetary Camera, HST's most frequently used imager. After the servicing mission, the COSTAR-corrected Faint Object Camera exhibits a space-variant PSF. Restoration is further complicated by time variance of the PSF resulting from both long-term and short-term (orbital period) changes in focus, problems which are not likely to be ameliorated by the servicing mission. Thus, it will be important to continue the efforts in modeling the HST optical system and apply deconvolution methods in order to attain the best scientific results from the telescope.
The Hubble Space Telescope image quality is degraded significantly as a result of the spherical aberration in the primary mirror. Although the forthcoming HST servicing mission will deploy corrective optics systems and a second-generation camera with built-in correctors, the current and archival images from the past three years require the use of restoration techniques in order to achieve the full scientific potential of the HST mission. In addition, we expect that the restoration techniques now being developed will continue to be utilized on post-servicing mission imagery in order to remove the remaining diffraction features and optimize dynamic range. A variety of well-known image restoration techniques, such as Wiener filters, Richardson-Lucy, Jansson-van Cittert, CLEAN, and maximum entropy, have been applied to HST imaging with reasonable success. However, all techniques have been hampered by incomplete knowledge of the point spread function (PSF) and the space variance of the PSF in the Wide Field/Planetary Camera, HST's most frequently used imager. The best restoration results to date have been obtained by utilizing observed PSFs (an optimal exposure which minimizes noise and image defects) on small enough subsets of the total field of view so that the PSF variation can be ignored.
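For reference, the Richardson-Lucy method named in the abstracts above is an iterative multiplicative update: the current estimate is blurred with the PSF, the observation is divided by that blur, and the ratio is correlated with the (flipped) PSF to correct the estimate. The following one-dimensional, pure-Python sketch shows the bare algorithm on a blurred point source; real HST restorations are of course two-dimensional and use measured, spatially varying PSFs:

```python
def convolve_same(x, psf):
    """'Same'-size discrete convolution with a centered, odd-length PSF."""
    half = len(psf) // 2
    out = []
    for i in range(len(x)):
        total = 0.0
        for j, p in enumerate(psf):
            k = i + j - half
            if 0 <= k < len(x):
                total += p * x[k]
        out.append(total)
    return out

def richardson_lucy(observed, psf, iterations=50):
    """1-D Richardson-Lucy deconvolution via the multiplicative update."""
    estimate = list(observed)        # initialize from the observed data
    flipped = psf[::-1]              # PSF reflected about its centre
    for _ in range(iterations):
        blurred = convolve_same(estimate, psf)
        # Guard against division by zero in empty regions.
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve_same(ratio, flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# A point source blurred by a triangular PSF...
psf = [0.25, 0.5, 0.25]
blurred = convolve_same([0, 0, 4, 0, 0], psf)   # gives [0, 1, 2, 1, 0]
# ...is sharpened back toward the original delta function.
restored = richardson_lucy(blurred, psf)
```

Two properties visible even in this toy: the update conserves total flux for a normalized PSF, and convergence slows as the estimate approaches the true distribution, which is why iteration counts matter in practice.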