High-cadence imaging is required in several astronomical scenarios. These include studies of rapidly varying
sources, observations of bright objects at maximum signal-to-noise, and high-dynamic-range scenarios
such as faint objects embedded in a crowded field of bright ones. Conventional CCDs have drawbacks in this
regime, because many short exposures are needed (either for time-series sampling, or simply to avoid saturation),
with extended readout times between exposures. Consequently, the duty cycle (the fraction of elapsed time spent
actually exposing) worsens dramatically as exposures get shorter. Low Light Level CCDs (L3-CCDs), by contrast, offer low
readout noise, high readout rates, and an essentially 100% duty cycle. Coupled with a fast frame-transfer mechanism (~2ms
to shift the image to the storage area), an L3-CCD can sustain essentially continuous open-shutter time. Our
models demonstrate that, for a fixed observing time, the L3-CCD delivers better signal-to-noise performance
in the high-cadence imaging regime than a comparable conventional CCD, even when the latter's performance is
optimised by windowing and binning. We also demonstrate that the improved duty cycle reduces the photometric
impact of atmospheric scintillation for any given telescope aperture. We outline the integration of an L3-CCD
into our camera system for high-cadence imaging.
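The duty-cycle argument above can be sketched numerically. The following is a rough illustration only, not our actual model: `snr_conventional`, `snr_l3ccd`, and every parameter value below are placeholders we have chosen for demonstration (Poisson noise plus read noise for the conventional case; negligible read noise but a √2 excess noise factor from the stochastic electron-multiplication gain for the L3-CCD).

```python
import math

def snr_conventional(flux, sky, t_exp, t_read, read_noise, total_time):
    """SNR from co-adding many short exposures on a conventional CCD.
    flux and sky are in e-/s; read_noise in e-; times in seconds."""
    n_frames = total_time / (t_exp + t_read)   # frames that fit in total_time
    signal = n_frames * flux * t_exp
    noise = math.sqrt(n_frames * (flux * t_exp + sky * t_exp + read_noise**2))
    return signal / noise

def snr_l3ccd(flux, sky, total_time, excess=math.sqrt(2.0)):
    """SNR for an L3-CCD: ~100% duty cycle and negligible read noise,
    but an excess noise factor from the stochastic EM gain."""
    signal = flux * total_time
    noise = excess * math.sqrt((flux + sky) * total_time)
    return signal / noise
```

With short exposures (e.g. 50 ms) and a readout overhead comparable to the exposure itself, the conventional CCD spends most of the observation not collecting photons, and the L3-CCD sketch above wins despite its excess noise factor.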
Abundance variations of carbon and nitrogen in globular star clusters provide astronomers with a means to probe a cluster's evolutionary past. Moreover, these clusters are so ancient (~13 billion years) and so well preserved that they provide an ideal diagnostic for the overall chemical history of the Milky Way Galaxy.
Traditionally, spectroscopy is the preferred method for such investigations. However, it is not without drawbacks: spectra can normally only be obtained star by star, and both large telescopes and a great deal of observing time are required. As globular clusters are known to
contain up to a million stars, studying each star individually would take too long to yield a truly representative sample of the cluster's stars. We opt instead for a spectrophotometric technique and a statistical approach to infer a cluster's composition variations. This has required the design and use of new custom narrow-band filters centered on the CH and CN molecular absorption bands or their adjacent continua. Two Galactic clusters (M71 & M92) with contrasting characteristics have been chosen for this study. To process these data, a header-driven (i.e. automated) astronomical data-processing pipeline was developed for use with a
family of CCD instruments known as the FOSCs. The advent of CCD detectors has allowed astronomers to generate large quantities of raw data on a nightly basis, but processing this volume of data is extremely time- and resource-intensive. In our case, the majority of our cluster data were obtained using the BFOSC instrument on the 1.52m Cassini Telescope at Loiano, Italy. However, as there are a number of these FOSC instruments throughout the world, our pipeline can be easily adapted to suit any of them. The pipeline has been tested on various types of data, ranging from brown dwarf stars to globular cluster images, with each new dataset presenting new problems to solve. The pipeline performs tasks such as data reduction (including image de-fringing), image registration, and photometry, with final products consisting of RGB colour images and colour-magnitude diagrams (CMDs).
We describe how instrument data-processing pipelines can be quickly and easily developed using the modularity, header-manipulation, and scripting features of the IRAF suite. Our illustrative case is the design of a simple IRAF-based reduction and analysis pipeline for the BFOSC instrument on the 1.52m Cassini Telescope at Loiano, run by the Osservatorio Astronomico di Bologna. On the basis of header keywords, raw frames are automatically processed in a series of steps: grouping by any observational parameter(s), CCD reduction, registration, coaddition, photometry, deconvolution, RGB-tricolour representation, and basic astrometry, with spectroscopy support partially implemented at present. In this way, FITS data can be automatically analysed from raw frames to "end product" (of final or near-final scientific quality) while still "at the telescope", enabling much faster feedback. Since the xFOSC family of instruments produced by the Astronomical Observatory of Copenhagen, which includes BFOSC, shares an essentially identical design and operation, it should be simple to adapt the pipeline to any of the ten FOSC instruments, such as DFOSC on the ESO/Danish 1.54m, ALFOSC on the Nordic Optical Telescope, and TFOSC on the new TT1 (Castelgrande) Telescope. We also aim to make it available for "on the fly" archival processing.
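The header-driven grouping step can be sketched in a few lines. The actual pipeline uses IRAF tasks; the following is an illustrative stand-in in which FITS headers are represented as plain dictionaries, and the function name `group_frames` and the example keywords are our own placeholders.

```python
from collections import defaultdict

def group_frames(headers, keys):
    """Group observations by any combination of header keywords,
    e.g. ('OBJECT', 'FILTER'). Each header is a dict here; in the
    real pipeline the values would come from FITS headers."""
    groups = defaultdict(list)
    for h in headers:
        groups[tuple(h.get(k) for k in keys)].append(h)
    return dict(groups)
```

Grouping by `('OBJECT', 'FILTER')` separates, say, M71 CH-band frames from M71 CN-band frames for independent reduction and coaddition, while grouping by `('OBJECT',)` alone would pool all frames of a target.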
To take advantage of the recent upsurge in astrophysical applications of grid technologies, coupled with the increased temporal and spatial coverage afforded by dedicated all-sky surveys and on-line data archives, we have developed an automated image reduction and analysis pipeline for a number of different astronomical instruments. The primary science goal of the project is the study of long-term optical variability of brown dwarfs, although the pipeline can be tailored to suit many other astrophysical phenomena. The pipeline complements Querator, the custom search engine which accesses the astronomical image archives based at the ST-ECF/ESO centre in Garching, Germany. To increase our dataset, we complement the reduction and analysis of WFI (Wide Field Imager, mounted on the 2.2-m MPG/ESO telescope at La Silla) archival images with the analysis of pre-reduced co-spatial HST/WFPC2 images and near-infrared images from the DENIS archive. Our pipeline includes CCD-image reduction, registration, astrometry, photometry, and image-matching stages. We present sample results from all stages of the pipeline and describe how we overcome such problems as missing or incorrect image meta-data, interference fringing, and poor image calibration files. The pipeline was written using tasks contained in the IRAF environment, linked together with Unix shell scripts and Perl, and the image reduction and analysis is performed on a 40-processor SGI Origin 3800 at NUI, Galway.
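One of the problems mentioned above, interference fringing, is commonly handled by scaling and subtracting a master fringe frame. As a minimal sketch of that idea (the function `defringe` and its least-squares scaling are our own illustration, not the pipeline's IRAF implementation):

```python
import numpy as np

def defringe(image, fringe):
    """Fit the amplitude of a zero-mean master fringe pattern to a
    science image by least squares, then subtract it."""
    f = fringe - fringe.mean()            # ensure zero-mean fringe template
    im = image - np.median(image)         # remove sky/background pedestal
    scale = np.sum(im * f) / np.sum(f * f)
    return image - scale * f, scale
```

Because the template is forced to zero mean, the fitted `scale` is insensitive to the sky level, and the correction leaves the image background untouched.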
As the astronomical community continues to produce deeper and higher-resolution data, it becomes increasingly important to provide tools that help scientists mine the data and return only the scientifically interesting images. For uncalibrated archives this task is especially difficult, as one cannot know whether an interesting source is visible in an image without actually inspecting it. Here, we show how instrument simulation can be used to lightly process the database-stored image descriptors of the ESO Wide Field Imager (WFI) archive and compute the corresponding limiting magnitudes. The end result is a more scientific description of the ESO/ST-ECF archive contents, allowing a more astronomer-friendly archive user interface and hence increasing the archive's usability in the context of a Virtual Observatory. This method was developed to improve the Querator search engine of the ESO/HST archive, in the context of the EC-funded ASTROVIRTEL project, but it also provides an independent tool that can be adapted to other archives.
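The flavour of such a descriptor-level computation can be illustrated with a textbook sky-limited estimate. This is not the Querator implementation: the function `limiting_magnitude`, its signature, and the assumption of a sky-noise-dominated point source in a seeing-sized aperture are all ours.

```python
import math

def limiting_magnitude(zeropoint, sky_rate, t_exp, fwhm, pixscale, nsigma=5.0):
    """Sky-limited point-source limiting magnitude from header-level
    descriptors: photometric zeropoint (mag for 1 e-/s), sky rate
    (e-/s/pixel), exposure time (s), seeing FWHM and pixel scale
    (both in arcsec)."""
    npix = math.pi * (fwhm / pixscale) ** 2      # ~seeing-disc area in pixels
    noise = math.sqrt(sky_rate * t_exp * npix)   # sky photon noise (e-)
    flux_lim = nsigma * noise / t_exp            # e-/s needed for S/N = nsigma
    return zeropoint - 2.5 * math.log10(flux_lim)
```

Since the noise grows only as the square root of the exposure time, the limit deepens by about 0.75 mag for each doubling of exposure time in this sky-limited regime.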
A new modular high-time-resolution imaging camera system with sub-microsecond timing accuracy has been built in the Physics Department of NUI, Galway. The system was designed to be mounted on large telescopes for observing the temporal, spectral, and polarisation characteristics of faint astronomical objects, such as optical pulsars. The camera system allows simultaneous and independent observation of multiple wavebands of emission from the target objects, achieved using optics that split images into their different spectral or polarisation components. The system currently incorporates a multi-anode microchannel array (MAMA) photon-detecting and imaging camera with a time resolution of up to 100ns, combined with three high-quantum-efficiency avalanche photodiodes (APDs) with count rates of up to 16 million photons per second. The high-time-resolution recording system allows telescope tracking inaccuracy and wind shake to be removed off-line, yielding better PSFs for bright, crowded targets such as globular star clusters. This combination of different detectors allows the system to be operated as a multi-purpose, high-QE, high-time-resolution instrument. The modular nature of the electronics design also allows detectors to be added or removed without limiting the performance of other elements within the system. The data path is designed so that archiving integrity is maintained while it is simultaneously used by real-time analysis and display systems. Future applications are envisaged in the bio-medical imaging sector for high-time-resolution fluorescence imaging, as well as in astronomical polarisation studies.
The EGRET gamma-ray telescope has left a legacy of unidentified astronomical sources. Most likely, many of the Galactic-plane sources will be rotation-powered pulsars. Firm identification has been difficult, given the instrument's poor spatial resolution; the problem is exacerbated by the energy-dependent point spread function (PSF) and low numbers of source counts. The main method of identifying sources to date has been maximum likelihood. We have taken a different approach, namely regularized deconvolution with a spatially invariant PSF, as used in optical astronomy and medical X-ray imaging. This technique revealed that wavelet denoising of the residuals produces smooth, relatively artefact-free images with improved spatial location. Standard centroiding on the deconvolved images improved relative source location by factors ranging from 10:1 to 2:1, in proportion to source strength. Wavelet deconvolution simultaneously achieves background smoothing while improving the sharpness of the resolved objects. The photon-sparse nature of these images makes them an ideal test bed for such techniques. Although deconvolution does not ordinarily conserve flux, in this instance flux determination is unaffected in all but the most crowded regions. Finally, we show that the energy-dependent PSF can be used to identify objects with a restricted range of energy spectra.
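A standard deconvolution scheme for photon-counting data with a spatially invariant PSF is Richardson-Lucy. The sketch below is a generic textbook version, not our regularized pipeline (it omits the wavelet denoising of residuals described above); using circular FFT convolutions makes each iteration conserve total flux exactly, which relates to the flux-conservation remark in the abstract.

```python
import numpy as np

def rl_deconvolve(data, psf, n_iter=50):
    """Richardson-Lucy deconvolution with a spatially invariant PSF.
    Circular (FFT) convolution is used so each iteration conserves
    the total image flux. The PSF is assumed centred in its array."""
    psf = psf / psf.sum()
    otf = np.fft.fft2(np.fft.ifftshift(psf))   # PSF moved to the origin
    est = np.full(data.shape, data.mean())     # flat positive starting guess
    for _ in range(n_iter):
        model = np.real(np.fft.ifft2(np.fft.fft2(est) * otf))
        ratio = data / np.maximum(model, 1e-12)
        # multiplicative update: correlate the ratio with the PSF
        est = est * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
    return est
```

On a blurred point source the iterations progressively re-concentrate the flux into a sharper peak, which is what improves centroid-based source location.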
There is a family of difficult image-processing scenarios which
involve seeking out and quantifying minute changes within a sequence
of near-identical images. Traditionally these have been dealt with by
carefully registering the images in terms of position, orientation
and intensity, and subtracting them from some template image. However, for critical measurements, this approach breaks down if the
point-spread-functions (PSFs) vary even slightly from image to
image. Subtraction of registered images whose PSFs are not matched
leads to considerable residual structure, which may be mistakenly
interpreted as real features rather than processing artefacts. In
astronomy, software known as ISIS has been developed to
fully PSF-match image sequences and to facilitate their analysis. We
show here the tremendous improvement in detection rates and
measurement accuracy which ISIS has afforded in our program for the
study of rare variable stars in dense, globular star clusters. We
discuss the genesis from this work of our new program to use ISIS to
search for extra-solar planets in transit across the face of stars in
such clusters. Finally we illustrate an application of ISIS in the
industrial imaging sector, showing how it can be used to detect minute faults in images of products.
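The core of the ISIS approach (Alard & Lupton's optimal image subtraction) is to fit a convolution kernel that maps the reference PSF onto each image's PSF before subtracting. As a heavily simplified sketch, assuming a single constant kernel over the whole frame (ISIS itself uses a spatially varying kernel built from Gaussian basis functions):

```python
import numpy as np

def fit_kernel(ref, target, half=1):
    """Least-squares fit of a (2*half+1)^2 convolution kernel mapping
    the reference image onto the target: a constant-kernel
    simplification of the ISIS / Alard-Lupton method."""
    size = 2 * half + 1
    cols = [np.roll(ref, (dy, dx), axis=(0, 1)).ravel()
            for dy in range(-half, half + 1)
            for dx in range(-half, half + 1)]
    A = np.stack(cols, axis=1)               # one column per kernel pixel
    k, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
    return k.reshape(size, size)

def difference_image(ref, target, half=1):
    """PSF-match ref to target with the fitted kernel, then subtract.
    Anything left in the difference is genuine change, not a PSF artefact."""
    k = fit_kernel(ref, target, half)
    matched = sum(k[dy + half, dx + half] * np.roll(ref, (dy, dx), axis=(0, 1))
                  for dy in range(-half, half + 1)
                  for dx in range(-half, half + 1))
    return target - matched
```

When the target really is the reference seen through a slightly different PSF, the fitted kernel absorbs the mismatch and the difference image is essentially flat; a variable star or transit then stands out as a clean residual at a single position.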