A wide variety of electronic imaging techniques is now available for producing images and spatially resolved spectra in the far- and middle-ultraviolet wavelength ranges below 300 nm. These are basically similar to techniques used in the near-UV and visible. They include devices with electronic readout and devices whose final images are produced on film. Radiometric imaging, however, requires special consideration of the photometric qualities of the sensor and of the means for calibrating the images, that is, correlating some measure of image intensity with the intensity or integrated flux of input radiation. As is true in most other wavelength ranges, calibration techniques can be based on (a) standard light sources, whose intensities vs. wavelength are known, or (b) standard comparison detectors, whose detection efficiencies vs. wavelength are known. Our group at NRL has been developing ultraviolet cameras and spectrographs, for space science and for DoD applications, primarily for the far-ultraviolet (below 200 nm) range. These record images electrographically (on electron-sensitive film) or electronically (using electron-bombarded CCD arrays). Our calibration techniques are primarily based on standard detectors, including calibrated photodiodes and windowless gas photoionization chambers, although we have occasionally used the SURF facility at NBS as a standard light source.
The radiometric errors introduced by the optical components of a high-performance imaging infrared radiometer have been quantified both theoretically and experimentally. The excellent agreement obtained has enabled an algorithm to be implemented within the system processing electronics to "correct" the apparent temperature of the displayed, internal temperature references for such radiometric errors. Corrections have been evaluated both for the scanner unit with a wide-angle window and with an optional x20 telescope. The radiometer under discussion is the Dual Waveband Imaging Radiometer (DUWIR), which has proved to be a very successful data collection tool due to its unique combination of a high-performance imager offering simultaneous 3-5 μm and 8-12 μm TV-compatible outputs together with internal temperature references providing real-time temperature calibration. The prototype has been used extensively for IR signature collection over the past three years, and production units are now in use in several countries.
This paper describes various methods for the calibration and error estimation of imaging radiometers. The methodologies are presented in the form of specific examples of the calibration of the Naval Research Laboratory's airborne infrared (IR) radiometric measurement system. Two imaging radiometers are part of this system and are used to perform radiometric measurements on a variety of targets and backgrounds over a spectral range of 3μm to 12μm. The calibration procedures for these radiometers are presented along with an extensive error analysis which examines in detail 11 different sources of error. Such an error analysis clearly illustrates the need for proper calibration procedures.
This paper presents a review of the technical characteristics of the Dual Color Radiometer along with recent data and test results. The Dual Color Radiometer is a state-of-the-art device that provides simultaneous, pixel-to-pixel registered thermal imagery in both the 3 to 5 and 8 to 12 micron regions. The device is unique in its resolution: better than 0.10°C in temperature and 0.10 milliradian spatially. In addition, the device is tailored for use by the Automatic Target Recognizer (ATR) community. This paper is a follow-up to the paper (New High Sensitivity, High Resolution Dual Color Calibrated Imaging Radiometer) published in SPIE Proceedings, Volume 940, dated April 1988. Whereas the previous paper dealt with design and special features, this paper will describe the laboratory tests and present the associated performance results for both the 3 to 5 and 8 to 12 micron spectral regions in the following areas:
• Minimum resolvable temperature
• Modulation transfer function
• Video imagery
Oriel's new diode array radiometer consists of a flat focal plane spectrograph and a diode array detector, interfaced to a standard XT or AT type computer. The spectrograph has interchangeable gratings and can be set for almost any wavelength range from 180 to 1100 nm. The diode array detector is thermally stabilized to 0.1°C and can acquire an entire spectrum in only 5 milliseconds; it can take a sequence of scans and can follow the time course of an optical change. The results can be graphically displayed in absolute energy units versus wavelength or time. Radiometric capability is achieved by using an NIST-traceable light source to calibrate the system in situ for the wavelength region of interest. This takes into account front-end fiber optic guides, optical filters, and spectrograph efficiency. Extensive software semi-automates this process, providing automatic calibration of the radiometer against the output energy curve of the source. The radiometer can accurately quantify exposure levels of UV-A and UV-B radiation, more accurately than the approximate values derived using optical filters. The system can also simulate other types of detectors defined by the user, and the data can be presented in a variety of photometric units. The diode array detector is 25 mm long and can be used for linear radiometric imaging when used with bandpass filters. A camera and an assortment of lenses may be used in conjunction with the diode array detector for telephoto, wide-angle, or micro imaging.
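The in-situ calibration described above amounts to forming, at each wavelength, a ratio between the known output of the traceable source and the raw detector counts, then applying those ratios to subsequent measurements. A minimal sketch of that idea (the function names and flat-list data layout are illustrative assumptions, not Oriel's actual software):

```python
def calibration_factors(source_output, raw_counts):
    """Per-wavelength factors converting raw counts to absolute energy units.

    source_output: known spectral output of the traceable calibration lamp
    raw_counts:    diode-array counts recorded while viewing that lamp
    """
    return [out / counts for out, counts in zip(source_output, raw_counts)]

def apply_calibration(factors, raw_counts):
    """Convert a measured raw spectrum to absolute units."""
    return [f * c for f, c in zip(factors, raw_counts)]
```

Because the factors are formed with the fiber guides and filters in place, the same ratios automatically account for their spectral losses when applied to later measurements.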
A radiometer has been designed for precision coherent radiation measurements and tested for long-term repeatability at wavelengths of 488 and 633 nm. The radiometer consists of a single high-quality PN silicon photodiode maintained in a nitrogen atmosphere with a quartz window designed to eliminate interference problems. Ratio measurements between the radiometer and an absolute-type detector were made over a period of 215 days. At 0.5 mW, the standard deviations were 0.008% and 0.009% at 488 and 633 nm, respectively. The maximum deviations from the mean were 0.016% and 0.015% at the respective wavelengths. The high precision, simplicity, and portability of the radiometer make it an excellent transfer standard for radiometric measurements.
The information presented in this report describes an instrument used for precision measurements of detector spectral response and spatial response. Emphasis will be placed on detector spatial uniformity measurements. To allow spatial uniformity testing at selected wavelengths, an instrument was designed by applying existing spectral response instrumentation technology with the addition of special exit optics, a dual-axis motorized positioning table, and supporting software. Supporting components consisted of a computer-controlled radiometer and a monochromator with a high-intensity light source attached. Spectral response is determined by measuring the photosensitivity of a stationary specimen to the irradiance of a calibrated monochromatic light source, at evenly spaced intervals over the wavelength range of interest. These data are presented pictorially by graphing the RESPONSE versus the WAVELENGTH. Detector spatial response is determined by measuring the variation in photosensitivity over the surface of the test detector by moving the detector in an X,Y grid at evenly spaced intervals under a small monochromatic spot of light. Several versions of the instrument were built, and test results are provided which represent data from the spatial uniformity testing of Ge, PbS, and PbSe detectors. The acquired data are presented as a 3-dimensional surface map by plotting the RESPONSE versus the X POSITION versus the Y POSITION.
A new reference spectrophotometer is being developed at the National Research Council of Canada (NRCC) for high-accuracy transmittance measurements over the spectral range 200 to 2500 nm. This computerized instrument is a single-beam design based upon a servomotor-driven double monochromator with a wavelength resolution of 0.022 nm. The other main components are: (1) two interchangeable sources (deuterium and tungsten-halogen), (2) two computer-selectable TE-cooled detectors (GaAs photomultiplier tube and PbS cell), (3) all-reflective input and exit optics, and (4) a large sample compartment. The significant feature of the optical system is a large, well-collimated measurement beam: the angle of convergence is 0.7° for a slit height of 7 mm, and the maximal beam size is 37x20 mm. This beam geometry eliminates the need for polarization corrections (using linearly polarized light) or compensation for spatially non-uniform detector responsivity (using an averaging sphere). The paper describes the instrument design and presents data from preliminary performance tests. The systematic and random sources of error that have been investigated include: wavelength accuracy and reproducibility, bandpass, beam uniformity, polarization, stray light, system drift, and linearity. A new linearity tester has been developed for checking the photometric accuracy. This automated device is based upon the double-aperture method but takes advantage of high-precision piezoelectric motors to give a single adjustable aperture. Transmittance measurements of several neutral-density glass filters at 546 nm demonstrate that the photometric precision is better than 0.01% of the measured value and that the photometric accuracy is a few parts in 10^4. The wavelength scale is accurate to better than ±0.1 nm from 300 to 2500 nm, and is reproducible within ±0.03 nm.
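The double-aperture linearity method mentioned above rests on simple additivity: with apertures A and B opened separately and then together, a perfectly linear system satisfies S_AB = S_A + S_B, and any departure from that sum measures nonlinearity. A minimal sketch of the resulting figure of merit (the function name is an illustrative assumption, not NRCC's software):

```python
def double_aperture_nonlinearity(s_a, s_b, s_ab):
    """Fractional departure from additivity; zero for a perfectly linear detector.

    s_a, s_b: signals recorded with each aperture open alone
    s_ab:     signal recorded with both apertures open together
    """
    return (s_a + s_b - s_ab) / s_ab
```

Repeating the measurement at several flux levels maps the nonlinearity over the instrument's working range, which is what permits the photometric accuracy claim to be checked independently of any reference filter.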
Two specially constructed integrating sphere calibration sources are described. The sources are being used for prelaunch test and calibration of two different imaging radiometers used as satellite-borne earth remote sensors. The calibration sources are primarily intended to serve as transfer standards of spectral radiance. Design criteria for each integrating sphere are presented. Included is a synopsis of the end-users' specifications regarding spectral radiance, incremental levels of attenuation, source aperture and sterance uniformity. Radiometric theory used to predict the resultant integrating sphere spectral radiance for a specific input flux is summarized. Characteristic design features for both sources are highlighted including a description of the DC power supply systems driving source operation. Calibration methods and instrumentation are described. Resultant data are presented for both sources including a comparison of predicted and measured spectral radiances, luminance uniformity mappings, and stability tests. Methods of data reduction and uncertainty analysis are addressed when applicable.
Techniques for improving the knowledge of the radiance of large area spherical and hemispherical integrating energy sources have been investigated. Such sources are used to calibrate numerous aircraft and spacecraft remote sensing instruments. Comparisons are made between using a standard source based calibration method and a quantum efficient detector (QED) based calibration method. The uncertainty involved in transferring the calibrated values of the point source standard lamp to the extended source is estimated to be 5 to 10%. The use of the QED allows an improvement in the uncertainty to 1 to 2% for the measurement of absolute radiance from a spherical integrator source.
The development of a computer-controlled system which enables testing both spectrally and goniometrically will be discussed. Built around a multi-purpose spectroradiometer system, the exit optics attachment allows users to test samples by varying both incident and measurement angles. The general spectroradiometer system and the individual attachment will be highlighted. Originally developed for measuring the reflectance of specular samples from 200 nm to 30 μm, this instrument may also be used for other types of spectral measurements, including diffuse reflectance and transmittance. The above measurement areas will be described, incorporating test sample graphs and system performance data.
A class of highly Lambertian, environmentally stable reflectance materials varying in reflectance from 2% to 99% has been produced. These materials offer the advantages of hydrophobicity, chemical inertness, spectral flatness, and ease of maintenance over materials currently in use.
A new radiometer, the IR Background Discrimination Radiometer (BDR), has been developed which discriminates small differences between an object and its surrounding background and is able to measure an object's changing contrast when i) the contrast of a moving object is to be measured against a changing background, or ii) the difference in radiant emittance of a small object against its background, or of two objects with respect to each other, is small compared to the emittance itself. Practical examples of such measurements are contrast measurements of airplanes and missiles in flight, contrast measurements of small, weak objects on a warm background, and uniformity measurements of radiant emittance from an object's surface. Previous instruments were unable to make such measurements since i) the process of contrast measurement with a fixed field-of-view radiometer is too slow for implementation on flying objects, and ii) detection of a small difference between two large DC signals is impossible in a traditional fixed field-of-view radiometer when the instrument itself is saturated. The newly developed BDR provides a suitable answer for these applications. Its key design concept is the following: the radiometer has two fields of view. These two FOVs are alternated in real time and, using lock-in amplification techniques, the difference between the two FOV signals is detected. This difference is proportional to the contrast between the two FOVs, while the common signal from the two FOVs is rejected.
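The alternating-FOV principle can be illustrated numerically: chopping between the two fields of view and demodulating against a synchronous ±1 reference averages out the large common-mode signal and leaves a term proportional to the difference. The sketch below is a toy demonstration of that principle, not the BDR's actual electronics (the constant-signal model and function name are assumptions):

```python
def lockin_difference(sig_a, sig_b, n_half_cycles=200):
    """Demodulate an alternating two-FOV signal against a +/-1 reference.

    Returns (sig_a - sig_b) / 2 for steady inputs; the common DC level
    of the two fields of view cancels and never saturates the output.
    """
    acc = 0.0
    for k in range(n_half_cycles):
        sample = sig_a if k % 2 == 0 else sig_b   # chopper selects FOV A or B
        ref = 1.0 if k % 2 == 0 else -1.0         # synchronous reference
        acc += sample * ref
    return acc / n_half_cycles
```

For example, two fields of view at 100.0 and 100.002 units yield an output of -0.001: a contrast of 2 parts in 10^5 is recovered even though each raw signal is five orders of magnitude larger.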
A multi-channel solar radiometer system called the Atmospheric Optical Calibration System (AOCS) has been developed at the Solar Energy Research Institute to perform real-time measurements of the optical transmittance properties of the atmosphere at selected wavelengths within the solar spectral regime (0.3 to 3 μm). These measurements are analyzed in conjunction with a simple spectral solar radiation model to calculate real-time solar spectral irradiance values. The AOCS will be used to relate existing atmospheric optical and solar radiation conditions to selected standard conditions, so that decisions can be made regarding the outdoor testing of solar energy conversion devices (i.e., photovoltaic devices/cells). The concept for this instrument was originally described by Hulstrom and Cannon.
A large number of infrared target simulators from automatic test sets must be calibrated annually. This paper describes the apparatus and the techniques used to achieve this capability. The target simulators must be adjusted to produce the required levels of irradiance. The calibration is performed using a radiometer with two narrow-band spectral filters, each calibrated for irradiance responsivity. An estimate is made of the total attenuation of the target simulator optics. From this, a temperature is calculated that is required to produce the test level of irradiance in the spectral response band of the unit under test. These calculations are built into tables of calibration for the radiometer for ease of use by operators unfamiliar with infrared measurements. Also discussed is the error analysis of this measurement process. The error of interest is the error in setting the irradiance to the required test levels, which is larger than just the error of measuring the irradiance. Specialized measurements were made to determine the magnitude of several error sources that are not calculable.
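Working back from a required in-band irradiance to a source temperature, given an estimate of the optics' attenuation, amounts to inverting an in-band Planck integral. The following sketch shows one way such a table entry could be computed; the numerical scheme (trapezoidal Planck integration plus bisection) and the function names are illustrative assumptions, not the paper's actual procedure:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def band_radiance(T, lam1, lam2, n=500):
    """In-band blackbody spectral radiance integral [W m^-2 sr^-1],
    trapezoidal rule over wavelength in metres."""
    def planck(lam):
        return 2.0 * H * C**2 / (lam**5 * (math.exp(H * C / (lam * KB * T)) - 1.0))
    dlam = (lam2 - lam1) / n
    s = 0.5 * (planck(lam1) + planck(lam2)) + sum(
        planck(lam1 + i * dlam) for i in range(1, n))
    return s * dlam

def source_temperature(target_level, lam1, lam2, tau, lo=200.0, hi=1500.0):
    """Bisect for the blackbody temperature whose in-band radiance, after the
    estimated optics transmittance tau, matches the required level."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if tau * band_radiance(mid, lam1, lam2) < target_level:
            lo = mid       # in-band radiance grows monotonically with T
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because in-band radiance is monotonic in temperature, the bisection always converges, and tabulating its output over the test levels gives exactly the kind of operator lookup table the abstract describes.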
Spectroradiometric measurements of the ultraviolet output of a GE F404 aircraft engine were made over the wavelength range of 200 to 320 nm. The tests were conducted at the GE Lynn, Mass. Riverworks facility in the F404 ram cell. The severe environmental conditions associated with the test cell required a special acoustical noise-proof and mechanical shock-proof enclosure for the double monochromator and UV detectors along with special long cabling to the externally located radiometer and automatic data reduction system. The tests successfully provided spectral irradiance measurements of the afterburner over the 225-320 nm wavelength range with a UV-enhanced silicon detector and over the 200-260 nm range with a PMT detector.
From the Earth Radiation Budget Satellite (ERBS) and the National Oceanic and Atmospheric Administration NOAA 9 and 10 spacecraft platforms, the NASA Earth Radiation Budget Experiment (ERBE) is making absolute measurements of the incoming solar, shortwave Earth-reflected solar, and longwave Earth-emitted fluxes using scanning and nonscanning radiometers. Each of the three spacecraft platforms carries narrow field-of-view (FOV) scanning shortwave, longwave, and total radiometers which measure the radiation in the broadband spectral regions from 0.2 to 5.0 microns, from 5.0 to 50.0 microns, and from 0.2 to 50.0 microns, respectively. The radiometers' detection sensors are thermistor bolometers, coated with a black paint of very high absorptance. In a laboratory vacuum environment, the gains of the total and longwave radiometers were characterized with uncertainties approaching 1% using the master reference blackbody (MRBB), which is tied to the International Practical Temperature Scale of 1968. Also in vacuum, the gains of the shortwave radiometers were characterized using a 50.8-cm diameter integrating sphere. The sphere was calibrated absolutely using the total and longwave radiometers as transfer standards from the MRBB. The calibration models and approaches which were used to characterize the output signals of the scanning radiometers are described in this paper. Brief descriptions of the scanners and flight calibration systems are presented.
Proc. SPIE 1109, An Improved Electrothermal Model For The ERBE Nonscanning Radiometer: Comparison Of Predicted And Measured Behavior During Solar Observations, 0000 (26 September 1989); doi: 10.1117/12.960722
An improved dynamic electrothermal model of the ERBE total, nonscanning channels has been formulated and implemented as a computer program. This model, which is a modification of an earlier model, is used to simulate two types of solar observation: those obtained through the solar port during solar calibration, and those obtained during the satellite pitchover maneuver in which the sun is observed by the radiometer while the latter is in its Earth-viewing configuration. New results of both simulations are compared with actual flight data. These results show an improved agreement between the simulated and observed radiometer response over previous simulations. The improvement in these severe cases justifies the modification to the model and establishes its accuracy. Thermal noise has also been studied, using a separate model, to evaluate its contribution to the radiative energy absorbed by the active cavity. This study has revealed that scattering of the collimated solar radiation contributes, on average, 0.071 mW during solar calibration, and 0.207 mW during the pitchover maneuver. On the other hand, the maximum amounts of diffuse power due to emission from the field-of-view (FOV) limiter and the aperture plate are, respectively, 0.120 and 0.011 mW, which amount to 0.270 and 0.011 percent of the peak power that enters the cavity (≈45 mW). Finally, the cavity self-contamination contributes only 0.034 mW, or 0.071 percent of the peak power absorbed by the active cavity radiometer. This study confirms the assumption that, due to the geometry of the radiometer assembly and the optical properties of its components, thermal noise is well within the range of previous estimates.
The Shuttle Solar Backscatter Ultraviolet Spectrometer (SSBUV) has been calibrated for its first Shuttle flight, currently scheduled for Autumn, 1989. The purpose of the SSBUV instrument is to provide regular in-orbit calibration checks of the SBUV/2 ozone monitoring instruments being flown routinely on NOAA satellites. The in-orbit calibration transfer will be accomplished by comparing the observations of the Shuttle and satellite instruments. The observables are the solar irradiance and the backscattered terrestrial radiance in the wavelength region between 252 and 340 nm. The Shuttle instrument is carefully calibrated before and after each flight. The long-term ozone monitoring program requires reduction of uncharacterized drifts in the satellite instruments to a value less than the expected ozone trend at the 95% confidence level. This translates to a requirement that the SSBUV be calibrated to a one-sigma precision of 1% from one flight to the next. A detailed SSBUV calibration plan establishes procedures for meeting this requirement. Radiometric standards provided by the National Institute of Standards and Technology (NIST) are utilized to determine instrument response and stability. A hierarchy of standards is employed to provide redundancy and minimize biases. Laboratory calibration fixtures were designed to minimize set-up induced systematic errors. Calibration procedures and results are discussed. The initial tests on the accuracy and precision of the wavelength calibration, radiometric linearity, and irradiance and radiance response suggest that the 1% calibration precision requirement can be achieved.
A total radiation thermometer (comprising a cryogenic radiometer which measures the total radiation emitted from a black-body cavity) has been developed at NPL as an alternative instrument to the traditional gas thermometer for measuring thermodynamic temperature. The total radiant exitances, E(T) and E(T0), of a black body at temperatures T and T0 respectively, can be expressed as a ratio E(T)/E(T0) = T^4/T0^4, where T0 = 273.16 K, the temperature of the triple point of water. Hence, by measuring the ratio E(T)/E(T0) a value of T can be determined. The uncertainty in T using this thermometer is about 1 mK, equivalent to measuring the black-body radiation with an uncertainty of about 1.5 parts in 10^5. If at the same time the black-body temperature is measured by a platinum resistance thermometer calibrated on IPTS-68, a value of T68 can be realised, and differences T-T68 can be deduced. In this paper values of T-T68 will be presented in the temperature range -130 to +110°C and compared to recent gas thermometry in the same range. This will be followed by a discussion of the changes that are being made to the apparatus so that the temperature range can be extended up to 460°C: in particular, (a) modifying the radiometer so that it is possible to measure the increased ratio E(T)/E(T0), and (b) the development of a new black body of unusual design.
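The ratio relation above inverts directly, T = T0 * (E(T)/E(T0))^(1/4), and differentiating gives dT/T = (1/4) * dE/E, so a radiometric uncertainty of 1.5 parts in 10^5 near T0 corresponds to roughly 1 mK, consistent with the figure quoted. A short worked check (the function names are illustrative, not NPL's software):

```python
T0 = 273.16  # triple point of water, K

def temperature_from_ratio(ratio):
    """Thermodynamic temperature from a measured exitance ratio E(T)/E(T0)."""
    return T0 * ratio ** 0.25

def temperature_uncertainty(T, fractional_exitance_uncertainty):
    """Propagated uncertainty: dT = (1/4) * T * (dE/E), from T ~ E^(1/4)."""
    return 0.25 * T * fractional_exitance_uncertainty
```

For example, a measured ratio of 16 corresponds to T = 2 * T0 = 546.32 K, and a 1.5e-5 fractional exitance uncertainty at T0 propagates to about 0.001 K.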
Two cryogenic radiometers (electrical substitution radiometers cooled to liquid helium temperatures) have been developed at NPL. The first was used to realise thermodynamic temperature and to determine the Stefan-Boltzmann constant. The latter measurement confirmed the calculation that the radiometer is an absolute instrument with an uncertainty of 0.02%. The second radiometer was dedicated to optical radiation measurements with a calculated uncertainty of 0.005%. This paper describes and presents results of the intercomparison of the two radiometers by comparing the spectral responsivity of silicon photodiodes as independently determined by the radiometers.
A convenient cryogenic electrical substitution radiometer has been developed for laser radiometry. The radiometer system consists of a side-viewing active cavity receiver of 10 second time constant, a table-top liquid helium cryostat with a Brewster-angle window port, and two digital temperature controllers interfaced with an AT-class microcomputer. Radiometer measurements of an intensity-stabilized laser beam have been compared to measurements taken with silicon photodiodes self-calibrated using a new, non-destructive technique. Agreement between these two techniques of light measurement is obtained at the 0.05% level. The disagreement at the 0.05% level may be significant, and we put forward several possible explanations for future investigation.
The current status of silicon self-calibration and its applications are reviewed, including the results of a number of intercomparisons that establish the suitability of self-calibration for high accuracy applications. Some current research directions known to the author are described, and possible future directions are considered.
Recent advances in detector fabrication and passivation technology have led to the realization of high-performance germanium photodiodes with improved reliability suitable for use in optical radiometric measurement applications. Among the most significant accomplishments are the development of planar devices via ion implantation; the effective use of activation anneal and damage removal schemes; and the successful application of plasma-enhanced chemical deposition of oxynitride films in surface passivation. Several key parameters are used to measure the enhancement of the radiometric quality of the device. These include spectral external and internal quantum efficiency, long-term stability of the responsivity and the dark current, and spatial uniformity across the active area. Finally, experimental results are compared with theoretical calculations leading to recommendations for further optimization of germanium photodiode performance.
Neutron fluence testing of large dimension electrical components at cryogenic temperatures has traditionally been performed at fast burst reactors. A new facility, using Californium-252 (252Cf) as a neutron radiation source, was designed and built at the IRT Corporate Headquarters in San Diego. This paper discusses the physical properties of the californium isotope, describes the facility and its test capabilities, and presents the results of the 1 MeV calibration as well as IR and visible sensor testing.