Radiometry is an essential part of the optical design of almost every optical instrument. Such instruments are usually used to focus and detect radiation for some particular purpose, and for many applications it is essential to know how much radiation reaches the detector array or film in the image plane, and the value of the resulting signal-to-noise ratio or exposure. Radiometry is also essential in another sense: the measurement of the radiation from various objects. Indeed, the word "radiometry" itself means the measurement of radiation. One cannot make the above-mentioned calculations without knowledge of the flux from the source, whether it is a tungsten bulb or a sun-illuminated vista. Therefore, this text on radiometry covers both the techniques of calculating radiative transfer and the measurement of fluxes and radiometric properties of various kinds.
The most primitive beginnings of radiometry must have been early man's observation of the differing brightnesses of the stars and the sensing of warmth from the sun and from fire (after it was discovered or invented). These were radiometric measurements, but they surely were not quantitative. Greek astronomers, especially Hipparchus and Ptolemy, made good estimates of star magnitudes, and these were later extended by Galileo.
The history of quantitative radiometry surely begins with the practice of photometry, the measurement of visible light. It was first put on an organized basis by Pierre Bouguer in 1729, when he described an instrument that could compare the brightnesses of two sources.