Analysis of multiple factors affecting the uncertainty of the absolute spectral responsivity of optical radiation detectors is
presented. This includes both the validation of the radiometric scale of the infrared reference detectors and the scale
transfer process to the unit under test. Reference detectors include a low NEP pyroelectric detector, an InSb detector, and
a sphere-input extended InGaAs detector. While all three types of reference detectors were calibrated independently, less
than 0.5 % mismatch of spectral responsivities was observed in the spectrally overlapping regions. We provide a
performance evaluation of the NIST IR Detector Calibration Facility, which was designed for testing optical radiation
detectors in both radiant power and irradiance measurement modes. The facility uses a high-throughput monochromator with interchangeable diffraction gratings. Depending on the spectral range, either a blackbody at 1100 °C or a quartz-halogen lamp with a long-term relative output variation of about 10⁻⁴ was used as the radiation source. In order to
minimize the uncertainty of the calibration data, specific attention was given to the profile of the incident beam, the precise positioning of the detectors, and the influence of atmospheric absorption. In addition to spectral responsivity calibration, the facility allows precise mapping of the detector active area to determine the spatial non-uniformity of response.
Typical achievable calibration uncertainties are about 1 % and 2.5 % (k = 2) in the radiant power and irradiance modes, respectively. Examples of responsivity calibrations of different detectors are presented.
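
A minimal sketch of the implied transfer and uncertainty relations, assuming the conventional detector-substitution approach (the symbols s, V, u_i and the DUT/ref subscripts are assumed notation rather than the paper's own):

\[
s_{\mathrm{DUT}}(\lambda) \approx s_{\mathrm{ref}}(\lambda)\,\frac{V_{\mathrm{DUT}}(\lambda)}{V_{\mathrm{ref}}(\lambda)},
\qquad
U = k\,u_{c} = k\sqrt{\sum_i u_i^{2}},
\]

where s_ref(λ) is the calibrated reference responsivity, V_DUT(λ) and V_ref(λ) are the signals of the unit under test and the reference detector measured alternately in the same monochromator beam, the u_i are the individual relative uncertainty components (e.g., beam profile, detector positioning, atmospheric absorption, source stability), and k = 2 is the coverage factor quoted above.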