In 1800 William Herschel discovered infrared radiation using a thermometer as the first infrared (IR) detector. In his experiments, a prism was used to refract sunlight. A thermometer placed just beyond the red edge of the spectrum indicated a higher temperature than in the rest of the room. Early IR detectors exploited the Seebeck thermoelectric effect, used in the first thermocouple devices. The origins of modern IR detector technology can be traced to the 20th century, during World War II, when photon detectors were developed. Since World War II, IR detector development has been, and continues to be, primarily driven by military applications, although in the last few decades its use in civilian fields such as medicine, quality control, anti-threat systems, and industrial processes, among others, has grown substantially. This diversity of applications, together with advances in semiconductor science and fabrication processes, has led to cost-effective devices and systems, bringing IR technology into daily life. When a new system is brought to market today, the design specifications often consider "dual deployment," targeting both civil and military applications. Optical detectors are used as components in electro-optical sensor systems (see Chapter 1). The broader IR technology field concerns itself with how a heated source radiates energy, how this radiation propagates through a medium, how it interacts with matter, and finally, how it is detected. This chapter provides an introduction to IR detectors. The focus is on concepts and principles rather than on specific detector materials or technologies. The classical, first-order theory presented here is suitable for basic understanding but does not cover advanced concepts or secondary effects. Starting with the physics of light absorption, the focus shifts to detector types, noise, thermal detectors, and photon detectors.
