8.1 Balancing Optics and Sensor Resolution in the Imaging Chain

It is important at this point to look at the combined effect of the optics and the sensor on the image quality produced by the camera (Fig. 8.1). We have shown that the optics and the digital sensor each impose a fundamental limit on the detail that can be captured by a digital camera: the optics spreads each point from the scene into a blurred spot, and the detector array divides and samples the scene into pixels. The Q of a digital camera defines how these two limitations are balanced in the camera design. Understanding these two effects, and how they can be managed in the imaging chain, is essential to understanding the image quality.

8.2 Spatial Resolution

The spatial resolution is defined as the smallest separation between two objects in the scene at which they can still be resolved as two separate objects in the image (Fig. 8.2). The most common metric for spatial resolution is the Rayleigh criterion, proposed by Lord Rayleigh in 1879 and based on the diffraction PSF of light from a clear circular aperture. The Rayleigh criterion states that two points are just resolvable when one point lies on the first zero of the Airy disk produced by the second point (Fig. 8.3), i.e., the points in the image plane are separated by

    Δx = 1.22 λ (f/#),    (8.1)

where λ is the wavelength of the light and f/# is the f-number of the optics.
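As a quick numerical sketch of the Rayleigh criterion, the separation given by Eq. (8.1) can be computed directly. The wavelength (550 nm, mid-visible green) and f-number (f/4) below are illustrative assumptions chosen for the example, not values taken from the text:

```python
# Sketch: Rayleigh criterion separation at the image plane, Eq. (8.1).
# The wavelength and f-number used in the example are illustrative
# assumptions, not values from the chapter.

def rayleigh_separation(wavelength_m: float, f_number: float) -> float:
    """Smallest resolvable separation dx = 1.22 * lambda * (f/#), in meters."""
    return 1.22 * wavelength_m * f_number

# Example: green light (550 nm) through an f/4 lens.
dx = rayleigh_separation(550e-9, 4.0)
print(f"Rayleigh separation: {dx * 1e6:.2f} um")  # → 2.68 um
```

Comparing this diffraction-limited spot separation against the sensor's pixel pitch is exactly the balance that the camera's Q describes.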