High-fidelity color image acquisition requires an accurate characterization of the camera's spectral sensitivity
curves to perform color calibration or spectral estimation. Several methods have been proposed to perform this
task; these include characterizations via test charts or narrowband filters, as well as methods utilizing a monochromator.
In most publications, RGB cameras are characterized. In this paper, we describe the characterization of the
spectral sensitivity curves of a multispectral camera featuring seven optical bandpass filters. We show two
different methods for the calibration using a monochromator: either by measuring the grayscale sensor of
the camera and the filters separately, or by characterizing the multispectral camera as a complete system. A
comparison of both methods validates the measurement results. We furthermore develop different reconstruction
methods (maximum value method, principal eigenvector method, linear or Wiener estimation). We also perform
simulations of the characterization process to evaluate the methods and show the impact of the bandwidth of
the monochromator stimuli on the reconstruction.
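The Wiener estimation mentioned above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual setup: the monochromator stimuli are modeled as Gaussian bandpasses, the "true" sensitivity curve is a Gaussian, and the prior and noise covariances are assumed (first-order Markov smoothness prior, white noise).

```python
import numpy as np

# Assumed wavelength grid and monochromator stimuli (Gaussian bandpasses).
wl = np.arange(400, 701, 5)                       # wavelength grid [nm]
centers = np.arange(400, 701, 10)                 # stimulus center wavelengths [nm]
fwhm = 10.0                                       # assumed monochromator bandwidth [nm]
sigma = fwhm / 2.355
# Stimulus matrix M: each row is one monochromator stimulus spectrum.
M = np.exp(-0.5 * ((wl[None, :] - centers[:, None]) / sigma) ** 2)

# Synthetic "true" sensitivity curve of one channel (for demonstration only).
s_true = np.exp(-0.5 * ((wl - 550.0) / 20.0) ** 2)

# Camera responses to each stimulus, with additive sensor noise.
rng = np.random.default_rng(0)
r = M @ s_true + rng.normal(0.0, 1e-3, size=centers.size)

# Wiener estimation: s_hat = K_s M^T (M K_s M^T + K_n)^(-1) r,
# with a smoothness prior K_s and a noise covariance K_n (both assumed).
rho = 0.98
K_s = rho ** np.abs(np.subtract.outer(np.arange(wl.size), np.arange(wl.size)))
K_n = 1e-6 * np.eye(centers.size)
s_hat = K_s @ M.T @ np.linalg.solve(M @ K_s @ M.T + K_n, r)

print(np.corrcoef(s_true, s_hat)[0, 1])           # close to 1 for low noise
```

Narrowing the stimulus bandwidth (smaller `fwhm`) makes `M` approach a sampling matrix, which is one way to study the bandwidth's impact on the reconstruction in simulation.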
Conventional point spread function (PSF) measurement methods often use parametric models for the estimation
of the PSF. This limits the shape of the PSF to a specific form provided by the model. However, there are
unconventional imaging systems like multispectral cameras with optical bandpass filters, which produce, e.g., an
asymmetric PSF. To estimate such PSFs, we have developed a new measurement method utilizing a random noise
test target with markers: After acquisition of this target, a synthetic prototype of the test target is geometrically
transformed to match the acquired image with respect to its geometric alignment. This allows us to estimate the
PSF by direct comparison between prototype and image. The noise target allows us to evaluate all frequencies
due to the approximately "white" spectrum of the test target - we are not limited to a specifically shaped PSF.
The registration of the prototype pattern gives us the opportunity to take the specific spectrum into account
rather than just a "white" spectrum, an assumption that might be weak in small image regions. Based on the PSF
measurement, we perform a deconvolution. We present comprehensive results for the PSF estimation using our
multispectral camera and provide deconvolution results.
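The core idea of estimating the PSF by direct comparison between a registered prototype and the acquired image can be sketched in the frequency domain. This is a simplified synthetic example, not the paper's pipeline: registration is assumed perfect, convolution is circular, and the regularization constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
proto = rng.random((64, 64))                      # registered noise-target prototype

# Hypothetical asymmetric PSF, e.g. as caused by a tilted bandpass filter.
psf = np.zeros((64, 64))
psf[0, 0], psf[0, 1], psf[1, 0], psf[0, 2] = 0.55, 0.25, 0.15, 0.05

# Simulated acquisition: circular convolution of the prototype with the PSF.
img = np.real(np.fft.ifft2(np.fft.fft2(proto) * np.fft.fft2(psf)))

# PSF estimate by regularized spectral division; the noise target excites
# all frequencies, so the division is well conditioned.
eps = 1e-8
P, I = np.fft.fft2(proto), np.fft.fft2(img)
H_est = I * np.conj(P) / (np.abs(P) ** 2 + eps)
psf_est = np.real(np.fft.ifft2(H_est))

# Wiener deconvolution of the image with the estimated transfer function.
nsr = 1e-3                                        # assumed noise-to-signal ratio
W = np.conj(H_est) / (np.abs(H_est) ** 2 + nsr)
restored = np.real(np.fft.ifft2(I * W))

print(np.max(np.abs(psf_est - psf)))              # small estimation error
```

Because no parametric model constrains `psf_est`, the asymmetric shape is recovered directly, which is exactly the property the noise-target method exploits.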
Optical bandpass filters play a decisive role in multispectral imaging. Various multispectral cameras use this
type of color filter for the sequential acquisition of different spectral bands. Practically unavoidable small tilt
angles of the filters with respect to the optical axis influence the imaging process: First, by tilting the filter,
the center wavelength of the filter is shifted, causing color variations. Second, due to refraction in the filter,
the image is distorted geometrically depending on the tilt angle. Third, reflections between sensor and filter
glass may cause ghosting, i.e., a weak and shifted copy of the image, which also depends on the filter angle. A
method to measure the filter position parameters from multispectral color components is thus highly desirable.
We propose a method to determine the angle and position of an optical filter brought into the optical path
in, e.g., filter-wheel multispectral cameras, with respect to the camera coordinate system. We determine the
position and angle of the filter by presenting a calibration chart to the camera, which is always partly reflected
by the highly reflective optical bandpass filter. The extrinsic parameters of the original and mirrored chart can
then be estimated. We derive the angle and position of the filter from the coordinates of the charts. We compare
the results of the angle measurements to a ground truth obtained from the settings of a high-precision rotation
table and thus validate our measurement method. Furthermore, we simulate the refraction effect of the optical
filter and show that the results match reality quite well, thus further confirming our method.
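The geometric distortion caused by a tilted plane-parallel filter can be illustrated with the classic lateral-shift formula. This is textbook ray optics under assumed parameters (thickness, refractive index), not the paper's full simulation.

```python
import math

def plate_lateral_shift(theta_i_deg, t_mm, n):
    """Lateral ray displacement caused by a tilted plane-parallel plate.

    A ray hitting a plate of thickness t at incidence angle theta_i is
    shifted sideways by d = t * sin(theta_i - theta_t) / cos(theta_t),
    with Snell's law sin(theta_i) = n * sin(theta_t).
    """
    theta_i = math.radians(theta_i_deg)
    theta_t = math.asin(math.sin(theta_i) / n)    # refraction angle in the glass
    return t_mm * math.sin(theta_i - theta_t) / math.cos(theta_t)

# Example: 1 mm filter glass (n = 1.52, assumed) tilted by 5 degrees.
print(plate_lateral_shift(5.0, 1.0, 1.52))       # shift in mm, on the order of 0.03
```

Even a small tilt therefore displaces the image by a noticeable fraction of a pixel pitch, which is why measuring the filter angle matters for registering the spectral channels.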
Capturing natural scenes with high dynamic range content using conventional RGB cameras generally results
in saturated and underexposed, and therefore compromised, image areas. Furthermore, the image lacks color
accuracy due to a systematic color error of the RGB color filters. The problem of the limited dynamic range
of the camera has been addressed by high dynamic range imaging<sup>1, 2</sup> (HDRI): Several RGB images of different
exposures are combined into one image with greater dynamic range. Color accuracy on the other hand can be
greatly improved using multispectral cameras,<sup>3</sup> which more accurately sample the electromagnetic spectrum.
We present a promising combination of both technologies, a high dynamic range multispectral camera featuring
a higher color accuracy, an improved signal-to-noise ratio and a greater dynamic range compared to a similar low
dynamic range camera.
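The exposure-fusion step of HDRI can be sketched as follows. This is a minimal illustration assuming a linear sensor and a simple hat-shaped pixel weighting, not the specific pipeline of the presented camera; the saturation and noise-floor thresholds are assumptions.

```python
import numpy as np

def fuse_exposures(frames, times, sat=0.98, dark=0.02):
    """Fuse differently exposed frames of one channel into a radiance map.

    Each frame is normalized by its exposure time; a hat weighting gives
    zero weight to pixels near saturation (sat) or the noise floor (dark).
    """
    frames = np.asarray(frames, dtype=float)      # shape (n_exposures, H, W)
    times = np.asarray(times, dtype=float)
    w = np.clip(np.minimum(frames - dark, sat - frames), 0.0, None)
    num = (w * frames / times[:, None, None]).sum(axis=0)
    w_sum = w.sum(axis=0)
    return np.where(w_sum > 0, num / np.maximum(w_sum, 1e-12), 0.0)

# Usage: two simulated exposures of one scene (exposure times 1 and 0.25).
scene = np.array([[0.1, 0.5], [1.0, 3.0]])        # "true" radiance
t = np.array([1.0, 0.25])
shots = [np.clip(scene * ti, 0.0, 1.0) for ti in t]
print(fuse_exposures(shots, t))                   # recovers the scene radiance
```

The bright pixel that saturates in the long exposure is recovered from the short one, extending the dynamic range beyond that of any single frame; in a multispectral camera, this fusion is applied per spectral channel.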