Lens-free holographic microscopy (LHM) is a promising imaging technique for life science and industrial applications, yet miniaturizing the system and reducing its cost without compromising imaging performance remains challenging for field applications in low-resource settings. We demonstrate a cost-effective LHM system that requires no precision optical or mechanical parts (such as lenses, beam splitters, or kinematic stages) and relies solely on robust optoelectronic hardware-software co-design for high-performance imaging. The compact and lightweight form factor is achieved through integration of the light sources, the image sensor, and all control electronics with automated calibration and multiwavelength reconstruction algorithms. Amplitude and phase images of a sample can be reconstructed in a few seconds with micron-level optical resolution over a field of view of 16.5 mm². The method offers a portable and scalable solution for microscopic imaging applications.
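The holographic reconstruction mentioned above typically builds on free-space back-propagation of the recorded intensity hologram. As an illustrative sketch (not the authors' implementation), a standard angular-spectrum back-propagation step in NumPy could look like:

```python
import numpy as np

def angular_spectrum_propagate(hologram, wavelength, z, pixel_pitch):
    """Back-propagate a recorded intensity hologram by distance z (meters)
    using the angular-spectrum method, a common LHM reconstruction step.
    All parameter names here are illustrative, not from the paper."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    field = np.fft.ifft2(np.fft.fft2(np.sqrt(hologram.astype(complex))) * H)
    return field  # complex field: np.abs() gives amplitude, np.angle() gives phase
```

Multiwavelength reconstruction repeats this propagation for each illumination wavelength and combines the resulting fields to suppress the twin image.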
This paper presents a system-level analysis of a sensor capable of simultaneously acquiring the standard absorption-based RGB color channels (400–700 nm, ~75 nm FWHM) as well as an additional NIR channel (central wavelength ~808 nm, FWHM ~30 nm for collimated light). Parallel acquisition of RGB and NIR information on the same CMOS image sensor is enabled by monolithic pixel-level integration of both a NIR-pass thin-film filter and NIR-blocking filters for the RGB channels. This removes the need for a camera-level NIR-blocking filter to suppress the NIR leakage that standard RGB absorption filters exhibit from roughly 700 to 1000 nm; such a camera-level filter would prevent acquisition of the NIR channel on the same sensor. Thin-film filters do not operate in isolation: their performance is influenced by the system context in which they operate. The spectral distribution of light arriving at the photodiode is shaped, among other factors, by the illumination spectrum, the transmission characteristics of the optical components, and the sensor quantum efficiency (QE). For example, a known low QE of the CMOS image sensor above 800 nm may relax the filter's blocking requirements and simplify the filter structure. Similarly, the angular distribution of the incoming light, set by the objective lens's F/# and exit pupil location, can be taken into account during thin-film optimization. This paper demonstrates how knowledge of the application context can facilitate filter design and relax design trade-offs, and presents experimental results.
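The system-context argument can be made concrete with a toy spectral-response calculation. Every curve below is a hypothetical stand-in (flat illuminant, Gaussian red passband, constant NIR leakage, linearly falling QE), chosen only to show how the factors multiply along the optical chain:

```python
import numpy as np

# Common wavelength grid in nm (illustrative resolution).
wl = np.arange(400, 1001, 1)

def gaussian_band(center, fwhm):
    """Idealized Gaussian filter passband."""
    sigma = fwhm / 2.3548  # FWHM -> standard deviation
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

# Assumed component models (illustrative, not measured data):
illumination = np.ones_like(wl, dtype=float)       # flat illuminant
red_filter   = gaussian_band(620, 75)              # red absorption passband
nir_leakage  = 0.8 * (wl > 700)                    # leakage above ~700 nm
qe           = np.clip(1.2 - wl / 1000.0, 0, 1)    # QE falling toward the NIR

# Effective red-channel response is the product of all system factors.
response = illumination * (red_filter + nir_leakage) * qe

# Fraction of the red channel's signal contributed by NIR leakage:
nir_fraction = response[wl > 700].sum() / response.sum()
```

Because the falling QE already attenuates the leakage band, the residual blocking the thin-film filter must provide is smaller than the leakage curve alone would suggest, which is exactly the design relaxation the abstract describes.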
This paper presents multispectral active gated imaging for the transportation and security fields. Active gated imaging is based on a fast gated camera and a pulsed illuminator, synchronized in the time domain to provide range-based images. We have developed a multispectral pattern deposited on a gated CMOS image sensor (CIS), paired with a pulsed near-infrared VCSEL module. This paper covers the component-level description of the multispectral gated CIS, including the camera and illuminator units. Furthermore, the design considerations and characterization results of the spectral filters are presented together with a newly developed image-processing method.
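The time-domain synchronization behind range gating reduces to round-trip photon timing: the camera gate opens after the pulse has traveled to the near edge of the desired depth slice and back. A minimal sketch, with illustrative function and parameter names (not from the paper):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_delay_for_range(r_min, r_max):
    """Pulse/gate timing for imaging a depth slice [r_min, r_max] in meters.
    Returns (delay_s, gate_width_s): the camera gate opens delay_s after the
    illuminator pulse is emitted and stays open for gate_width_s, so only
    photons reflected from the chosen range slice are integrated."""
    delay = 2.0 * r_min / C           # round trip to the near edge
    width = 2.0 * (r_max - r_min) / C # round-trip extent of the slice
    return delay, width
```

For example, a 150–300 m slice requires a gate delay and width of about one microsecond each, which is why nanosecond-scale gating electronics are needed for fine range resolution.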
Traditional spectral imaging cameras typically operate as pushbroom cameras by scanning a scene. This approach makes
such cameras well-suited for high spatial and spectral resolution scanning applications, such as remote sensing and
machine vision, but ill-suited for 2D scenes with free movement. This limitation can be overcome by single-frame
multispectral (here called snapshot) acquisition, in which an entire three-dimensional multispectral data cube is sensed at
one discrete point in time and multiplexed onto a 2D sensor.
Our snapshot multispectral imager is based on optical filters monolithically integrated on CMOS image sensors with
large layout flexibility. Using this flexibility, the filters are positioned on the sensor in a tiled layout, allowing trade-offs
between spatial and spectral resolution. At system-level, the filter layout is complemented by an optical sub-system
which duplicates the scene onto each filter tile. This optical sub-system and the tiled filter layout lead to a simple
mapping of 3D spectral cube data on the sensor, facilitating simple cube assembly. Therefore, the required image
processing consists of simple and highly parallelizable algorithms for reflectance and cube assembly, enabling real-time
acquisition of dynamic 2D scenes at low latencies. Moreover, through the use of monolithically integrated optical filters
the multispectral imager achieves the qualities of compactness, low cost and high acquisition speed, further
differentiating it from other snapshot spectral cameras. Our prototype camera can acquire multispectral image cubes of
256×256 pixels over 32 bands in the spectral range of 600–1000 nm at 340 cubes per second under normal illumination.
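The simple mapping of the 3D spectral cube onto the tiled sensor can be illustrated with a reshape-based cube assembly. The tile-grid geometry below is an assumption for illustration, not the prototype's actual layout:

```python
import numpy as np

def assemble_cube(frame, tiles_y, tiles_x):
    """Assemble a multispectral cube from a single sensor frame carrying a
    tiled filter layout: the optical sub-system duplicates the scene onto
    each tile, and each tile sees one spectral band. The tile grid shape
    (tiles_y x tiles_x) is a hypothetical parameter for this sketch."""
    H, W = frame.shape
    th, tw = H // tiles_y, W // tiles_x  # per-tile (per-band) image size
    cube = (frame[:tiles_y * th, :tiles_x * tw]
            .reshape(tiles_y, th, tiles_x, tw)  # split rows/cols into tiles
            .transpose(0, 2, 1, 3)              # group the two tile indices
            .reshape(tiles_y * tiles_x, th, tw))
    return cube  # shape: (bands, tile_height, tile_width)
```

Because the assembly is a pure memory rearrangement with no interpolation, it is trivially parallelizable, which is consistent with the low-latency, real-time acquisition described above.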