Lens-free imaging (LFI) has become an important microscopy tool in many life science and industrial applications. Because it uses no optical lenses (such as objectives) and therefore suffers none of the accompanying lens aberrations (such as chromatic aberration), the LFI modality is well suited for optical inspection of microscopic objects over a wide spectral range. However, the relatively restricted spectral sensitivity of CMOS imagers, from the visible (~400 nm) up to the near-infrared (~900 nm), limits the wide spectral use of the technique. Many microscopic samples contain valuable information in the short-wave infrared (SWIR) in addition to the visible (VIS) and near-infrared (NIR). With the recent emergence of cost-effective image sensor technologies with high quantum efficiency in the SWIR, such as quantum-dot and graphene-based image sensors, new opportunities are emerging for wideband, high-throughput lens-free microscopy. We demonstrate for the first time an LFI system based on a quantum-dot image sensor, capable of operating in both the visible and short-wave infrared range. Holograms of the samples are obtained with multiple partially coherent illumination sources spanning the visible and short-wave infrared (405 nm to 1550 nm), and the captured holograms are reconstructed to obtain in-focus images of the sample. We demonstrate an optical resolution of 3.48 µm over a field of view of 9.6 mm² across the whole spectral range. Our technique removes the need for bulky and expensive achromatic imaging optics and offers significant improvements in cost, field of view, scalability, and optical resolution for microscopic imaging in both the visible and short-wave infrared spectral range with a simple imaging system. We present a performance analysis of the system together with several potential applications and use cases.
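The refocusing step mentioned above, turning a captured hologram into an in-focus image, is commonly done with the angular spectrum method. Below is a minimal NumPy sketch of that standard approach, not the authors' implementation; the function name, wavelength, pixel pitch, and propagation distance are all illustrative placeholders.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, z):
    """Propagate a complex field over distance z (meters) with the
    angular spectrum method; a negative z back-propagates a hologram
    toward the sample plane."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # Evanescent spatial frequencies (arg < 0) are zeroed out.
    H = np.where(
        arg >= 0,
        np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0))),
        0.0,
    )
    return np.fft.ifft2(np.fft.fft2(field) * H)

# The square root of the captured intensity serves as the amplitude estimate;
# a flat dummy hologram stands in for real sensor data here.
hologram = np.ones((256, 256))
recon = angular_spectrum_propagate(np.sqrt(hologram), 532e-9, 1.12e-6, -1.5e-3)
amplitude, phase = np.abs(recon), np.angle(recon)
```

In a multiwavelength system such as the one described, this propagation is simply repeated per illumination wavelength, with `z` refined by an autofocus criterion.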
We report a fast and effective lens-free imaging platform for optical diffraction tomography (ODT). Using single-wavelength illumination from only four angular directions and a lensless in-line holographic imaging setup that directly captures the resulting diffraction patterns, our method reconstructs high-quality 3D images of biological samples at micron-scale resolution across a cubic-millimeter-scale volume with a compact, scalable, and inexpensive system. To achieve this, we developed a compressive tomographic reconstruction algorithm that solves the inverse problem of lens-free ODT by combining Wirtinger derivatives with primal-dual splitting. This fast and inexpensive lens-free tomographic microscopy system provides a promising 3D imaging approach for high-throughput biomedical applications.
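The Wirtinger-calculus ingredient can be illustrated with a toy example: for intensity-only (phaseless) measurements of a linear model, the Wirtinger gradient of the intensity-mismatch loss gives a descent direction even though the loss is not holomorphic. The sketch below shows only that gradient step on random synthetic data; the scattering physics, compressive regularizer, and primal-dual splitting of the actual algorithm are omitted, and every name and parameter is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 128, 32
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(m)
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ x_true) ** 2          # intensity-only (phaseless) measurements

def wirtinger_grad(x):
    """Wirtinger gradient (w.r.t. conj(x)) of the loss
    f(x) = 0.25 * sum((|Ax|^2 - y)^2); it vanishes at x = x_true."""
    Ax = A @ x
    return A.conj().T @ ((np.abs(Ax) ** 2 - y) * Ax)

# Plain gradient descent with a small fixed step, started near the solution
# (a spectral initialization would be used in practice).
x = x_true + 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
for _ in range(200):
    x = x - 1e-4 * wirtinger_grad(x)

residual = np.linalg.norm(np.abs(A @ x) ** 2 - y) / np.linalg.norm(y)
```

In the paper's setting, `A` would be replaced by the diffraction forward model for each of the four illumination angles, and the plain step by the primal-dual iteration.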
Lens-free holographic microscopy (LHM) is a promising imaging technique for life science and industrial applications, yet miniaturizing the system and reducing its cost without compromising imaging performance remain challenging for field applications in low-resource settings. We demonstrate a cost-effective LHM system that requires no precision optical or mechanical parts (such as lenses, beam splitters, or kinematic stages) and relies solely on robust optoelectronic hardware-software co-design for high-performance imaging. Its compact, lightweight form factor is achieved by integrating the light sources, an image sensor, and all control electronics with automated calibration and multiwavelength reconstruction algorithms. Amplitude and phase images of a sample can be reconstructed in a few seconds with micron-level optical resolution over a field of view of 16.5 mm². The method offers a portable and scalable solution for microscopic imaging applications.
This paper presents a system-level analysis of a sensor capable of simultaneously acquiring the standard absorption-based RGB color channels (400-700 nm, ~75 nm FWHM) as well as an additional NIR channel (central wavelength ~808 nm, FWHM ~30 nm in collimated light). Parallel acquisition of RGB and NIR information on the same CMOS image sensor is enabled by monolithic pixel-level integration of both a NIR-pass thin-film filter and NIR-blocking filters for the RGB channels. This removes the need for a standard camera-level NIR-blocking filter to suppress the NIR leakage that standard RGB absorption filters exhibit from roughly 700 to 1000 nm; such a camera-level filter would prevent acquisition of the NIR channel on the same sensor. Thin-film filters do not operate in isolation: their performance is influenced by the system context in which they operate. The spectral distribution of the light arriving at the photodiode is shaped, among other factors, by the illumination spectrum, the transmission characteristics of the optical components, and the sensor quantum efficiency. For example, knowing that the CMOS image sensor has a low quantum efficiency (QE) above 800 nm may reduce the filter's blocking requirements and simplify the filter structure. Similarly, knowledge of the incident light angularity, as set by the objective lens's f-number and exit pupil location, can be taken into account during the thin-film optimization. This paper demonstrates how knowledge of the application context can facilitate filter design and relax design trade-offs, and presents experimental results.
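The system-context argument can be made concrete with a small numerical sketch: multiplying toy spectra for the illuminant, the filter stack, and the sensor QE shows how a low QE above 800 nm already suppresses much of the NIR leak before the thin-film blocker has to. All curves below are invented placeholders, not measured data from the paper.

```python
import numpy as np

wl = np.arange(400, 1001)                       # wavelength grid, nm
illum = np.ones(wl.size)                        # flat illuminant (assumed)
qe = np.where(wl <= 800, 0.50, 0.05)            # toy QE, dropping above 800 nm
red_pass = np.where((wl >= 575) & (wl <= 700), 0.9, 0.0)
nir_leak = np.where(wl > 700, 0.4, 0.0)         # toy leakage of the absorber

# The effective channel response is the pointwise product along the chain:
# illumination x filter transmission x sensor quantum efficiency.
response = illum * (red_pass + nir_leak) * qe
leak_fraction = response[wl > 700].sum() / response.sum()

# With a flat QE instead, more of the signal comes from the leak region,
# so the thin-film filter itself would need to provide stronger blocking.
flat_response = illum * (red_pass + nir_leak) * 0.50
flat_leak = flat_response[wl > 700].sum() / flat_response.sum()
```

The same bookkeeping extends to angular effects: integrating the filter's angle-dependent transmission over the cone set by the lens f-number before optimizing the stack.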
This paper presents multispectral active gated imaging for transportation and security applications. Active gated imaging is based on a fast gated camera and a pulsed illuminator, synchronized in the time domain to provide range-based images. We have developed a multispectral pattern deposited on a gated CMOS image sensor (CIS), combined with a pulsed near-infrared VCSEL module. The paper covers the component-level description of the multispectral gated CIS, including the camera and illuminator units. Furthermore, the design considerations and characterization results of the spectral filters are presented, together with a newly developed image-processing method.
Traditional spectral imaging cameras typically operate as pushbroom cameras, scanning a scene. This approach makes
such cameras well suited for high-spatial- and high-spectral-resolution scanning applications, such as remote sensing and
machine vision, but ill suited for 2D scenes with free movement. This limitation can be overcome by single-frame
multispectral (here called snapshot) acquisition, where an entire three-dimensional multispectral data cube is sensed at
one discrete point in time and multiplexed onto a 2D sensor.
Our snapshot multispectral imager is based on optical filters monolithically integrated on CMOS image sensors with
large layout flexibility. Using this flexibility, the filters are positioned on the sensor in a tiled layout, allowing trade-offs
between spatial and spectral resolution. At the system level, the filter layout is complemented by an optical sub-system
that duplicates the scene onto each filter tile. Together, this optical sub-system and the tiled filter layout yield a simple
mapping of the 3D spectral cube data onto the sensor, facilitating straightforward cube assembly. The required image
processing therefore consists of simple, highly parallelizable algorithms for reflectance computation and cube assembly,
enabling real-time acquisition of dynamic 2D scenes at low latency. Moreover, through the use of monolithically
integrated optical filters, the multispectral imager achieves compactness, low cost, and high acquisition speed, further
differentiating it from other snapshot spectral cameras. Our prototype camera can acquire multispectral image cubes of
256×256 pixels over 32 bands in the spectral range of 600-1000 nm at 340 cubes per second under normal illumination.
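Because the optics duplicate the scene onto every filter tile, the cube-assembly step reduces to cropping each tile out of the raw frame and stacking the crops along a band axis. The sketch below assumes a hypothetical 4×8 grid of 256×256 tiles that happens to match the prototype's 32-band, 256×256-pixel cube format; the actual tile layout is not specified in the text.

```python
import numpy as np

def assemble_cube(frame, tile_rows, tile_cols, tile_h, tile_w):
    """Slice a raw sensor frame, laid out as a grid of per-band filter
    tiles, into a (bands, tile_h, tile_w) spectral cube. Assumes the
    optics have already duplicated the scene onto every tile."""
    cube = np.empty((tile_rows * tile_cols, tile_h, tile_w), frame.dtype)
    for b in range(tile_rows * tile_cols):
        r, c = divmod(b, tile_cols)   # band index -> tile position
        cube[b] = frame[r * tile_h:(r + 1) * tile_h,
                        c * tile_w:(c + 1) * tile_w]
    return cube

# Synthetic 1024x2048 raw frame standing in for a real sensor readout.
raw = np.arange(1024 * 2048, dtype=np.uint16).reshape(1024, 2048)
cube = assemble_cube(raw, tile_rows=4, tile_cols=8, tile_h=256, tile_w=256)
```

Each band is an independent slice, so the loop parallelizes trivially, which is what makes the per-cube latency low enough for real-time operation.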