Large field-of-view phase and fluorescence mesoscope with microscopic resolution
Isaure de Kernier, Anaïs Ali-Cherif, Nelly Rongeat, Olivier Cioni, Sophie Morales, Julien Savatier, Serge Monneret, Pierre Blandin
Abstract
Phase and fluorescence are complementary contrasts that are commonly used in biology. However, the coupling of these two modalities is traditionally limited to high magnification and complex imaging systems. For statistical studies of biological populations, a large field-of-view is required. We describe a 30-mm² field-of-view dual-modality mesoscope with a 4-μm resolution. The potential of the system to address biological questions is illustrated by white blood cell enumeration in whole blood and multiwavelength imaging of human osteosarcoma (U-2 OS) cells.

1. Introduction

Microscopy systems continuously strive to provide superior image quality through higher resolution and signal-to-noise ratio (SNR), resulting in excellent imaging performance but also in bulky, highly specialized, and expensive systems. Furthermore, these systems tend to sacrifice the field-of-view (FOV) for the sake of resolution. Recently, some research teams have loosened the requirements on image quality and have started to develop miniaturized and affordable systems for point-of-care applications in developing countries.1 Unfortunately, these imaging systems are usually unimodal.

Phase imaging2 offers a label-free contrast to image unstained cells that would have a low contrast in brightfield microscopy. Phase contrast holds information on an object’s optical path length, i.e., the product of its thickness and refractive index.3 This information can be used to study cell morphology, cell–cell interactions, and so on.4 Since Zernike first discovered phase contrast,5 numerous methods have been developed to improve both contrast and resolution and to retrieve quantitative information. One technique is digital holographic microscopy,6 which uses the theoretical description of diffraction to enable numerical reconstruction of acquired holograms. More recently, techniques such as Fourier phase microscopy7 and diffraction phase microscopy8 have focused on achieving common-path phase imaging. Finally, some research groups have achieved phase imaging with white-light illumination, namely, phase imaging based on the transport of intensity equation,9,10 spatial light interference microscopy,11 and quadriwave lateral shearing interferometry.12 In-line holography13 is a phase-imaging technique compatible with full-field low-magnification imaging and is straightforward to implement. It is used to obtain phase images in lensless14 and defocused configurations.15 Amplitude and phase maps can be numerically reconstructed from a single holographic frame. However, phase imaging does not provide any specificity on populations of objects or subobjects that are morphologically alike.

Fluorescence provides a contrast complementary to phase when observing biological objects.16 It is an intrinsically selective technique and has become a reference in biological microscopy.17 By using the appropriate probe, virtually any aspect of a biological system can be labeled and imaged with a high SNR. Specific cells, structures in a given cell, or specific functions of a cell can be highlighted. Simultaneous multiwavelength imaging can additionally be achieved and enables visualization of protein interactions, cellular and intracellular dynamics, intracellular structures, and so on.17 In high-throughput imaging of cells, this added specificity can enable discrimination between subpopulations or the localization of a rare event.18

Combining phase and fluorescence contrasts has been demonstrated on research and commercial systems in both two-dimensional16,19 and three-dimensional20 imaging at diffraction-limited resolutions and beyond.21 Several life science applications have been targeted, showing the potential of such an approach for molecular and cellular diagnostics.22 Coupling fluorescence to brightfield is usually performed using magnification lenses, typically 20× to 50×. To achieve a larger FOV, some attempts have been reported using specifically designed optical lenses.23,24 A few setups use consumer single lens reflex (SLR) camera lenses,25–27 although this is not in widespread use in scientific imaging; their large numerical apertures, their robustness, and their price can be considered as advantages. Except for lensless approaches, most devices coupling fluorescence to phase do so at high magnifications. However, owing to the low resolution and poor SNR of the raw acquisitions, lensless imaging requires computational efforts and a priori information to deconvolve the out-of-focus fluorescence image.28

In this paper, we describe a wide FOV phase and fluorescence imaging system. We performed coupled fluorescence and in-line holography imaging on a 30-mm2 FOV by means of a common path setup. To our knowledge, this is the first report of coupling fluorescence imaging and phase imaging on an ultrawide FOV with a micrometric resolution. First, the method for numerical reconstruction of phase contrast is detailed. We then present the system and its components prior to evaluating its performances using calibration targets. Finally, we demonstrate its potential for statistical imaging and cell imaging.

2. Coupled Phase and Fluorescence Imaging

The system we introduce combines a phase contrast reconstruction from an in-line holographic single-shot acquisition with fluorescence. First, the phase contrast is discussed and then the setup is described.

2.1. Phase Contrast Reconstructions

We consider a planar approximation of an object located in a transverse plane z = 0. A normalized monochromatic scalar field propagating in free space along the z axis and incident on that object creates a complex optical field U_z(r). In the in-focus plane, this field is described by Eq. (1), where A_0 is the amplitude and φ_0 is the phase distribution in this plane:

Eq. (1)

U_0 = A_0 · e^{iφ_0}.

Let r be the coordinates in the transverse plane; z be the coordinate in the propagation direction, with a reference z=0 for the object plane; and R be the distance in space away from the object.

If we consider the diffraction that arises from illuminating the object with a semicoherent plane light field, we can write the wave’s complex amplitude in any transverse plane z ≠ 0 in terms of the wave amplitude in the prior plane z = 0:

Eq. (2)

U_z(r) = \frac{1}{2π} \frac{∂}{∂z} \left[ U_0(r) * \frac{e^{i(2π/λ)R}}{R} \right],
where “*” is the convolution operator. This results from the Rayleigh–Sommerfeld diffraction theory9,29 and mathematically describes the propagation of a field in space. It is valid throughout the entire diffraction space. At a sufficient distance away from the object, we can make the Fresnel approximation R = (z² + r²)^{1/2} ≈ z + r²/(2z), which is admitted to be valid for z ≫ (πr⁴/4λ)^{1/3}, as detailed in Ref. 30. Let h_z be the Fresnel propagator defined as h_z = \frac{1}{iλz} e^{iπr²/(λz)}. Then the Fresnel diffraction propagation model can be written as:

Eq. (3)

U_z(r) = e^{ikz} [U_0(r) * h_z].
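For readers who wish to evaluate Eq. (3) numerically, the convolution with h_z is conveniently computed in Fourier space using the paraxial transfer function. The short Python sketch below is our own illustration of this step, not the authors’ code; the function name, the sampling choices, and the use of NumPy FFTs are assumptions:

```python
# Minimal sketch of Fresnel propagation by FFT-based convolution with h_z (Eq. (3)).
# All quantities are assumed to share the same length unit (e.g., meters).
import numpy as np

def fresnel_propagate(u0, wavelength, z, pixel_size):
    """Propagate the complex field u0 (2-D array) over a distance z."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)          # spatial frequencies along x
    fy = np.fft.fftfreq(ny, d=pixel_size)          # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Fourier transform of the Fresnel propagator: e^{ikz} exp(-i*pi*lambda*z*(fx^2 + fy^2))
    H = np.exp(1j * 2 * np.pi * z / wavelength) * \
        np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)
```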

Sensors are only sensitive to the irradiance, i.e., the square of the modulus of the complex field, as stated by Eq. (4). Therefore, the phase information is lost in the acquisition process:

Eq. (4)

I_z(r) = U_z(r) · U_z(r)* = |U_z(r)|².

The complex field in the object plane U_0(r) can be numerically reconstructed from an acquisition of irradiance in a single out-of-focus plane I_z(r). This in-line diffraction pattern is often referred to as a hologram.

Numerous methods are described in the literature for the reconstruction of phase maps from holograms. Here, we used a state-of-the-art iterative reconstruction algorithm operating on a single image, which is described in Ref. 31. To find the phase in the sensor plane φ_z, the L1-norm ‖∇(h_{−z} * √I_z e^{iφ_z})‖₁ is minimized using gradient descent. Such a scheme forces a fit to the acquired data I_z. Once a value is found for the phase in the sensor plane φ_z, the complex field in the object plane A_0 · e^{iφ_0} can be retrieved by computing h_{−z} * √I_z(r) e^{iφ_z}. The phase map is thus reconstructed from a single in-line hologram. Absolute values can only be obtained at the cost of a priori information or constraints on the object, associated with a calibration relative to a quantitative reference method. Therefore, the phase contrast is not considered quantitative in our configuration. Other approaches, in particular off-axis configurations combined with phase-unwrapping algorithms,32 multiwavelength illumination,33 or multiheight acquisitions,34 allow the recovery of quantitative phase maps. However, they require acquiring multiple images, whereas our holographic approach is single shot.
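To make the reconstruction step concrete, the sketch below implements a simplified single-plane iterative scheme that alternates between the measured amplitude in the sensor plane and a weak object-plane constraint. It is only an illustration under our own assumptions (constraint choice, number of iterations, reuse of the fresnel_propagate helper defined above); the authors’ actual algorithm is the L1-gradient minimization of Ref. 31.

```python
# Simplified error-reduction-style reconstruction from a single in-line hologram.
# This is an illustrative stand-in, not the algorithm of Ref. 31.
import numpy as np

def reconstruct_object_field(hologram, wavelength, z, pixel_size, n_iter=50):
    """hologram: measured out-of-focus intensity I_z; returns an estimate of U_0."""
    amp_z = np.sqrt(hologram)                      # measured amplitude in the sensor plane
    u_z = amp_z.astype(complex)                    # initial guess: zero phase at the sensor
    for _ in range(n_iter):
        u_0 = fresnel_propagate(u_z, wavelength, -z, pixel_size)   # back-propagate to the object plane
        # Weak constraint for a mostly transparent object: clip the amplitude, keep the phase.
        u_0 = np.minimum(np.abs(u_0), 1.0) * np.exp(1j * np.angle(u_0))
        u_z = fresnel_propagate(u_0, wavelength, z, pixel_size)    # forward-propagate to the sensor
        u_z = amp_z * np.exp(1j * np.angle(u_z))                   # enforce the measured amplitude
    return fresnel_propagate(u_z, wavelength, -z, pixel_size)

# Example with hypothetical values (blue LED, 910-um defocus, 1.67-um pixels):
# phase_map = np.angle(reconstruct_object_field(I_z, 470e-9, 910e-6, 1.67e-6))
```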

2.2. Instrument Development

The combination of phase and fluorescence requires the images to be registered. Either hardware or software registration may be considered. Numerical registration between two modalities has been intensely studied but remains a challenge. Therefore, it is advantageous for the instrumentation to provide already-registered images. In this paper, we implemented a common-path configuration with a single sensor; for this reason, the multimodal images did not require postacquisition registration. With the mesoscope system, as illustrated in Fig. 1, two images could be sequentially acquired: one out-of-focus transmission image that is processed to obtain phase and amplitude maps and one in-focus fluorescence image. In addition, the system could be used for absorption-contrast brightfield imaging.

Fig. 1 Schematic representation of the mesoscope bimodal imaging system.

Performing statistical imaging of micrometric biological objects requires both a large FOV and a micrometric resolution. In microscopy, a compromise must be found between the two: large FOVs result from low-magnification objectives, whereas high resolution is associated with high-magnification objectives, and both factors are linked to the numerical aperture. We aimed to develop a 1× magnification system to enable larger-FOV imaging and thus optimize the number of objects that could be imaged in a single shot. To achieve this goal, we sought a 1× objective with a high numerical aperture and a wide FOV. Air microscopy objectives typically have a limited numerical aperture at low magnifications and a FOV of around 25 mm in diameter. We chose a cost-effective SLR camera lens from the macrophotography series of Canon Inc. (EF 100 mm f/2.8 Macro USM). SLR camera lenses are designed to be used without tube lenses.

This objective has a front lens of 58 mm in diameter, enabling it to achieve a numerical aperture of 0.113 ± 0.003. To measure this value experimentally, we collimated the light from a green light-emitting diode (LED) (XLamp® XM-L™ Color LED, Cree Inc.) with a large aspheric condenser lens (ACL756U-A, Thorlabs Inc.). This illumination assembly was mounted on a goniometer stage, and the resulting 75-mm-diameter beam could be tilted away from a central position defined as normal incidence on the objective. The precision of the stage was 0.1 deg. The numerical aperture was defined from the cut-off angles θ_max and θ_min of the resulting curve: NA = sin[(θ_max − θ_min)/2], as illustrated in Fig. 2. The uncertainty on this value is the error induced on the cut-off angles by a 0.1-deg error on the inclination angle. This numerical aperture corresponds to a theoretical depth of field30 of 20.4 μm. The thickness of the samples we studied was typically below this value, making the planar approximation a decent model for these objects.
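For reference, the quoted depth of field is consistent with the common paraxial estimate DOF ≈ λ/(2 NA²), evaluated here with our own arithmetic at a green wavelength of 520 nm (Ref. 30 may use a slightly different definition):

DOF ≈ λ / (2 NA²) = 0.52 μm / (2 × 0.113²) ≈ 20.4 μm.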

Fig. 2 Experimental evaluation of the effective numerical aperture of the system. The crosses show the experimental data points, the continuous line is a linear approximation of the experimental data, and the black box points out the numerical aperture.

The compromise between resolution and FOV can be measured by the space–bandwidth product (SBP), i.e., the number of pixels required to capture the full area at full resolution: SBP = FOV / p_s², where FOV is the circular FOV at the image plane disregarding the sensor used and p_s is the pixel size required to achieve Nyquist sampling, i.e., half of the resolution defined by the Airy disk radius.29 This is a reasonable method to quantify the amount of information transmitted by the optical system.35

Figure 3 illustrates the limited SBP of standard Zeiss microscopy objectives and the improvements that can be made using larger optics23 or multiple-angle illumination.36 Although the SBP of SLR camera lenses exceeds that of dedicated microscopy objectives by two orders of magnitude, only a few research groups have reported the use of such lenses for microscopy.

Fig. 3 Comparison of the SBP plotted against magnification. For our system, we show both the available SBP and the SBP that is effectively used. The other systems are Zeiss microscopy objectives (Carl Zeiss, Germany), the Fourier ptychography microscope,36 Zygo objectives (Zygo Corp.),37 and the custom-developed Mesolens.23

One difficulty lies in acquiring the total available SBP in a single shot. Photography objectives are developed for full-frame 24 × 36 mm² sensors, so the available SBP is spread over a disk of 43 mm in diameter in the image plane. Smaller sensors apply a crop factor. As a general trend, in the past decade, the development of sensor technology has been guided by high-resolution mobile-phone imaging, i.e., pixel size tends to shrink while chip size remains rather small. Therefore, in our system, a trade-off had to be made between chip size and pixel size to optimize the effective SBP.

According to the Shannon–Nyquist criterion, sampling the point spread function (PSF) in a 1× system requires pixels half the size of the PSF. Theoretically, a numerical aperture of 0.113 provides a PSF with a radius of r = 0.61λ/NA ≈ 2.81 μm at 520 nm. Therefore, we selected an 11-megapixel monochromatic complementary metal-oxide-semiconductor (CMOS) sensor of 6.4 × 4.6 mm² with a 1.67-μm pixel pitch (UI-1492LE-M, IDS GmbH, Germany), resulting in a 29.4-mm² FOV. This pixel size optimized the resolution, but a larger chip size would have improved the effective SBP, as pointed out in Fig. 3. However, to our knowledge, such sensors are not commercially available.
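A back-of-envelope comparison of the lens SBP and the SBP actually sampled by our sensor can be made from the numbers above. The short Python sketch below is our own estimate (the 43-mm image circle and the Nyquist pixel definition follow the text; everything else is an assumption):

```python
# Rough estimate of available vs. effectively used SBP (our numbers, not the authors').
import numpy as np

wavelength = 0.520                       # um, green illumination
NA = 0.113
airy_radius = 0.61 * wavelength / NA     # ~2.81 um resolution limit
ps_nyquist = airy_radius / 2             # Nyquist pixel size at 1x magnification

lens_fov = np.pi * (43_000 / 2) ** 2     # um^2, 43-mm image circle of a full-frame SLR lens
sensor_fov = 6_400 * 4_600               # um^2, 6.4 x 4.6 mm CMOS chip (~29.4 mm^2)

sbp_available = lens_fov / ps_nyquist ** 2    # ~7e8 resolvable pixels over the image circle
sbp_used = sensor_fov / ps_nyquist ** 2       # ~1.5e7 resolvable pixels on the sensor
print(f"available SBP ~ {sbp_available:.1e}, effectively used SBP ~ {sbp_used:.1e}")
```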

To perform phase imaging, we chose an in-line configuration for digital holography. A diffraction pattern is formed when the object is illuminated with partially coherent light. It is recommended in Ref. 38 to consider both spatial and temporal coherence. A low degree of coherence reduces the speckle artifacts and unwanted fringes that result from reflections on multiple interfaces, whereas a high degree of coherence, such as that of lasers, creates holograms with more spectral content. To mitigate both effects, we compromised by using an LED (XLamp® XM-L™ Color LED, Cree Inc.) coupled into a 200-μm-core-diameter multimode optical fiber (FG200UEA, Thorlabs Inc.).

To minimize the aberrations, we implemented a convergent illumination, as illustrated in Fig. 1. Collimated or divergent illuminations resulted in enhanced geometric aberrations because of the high incidence angles on the border of the objective and prevented accurate phase reconstruction. A singlet lens (LAT075, Thorlabs Inc.), referred to as L3 in Fig. 1, was used to make the illumination convergent on the objective’s front lens. The distance from the optical fiber output to the L3 lens was 143 mm, and that from L3 to the objective ML was 191 mm. As is commonly done in holographic imaging,14 we considered the field curvature negligible in the object plane and made the plane wave approximation. This was required to apply the Fresnel formalism described previously.

The position of the object plane was imposed by the 2f–2f configuration that allowed achieving a 1:1 magnification ratio. The distance from the object plane to the objective front lens was 145 mm. Numerous methods exist to obtain out-of-focus images, typically the introduction of a dephasing medium39 or manual or mechanical translation of the object, the objective, or the sensor. We implemented S2, a manual micrometric z-axis translation stage (SM1Z, Thorlabs Inc.) with a 25-mm range, to move the sensor. This procedure was preferred both because it avoided disturbing liquid samples and because it was mechanically easier than moving the objective. The optimal defocus distance is dependent on the object; we chose it empirically in a range from 50 to 1000 μm to optimize the SNR of the diffraction fringes. As described, the in-line configuration allowed acquiring a diffraction pattern from which a phase-contrast image could be reconstructed.

To add the fluorescence modality to this system, an excitation source and a high-pass emission filter must be added. The excitation source can be a spatially filtered laser diode (LD), as illustrated in Fig. 1; alternatively, a monochromatic LED may be used. In addition, it was straightforward to implement multiple excitation sources, thus enabling sequential multiwavelength fluorescence. To avoid having excitation light directly incident on the sensor, the excitation module was implemented at a 45-deg angle from the optical axis. This relaxed the constraints on the emission filter; unlike standard epi-illumination microscopy, no additional dichroic filter was used. The distance between the sensor and the objective back lens was 3 cm, which provided enough space to insert the filter in the optical path. Because the brightfield illumination wavelength was chosen above the filter’s cutoff wavelength, holographic imaging could be performed without removing the fluorescence emission filter.

3. Calibration of Optical Performances

3.1. Transmission

For brightfield imaging, resolution can be defined as the width of the bars in the last resolved group of the amplitude 1951 United States Air Force (USAF) resolution test chart. The uncertainty on the measurement was considered to be half the size difference between the resolved bar and the one from the next group on the target. In transmission, a group was considered resolved if the contrast of its bars, both horizontal and vertical, exceeded 10%. Contrast was defined from the maximum and minimum gray values in the considered line profile as C = (max − min)/(max + min).
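As a small illustration of this criterion (our helper code, not part of the published analysis), the contrast of a measured line profile can be computed as follows:

```python
# Michelson-type contrast of a line profile drawn across a bar group of the USAF target.
import numpy as np

def profile_contrast(profile):
    p = np.asarray(profile, dtype=float)
    return (p.max() - p.min()) / (p.max() + p.min())

# A group is considered resolved here if profile_contrast(...) > 0.10 for both
# the horizontal and the vertical bars.
```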

Under blue LED illumination (central wavelength 469 nm), we measured a resolution of 2.76 ± 0.15 μm in the center [Figs. 4(a) and 4(b)] and 3.91 ± 0.23 μm on the edges of the FOV. The resolution in the center of the FOV was 3.10 ± 0.17 μm under green LED illumination (central wavelength 526 nm), i.e., the resolution was degraded at longer wavelengths.

Fig. 4 Determination of the resolution in brightfield when the amplitude 1951 USAF resolution test chart was placed in the center of the FOV and illuminated by a blue LED: (a) acquisition and (b) horizontal and vertical line profiles drawn in (a).

A phase 1951 USAF resolution test chart was used to determine the resolution of the phase reconstruction. A hologram [Fig. 5(a)] of this test chart was acquired by the system under blue-light illumination; in particular, it was blurred by the PSF and sampled by the pixels. The hologram was then reconstructed into a phase map [Fig. 5(b)]. We used the normalized values of the line profiles along a group of the target to determine the resolution [Fig. 5(c)]. The resolution and optimal acquisition distance are highly dependent on the object. We found that at a 910-μm defocus distance, the reconstructed phase image had a 4.38 ± 0.23 μm lateral resolution. The resolution depends on several factors: the resolution of the acquired hologram, the reconstruction algorithm, and the object properties.

Fig. 5 Determination of the resolution of the phase reconstruction when the phase 1951 USAF resolution test chart was placed in the center of the FOV and illuminated by a blue LED: (a) acquired hologram, (b) phase reconstruction, and (c) normalized horizontal and vertical line profiles drawn in (b).

As the phase reconstruction was qualitative, the image could also be evaluated based on its SNR. The phase resolution test chart is designed to have lines with a 183-nm optical path difference. For an 8-bit sensor, the standard deviation of a reconstructed zone with no signal was 1.6 gray levels. The SNR of the reconstruction was measured to be 11.9 for the smallest resolved line, which confirms that all resolved lines were clearly contrasted.

3.2. Fluorescence Resolution Measurements

Fluorescence resolution was obtained by measuring the PSF of the system. A solution of 1-μm green fluorescent protein (GFP) fluorescent beads was placed on a microscope glass slide, excited at 488 nm, and detected with a high-pass 488-nm emission filter (BLP01-488, Semrock Inc.). Line profiles of isolated beads were drawn, as illustrated in Fig. 6. For a given spatial position, the resolution was defined as the full width at half maximum (FWHM) of a normalized Gaussian curve fitted to the experimental data points of the PSF. The uncertainty of this value was obtained by considering five PSFs.
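The FWHM extraction can be reproduced with a few lines of Python; the sketch below is our own illustration (the Gaussian model, initial guesses, and SciPy usage are assumptions, not the published code):

```python
# Fit a Gaussian to a bead line profile and return its FWHM in micrometers.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, sigma, offset):
    return offset + amplitude * np.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def fwhm_from_profile(x_um, profile):
    """x_um: positions in micrometers; profile: gray values across an isolated 1-um bead."""
    p0 = [profile.max() - profile.min(), x_um[np.argmax(profile)], 1.5, profile.min()]
    popt, _ = curve_fit(gaussian, x_um, profile, p0=p0)
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])   # FWHM = 2*sqrt(2*ln 2)*sigma
```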

Fig. 6 PSF for a 1-μm fluorescent bead located in the center of the FOV. Crosses indicate experimental data points and the dashed line corresponds to the Gaussian fit. The black box indicates the FWHM.

Average resolutions of 3.72 ± 0.25 μm and 3.98 ± 0.10 μm were obtained in the center of the FOV and on the edges, respectively. The theoretical limit of the resolution, which results from diffraction,29 is r = 0.61λ/NA ≈ 2.74 μm at 507 nm (i.e., at the central emission wavelength of GFP). This indicated that the system’s resolution was primarily limited by the optics.

The level of noise in a fluorescence image is crucial because it can mask the signal of interest. The occurrence of noise was thus assessed and a postprocessing method was chosen to reduce artifacts in the images. When the exposure time was long, typically >500 ms, a two-by-two median filter was applied. To correct for the inhomogeneous excitation (Gaussian laser excitation profile), the image was normalized by a reference excitation profile. To enhance the image quality, the fluorescence image was then deconvolved by the experimental PSF using a Lucy–Richardson algorithm.
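The chain described above can be sketched as follows; this is our illustrative implementation under stated assumptions (filter size, normalization, and the scikit-image Richardson–Lucy routine), not the authors’ processing code:

```python
# Sketch of the fluorescence post-processing chain: median filtering, flat-field
# correction by the excitation profile, and Richardson-Lucy deconvolution by the PSF.
import numpy as np
from scipy.ndimage import median_filter
from skimage.restoration import richardson_lucy

def postprocess_fluorescence(image, excitation_profile, psf, long_exposure=True, n_iter=10):
    img = image.astype(float)
    if long_exposure:                         # exposure > ~500 ms: suppress isolated hot pixels
        img = median_filter(img, size=2)
    img = img / excitation_profile            # normalize by the reference excitation profile
    img = img / img.max()                     # Richardson-Lucy expects values roughly in [0, 1]
    return richardson_lucy(img, psf, n_iter)  # deconvolve by the experimental PSF
```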

3.3. Sensitivity Measurements

For applications in biology, it is critical to evaluate the imaging system’s sensitivity to fluorescence. The sensitivity of our system was benchmarked against flow cytometry (CyAn™ ADP, Beckman Coulter Inc.); the channel was fluorescein isothiocyanate (FITC) PMT 650 and the gain was set to 1. For calibration, we used a solution of 6.0- to 6.4-μm-diameter beads with six different fluorescence levels (Rainbow Calibration Particles No. RCP-60-5, Spherotech Inc.). These beads are typically used for fluorescence quality control of flow cytometers.

The flow cytometer was used with a 488-nm laser line coupled to a 530/40  nm bandpass filter. The laser output power of 20 mW was spread over an area <0.015  mm2. The cytometry graph in Fig. 7(c) shows the number of detected beads as a function of the detected fluorescence intensity for the FITC fluorophore. The six groups of fluorescent beads were detected.

Fig. 7 Detection of fluorescent Sphero™ beads: (a) details of a raw acquisition by our system showing beads of three different fluorescence levels with an integration time of 1000 ms and a gain of 1, (b) histogram of the levels of fluorescence detected by our system with an integration time of 1000 ms and an analog gain of 8.52, and (c) histogram of the levels of fluorescence detected by the flow cytometer on the FITC PMT 650 channel with a gain of 1.

The performance of our system was assessed in comparison to flow cytometry. This was done by integrating a 488-nm laser combined with the appropriate emission filter (FF01-531/40, Semrock Inc.) into our system. The laser had an output power of 25 mW spread over an area >30 mm². An integration time of 1 s and an analog gain of 8.52 were used for calibration. The raw acquisition of the beads and the histogram of the mean gray value of the detected beads are shown in Figs. 7(a) and 7(b), respectively. The mean gray level of each bead is a direct measurement of its fluorescence intensity. The three brightest bead groups could be detected.
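A rough comparison of the excitation irradiance in the two instruments (our own arithmetic, using the quoted power and area bounds) helps explain why only the brightest bead groups are detected by the mesoscope:

(P_cyto / A_cyto) / (P_meso / A_meso) ≳ (20 mW / 0.015 mm²) / (25 mW / 30 mm²) ≈ 1330 / 0.83 ≈ 1.6 × 10³.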

With this calibration, we obtained quantitative information on the sensitivity of the mesoscope with a specific fluorescence excitation source. We can be confident that high quantum yield fluorescence markers, in particular nuclear and membrane markers, will be detected by our system. Furthermore, we showed that we were able to discriminate bead populations based on their fluorescence levels.

4. Biological Samples

The developed setup is an easy-to-use and cost-effective solution for field use. Combining phase and fluorescence imaging on the same FOV is of interest for many biological applications.4 We chose two specific examples to illustrate the advantages brought by an acquisition system combining a large FOV with a micrometric resolution.

4.1. Statistical and Rare Event Imaging

Assessing the ability of our system to perform bimodal statistical imaging of small objects is of particular interest. Recent studies have demonstrated the potential of bimodal imaging techniques for the diagnosis of meningitis from counting blood cells in cerebrospinal fluid40 or from counting red blood cells (RBCs), white blood cells (WBCs), or platelets in whole blood.41 To demonstrate the high-throughput performances of the system, we performed WBC counting in whole blood samples. In healthy blood, the ratio of WBC to RBC is roughly 0.1%.

Nucleated WBCs were specifically labeled with Thiazole Orange dye (390063, Sigma-Aldrich Inc.). This nucleic acid marker is known to have a high quantum yield.42 A fluorescent labeling solution was obtained by dissolving Thiazole Orange in methanol at a concentration of 1  mg/mL and it was then diluted to 0.2  μg/mL in phosphate buffered saline (PBS) (D8537, Sigma-Aldrich Inc.). About 1  μL of whole blood was incubated for 30 min in 1 mL of this solution. Three slides (CV 1100-2cv, CellVision Inc.), each having two chambers, were filled with 25  μL of the solution.

A hologram and the corresponding fluorescence image were acquired with our system. We considered two FOVs per chamber, and hence 12 different FOVs for this experiment. A total of 24,000 ± 2000 cells (WBCs and RBCs) was detected in each FOV owing to their phase contrast, as shown in Figs. 8(a) and 8(b). This number of cells allowed us to infer statistically significant results from the acquired data. The detection and classification were achieved using automatic custom-developed algorithms. In the phase images, platelets and nonblood-cell objects such as dust could be discriminated based on their size. Labeled WBCs could be counted on the fluorescence image, as shown in Fig. 8(c). To sum up, two images were required to differentiate the WBCs from the RBCs: one in phase and one in fluorescence. The sum of the integration times was below 2 s.
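The counting logic can be illustrated with the simple sketch below; the published analysis relies on custom algorithms, so the thresholding, size limits, and scikit-image calls here are our own assumptions:

```python
# Illustrative cell counting from a registered phase map and fluorescence image.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def count_blood_cells(phase_map, fluo_image, min_area_px=20, max_area_px=400):
    # 1. Detect all cells (RBCs + WBCs) from their phase contrast.
    cell_mask = phase_map > threshold_otsu(phase_map)
    regions = regionprops(label(cell_mask))
    cells = [r for r in regions if min_area_px <= r.area <= max_area_px]  # reject platelets and dust by size
    # 2. Count the fluorescently labeled WBCs.
    wbc_mask = fluo_image > threshold_otsu(fluo_image)
    wbcs = [r for r in regionprops(label(wbc_mask)) if r.area >= min_area_px]
    return len(cells), len(wbcs), len(wbcs) / len(cells)   # total cells, WBCs, WBC/(WBC+RBC) ratio
```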

Fig. 8 Multimodal acquisition of whole blood labeled with Thiazole Orange. Full-field images and details of (a) a hologram, (b) the reconstructed phase map, and (c) the fluorescence acquisition.

The ratio WBC/(WBC + RBC) obtained was 0.159% ± 0.026%. The error range was calculated from the variance of the results obtained for the 12 FOVs. To check the validity of our approach, we compared this result with the routine complete blood count examination performed using flow cytometry (Sysmex Corp., Japan) and shown in Table 1.

Table 1 Results of the complete blood count examination.

WBC: 8.0·10⁹/L
RBC: 4.6·10¹²/L

The ratio of interest obtained was WBC/(WBC + RBC) = 0.148%, which is within the error range of our measurement. This result suggests that the developed system is capable of performing a WBC count.

4.2. Multimodal Cell Imaging

We additionally explored the ability of our system to image cell cultures with phase and fluorescence contrasts. In particular, we studied fixed U-2 OS cells (human osteosarcoma).

U-2 OS cells were plated on 18-mm coverslips coated with poly-L-lysine in a 12-well plate and incubated overnight in McCoy medium (Gibco) supplemented with 10% fetal bovine serum. They were fixed with 4% paraformaldehyde in PBS for 10 min, permeabilized with 0.5% Triton in PBS for 10 min, and rinsed three times with PBS. They were then incubated overnight with 165-nM Alexa Fluor 488 phalloidin (Invitrogen Molecular Probes) at 4°C in a humidified chamber in the dark. They were rinsed three times with PBS and then stained with 5 μg/mL Hoechst 33342 (Invitrogen Molecular Probes) for 6 min. They were rinsed twice for 10 min with PBS and then quickly with ultrapure water to prevent the formation of salt crystals, mounted on a slide with Fluoromount-G medium (Sigma-Aldrich), and stored at 4°C overnight before imaging.

Owing to the significant phase contrast of the cell nucleus, cells could be detected on the phase image. In particular, even when the membranes of two contiguous cells overlapped, thresholding the phase signal was enough to detect individual nuclei. Consequently, cells could be counted, as illustrated in Fig. 9(c). This information was qualitative and did not require a quantitative phase contrast. In the example shown, 1219 cells were detected in the FOV.

Fig. 9 (a) Wide-field segmented fluorescence image of U-2 OS cells. The fluorophore Alexa Fluor 488 phalloidin labels the actin filaments. (b) Details of (a) showing the adhesion surface segmentation for several cells. (c) Same FOV as (b), combined segmentation of the adhesion surface (in green) and the nuclei (in yellow). The nuclei were segmented from the phase reconstruction.

The actin filaments of U-2 OS cells were labeled with Alexa Fluor 488 phalloidin. As illustrated in Figs. 9(a) and 9(b), the fluorescence at the cell membrane allowed for segmentation of the surface over which each cell adhered to the glass slide. The phase image alone was not sufficient for this task because of the low phase contrast of the cell extensions. Conversely, the use of phase rather than a second fluorescent marker for the detection of cell nuclei simplifies the sample preparation, reduces the cost, or leaves open the possibility of labeling other structures of interest. For the accurate segmentation of cells, the proposed bimodal tool is therefore more robust than a single imaging modality. The segmentation on the fluorescence and phase images was performed using ImageJ.43 Owing to the common-path setup, the fluorescence and phase segmentations were accurately registered without the need for numerical operations, as illustrated in Fig. 9(c).

To demonstrate the possibility of performing multiwavelength fluorescence imaging, we imaged U-2 OS cells labeled with Hoechst 33342 and Alexa Fluor 488 phalloidin, which stain the DNA (thus, mostly the nucleus) and the actin filaments, respectively. The green fluorescence was acquired under 520-nm excitation (L520P50, Thorlabs Inc.) coupled to a 561-nm high-pass fluorescence filter (BLP02-561R-25, Semrock Inc.), and the blue fluorescence was acquired under 405-nm excitation (L405P20, Thorlabs Inc.) coupled to a 461-nm high-pass fluorescence filter (03FCG461, Melles Griot Inc.). The images shown in Figs. 10(a)–10(c) are a color merge of the two sequentially acquired images. As the same FOVs were imaged, no numerical registration was required. This image demonstrates the optical performance of the system: subcellular resolution on a large FOV.

Fig. 10 (a)–(c) Wide-field bicolor fluorescence data of U-2 OS cells shown as a color merge of sequentially acquired images. The blue fluorescence reveals the DNA in the nucleus stained with Hoechst 33342 and the actin filaments are green because of Alexa Fluor 488 phalloidin. (d)–(f) Corresponding phase images obtained from holographic reconstruction of a single-defocus image at 1387 μm. The red boxes in (a) and (d) correspond to the zoomed images in (b) and (e), respectively. The yellow boxes in (b) and (e) correspond to the zoomed images in (c) and (f), respectively.

These results suggest that it is possible to access both the morphology of individual cells and the cell population statistics. This might be of interest for the study of the cell adhesion process, i.e., the mechanical interaction between a cell and an extracellular matrix, artificial or not. It is of interest in tissue engineering and biomaterial design to study cell growth and motion.44

5. Discussion and Conclusion

We have shown here an experimental setup to perform phase and fluorescence imaging over a 30-mm² FOV without the need for postregistration of the images. The system has a single detection pathway and two separate illumination modules. The detection pathway includes a consumer lens and an industrial CMOS sensor. Semicoherent brightfield illumination was achieved by using a low-cost LED, and fluorescence excitation used either an LED or an LD. The phase map of the sample could be numerically retrieved from a single out-of-focus image. The figures of merit of the optical system were experimentally characterized. In particular, the multimodal resolution was found to be 3 to 4 μm. The FOV of our 1× magnification system allowed performing statistical studies of cells; it can be seen as a high-throughput45 single-shot imaging system. We demonstrated the capabilities of our system for the statistical study of blood cells and for adherent cell imaging, with the goal of laying some groundwork for potential future studies in cell biology and hematology.

We showed that with the single-shot in-line holography method, a nonquantitative phase contrast can be obtained. Yet quantitative phase information is of interest for applications in cell biology.3 The system described in this paper is compatible with the quantitative phase reconstructions from both multiacquisition in-line holography33,34 and the transport-of-intensity-equation method.10

Our system was developed as a proof of concept and could be further optimized for specific applications. The effective power of the fluorescence excitation sources used in this work was 5 to 25 mW, so the applications that can currently be addressed require high-quantum-yield and highly concentrated fluorescent probes. The excitation power in our setup could be increased by several orders of magnitude for the system to match the excitation power density of cytometers, which would improve the sensitivity limit. Hence, a wider range of applications could be targeted, e.g., detection of antibody-labeled platelets or immature RBCs stained with Thiazole Orange. Interestingly, we know from cytometry that photobleaching would not be an issue for single-shot acquisitions under high fluorescence excitation powers.

Another possible optimization concerns the effective FOV of the system. The ratio of the effective FOV to the pixel area did not match the available SBP, which was much higher than that of microscopy objectives (Fig. 3); it was limited by the sensor area, which was smaller than the accessible FOV. The available SBP indicates that the system could be improved to image larger FOVs without sacrificing resolution. The configuration would be optimized by the use of a larger sensor with a small pixel size; such sensors might become available in the future. A larger effective FOV may also be achieved by mechanical scanning and stitching, at the expense of simplicity and speed. This would enable higher-throughput imaging and the detection of extremely rare events, such as parasites, or the quantification of extremely low platelet levels (<1%) to diagnose severe bacterial infections.

Statistical studies of cells in hematology are routinely performed by flow cytometry. Such studies generally focus on the estimation of parameters related to cell size, structure, and fluorescence level.46 Flow cytometers are typically capable of studying 10,000 to 100,000 cells/s, but measurements are made object by object, and the systems are bulky and expensive. Another technique is based on hemocytometer chambers but is usually limited to smaller statistics and might lack the sensitivity to detect very low blood cell numbers. Here, we showed that our system could perform wide-FOV phase and fluorescence imaging on 10,000 to 25,000 blood cells in a single frame. The performance of our system in comparison to flow cytometry remains to be assessed, both in terms of statistics and of the determination of cell morphology. Our approach nevertheless holds numerous advantages, as the instrumentation can be simpler, more affordable, and adapted to the point of care.

The system described in this work was developed for cell population analysis. Our aim is to propose a single-shot imaging alternative to flow cytometry for specific applications in hematology. We believe that, in this case, our phase contrast can provide information similar to the scatter signals of flow cytometry, whereas fluorescence imaging gives indications comparable to fluorescence flow cytometry. However, spatial resolution brings additional information that cannot be obtained by conventional flow cytometry, such as cell morphology, spatial distribution, or the intracellular location of biomolecules.47

Disclosures

The authors have no potential conflicts of interest to disclose.

References

1. D. N. Breslauer et al., “Mobile phone based clinical microscopy for global health applications,” PLoS One, 4, e6320 (2009). https://doi.org/10.1371/journal.pone.0006320
2. M. Mir et al., “Quantitative phase imaging,” Progress in Optics, 57, 133–217, Elsevier Science, Amsterdam (2012).
3. S. Aknoun et al., “Living cell dry mass measurement using quantitative phase imaging with quadriwave lateral shearing interferometry: an accuracy and sensitivity discussion,” J. Biomed. Opt., 20, 126009 (2015). https://doi.org/10.1117/1.JBO.20.12.126009
4. P. Y. Liu et al., “Cell refractive index for cell biology and disease diagnosis: past, present and future,” Lab Chip, 16, 634–644 (2016). https://doi.org/10.1039/C5LC01445J
5. F. Zernike, “Phase contrast, a new method for the microscopic observation of transparent objects part II,” Physica, 9, 974–986 (1942). https://doi.org/10.1016/S0031-8914(42)80079-8
6. U. Schnars and W. P. Lawrence, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt., 33, 179–181 (1994). https://doi.org/10.1364/AO.33.000179
7. G. Popescu et al., “Fourier phase microscopy for investigation of biological structures and dynamics,” Opt. Lett., 29, 2503 (2004). https://doi.org/10.1364/OL.29.002503
8. G. Popescu et al., “Diffraction phase microscopy for quantifying cell structure and dynamics,” Opt. Lett., 31, 775 (2006). https://doi.org/10.1364/OL.31.000775
9. M. R. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am., 73, 1434–1441 (1983). https://doi.org/10.1364/JOSA.73.001434
10. D. Paganin and K. A. Nugent, “Noninterferometric phase imaging with partially coherent light,” Phys. Rev. Lett., 80, 2586–2589 (1998). https://doi.org/10.1103/PhysRevLett.80.2586
11. Z. Wang et al., “Spatial light interference microscopy (SLIM),” Opt. Express, 19, 1016–1026 (2011). https://doi.org/10.1364/OE.19.001016
12. P. Bon et al., “Optical detection and measurement of living cell morphometric features with single-shot quantitative phase microscopy,” J. Biomed. Opt., 17, 076004 (2012). https://doi.org/10.1117/1.JBO.17.7.076004
13. D. Gabor, “A new microscopic principle,” Nature, 161, 777–778 (1948). https://doi.org/10.1038/161777a0
14. A. Greenbaum et al., “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods, 9, 889–895 (2012). https://doi.org/10.1038/nmeth.2114
15. J. Sheng, E. Malkiel, and J. Katz, “Digital holographic microscope for measuring three-dimensional particle distributions and motions,” Appl. Opt., 45, 3893 (2006). https://doi.org/10.1364/AO.45.003893
16. Y. Park et al., “Diffraction phase and fluorescence microscopy,” Opt. Express, 14, 8263–8268 (2006). https://doi.org/10.1364/OE.14.008263
17. J. W. Lichtman and J.-A. Conchello, “Fluorescence microscopy,” Nat. Methods, 2, 910–919 (2005). https://doi.org/10.1038/nmeth817
18. S. A. Arpali et al., “High-throughput screening of large volumes of whole blood using structured illumination and fluorescent on-chip imaging,” Lab Chip, 12, 4968 (2012). https://doi.org/10.1039/c2lc40894e
19. S. Chowdhury et al., “Spatial frequency-domain multiplexed microscopy for simultaneous, single-camera, one-shot, fluorescent, and quantitative-phase imaging,” Opt. Lett., 40, 4839 (2015). https://doi.org/10.1364/OL.40.004839
20. M. Habaza et al., “Tomographic phase microscopy with 180° rotation of live cells in suspension by holographic optical tweezers,” Opt. Lett., 40, 1881 (2015). https://doi.org/10.1364/OL.40.001881
21. S. Chowdhury et al., “Structured illumination multimodal 3D-resolved quantitative phase and fluorescence sub-diffraction microscopy,” Biomed. Opt. Express, 8, 2496–2518 (2017). https://doi.org/10.1364/BOE.8.002496
22. V. Dubey et al., “Multi-modal chip-based fluorescence and quantitative phase microscopy for studying inflammation in macrophages,” Opt. Express, 26, 19864 (2018). https://doi.org/10.1364/OE.26.019864
23. G. McConnell et al., “A novel optical microscope for imaging large embryos and tissue volumes with sub-cellular resolution throughout,” eLife, 5, e18659 (2016). https://doi.org/10.7554/eLife.18659
24. A. Forcucci et al., “All-plastic, miniature, digital fluorescence microscope for three part white blood cell differential measurements at the point of care,” Biomed. Opt. Express, 6, 4433 (2015). https://doi.org/10.1364/BOE.6.004433
25. A. Orth and K. B. Crozier, “High throughput multichannel fluorescence microscopy with microlens arrays,” Opt. Express, 22, 18101 (2014). https://doi.org/10.1364/OE.22.018101
26. S. Pang et al., “Wide field-of-view Talbot grid-based microscopy for multicolor fluorescence imaging,” Opt. Express, 21, 14555–14565 (2013). https://doi.org/10.1364/OE.21.014555
27. D. Jin et al., “Compact wireless microscope for in-situ time course study of large scale cell dynamics within an incubator,” Sci. Rep., 5, 18483 (2015). https://doi.org/10.1038/srep18483
28. A. F. Coskun et al., “Lensfree fluorescent on-chip imaging of transgenic Caenorhabditis elegans over an ultra-wide field-of-view,” PLoS One, 6, e15955 (2011). https://doi.org/10.1371/journal.pone.0015955
29. M. Born and E. Wolf, Principles of Optics, Cambridge University Press, Cambridge (1999).
30. J. W. Goodman, Introduction to Fourier Optics, McGraw-Hill, New York (1968).
31. L. Hervé and C. Allier, “Method for observing a sample, by calculation of a complex image,” Patent No. WO/2017/162985 (2017).
32. J. M. Huntley and H. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt., 32, 3047–3052 (1993). https://doi.org/10.1364/AO.32.003047
33. C. J. Mann et al., “Quantitative phase imaging by three-wavelength digital holography,” Opt. Express, 16, 9753 (2008). https://doi.org/10.1364/OE.16.009753
34. L. J. Allen and M. P. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun., 199, 65–75 (2001). https://doi.org/10.1016/S0030-4018(01)01556-5
35. A. W. Lohmann et al., “Space-bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A, 13, 470 (1996). https://doi.org/10.1364/JOSAA.13.000470
36. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics, 7, 739–745 (2013). https://doi.org/10.1038/nphoton.2013.187
37. P. J. de Groot and J. F. Biegen, “A new class of wide-field objectives for 3D interference microscopy,” Proc. SPIE, 9525, 95250N (2015). https://doi.org/10.1117/12.2183628
38. P. Petruck, R. Riesenberg, and R. Kowarschik, “Partially coherent light-emitting diode illumination for video-rate in-line holographic microscopy,” Appl. Opt., 51, 2333 (2012). https://doi.org/10.1364/AO.51.002333
39. A. Descloux et al., “Combined multi-plane phase retrieval and super-resolution optical fluctuation imaging for 4D cell microscopy,” Nat. Photonics, 12, 165–172 (2018). https://doi.org/10.1038/s41566-018-0109-4
40. Y. Lee, B. Kim, and S. Choi, “On-chip cell staining and counting platform for the rapid detection of blood cells in cerebrospinal fluid,” Sensors, 18, 1124 (2018). https://doi.org/10.3390/s18041124
41. D. Xie et al., “Performance of a cost-effective and automated blood counting system for resource-limited settings operated by trained and untrained users,” J. Biophotonics, 11, e201700030 (2017). https://doi.org/10.1002/jbio.201700030
42. L. G. Lee, C.-H. Chen, and L. A. Chiu, “Thiazole orange: a new dye for reticulocyte analysis,” Cytometry, 7, 508–517 (1986). https://doi.org/10.1002/(ISSN)1097-0320
43. V. Wiesmann et al., “Review of free software tools for image analysis of fluorescence cell micrographs,” J. Microsc., 257, 39–53 (2015). https://doi.org/10.1111/jmi.12184
44. M. R. King, Principles of Cellular Engineering: Understanding the Biomolecular Interface, Elsevier Academic Press, Burlington (2006).
45. D. K. Singh et al., “Label-free, high-throughput holographic screening and enumeration of tumor cells in blood,” Lab Chip, 17, 2920–2932 (2017). https://doi.org/10.1039/C7LC00149E
46. A. Adan et al., “Flow cytometry: basic principles and applications,” Crit. Rev. Biotechnol., 37, 163–176 (2017). https://doi.org/10.3109/07388551.2015.1128876
47. D. A. Basiji et al., “Cellular image analysis and imaging by flow cytometry,” Clin. Lab. Med., 27, 653–670 (2007). https://doi.org/10.1016/j.cll.2007.05.008


CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Isaure de Kernier, Anaïs Ali-Cherif, Nelly Rongeat, Olivier Cioni, Sophie Morales, Julien Savatier, Serge Monneret, and Pierre Blandin "Large field-of-view phase and fluorescence mesoscope with microscopic resolution," Journal of Biomedical Optics 24(3), 036501 (9 March 2019). https://doi.org/10.1117/1.JBO.24.3.036501
Received: 14 December 2018; Accepted: 6 February 2019; Published: 9 March 2019