3 November 2014 | In vivo near-infrared fluorescence three-dimensional positioning system with binocular stereovision
J. Biomed. Opt. 19(11), 116002 (2014). doi:10.1117/1.JBO.19.11.116002
Fluorescence is a powerful tool for in vivo imaging in living animals. Traditional in vivo fluorescence imaging equipment is based on single-view two-dimensional imaging systems, which cannot meet the need for accurate positioning in modern scientific research. A near-infrared in vivo fluorescence imaging system is demonstrated that can detect signals from deep sources and compute three-dimensional positions. A three-dimensional coordinates computing (TDCP) method, including a preprocessing algorithm, is presented based on binocular stereovision theory to cope with the diffusive nature of light in tissue and the overlapping emission spectra of fluorescent labels. The algorithm is shown to be efficient at extracting targets from multispectral images and determining the spot center of biological interest. Further data analysis indicates that the TDCP method can be used for three-dimensional positioning of fluorescent targets in small animals. The study also suggests that the combination of a high-power laser and a deep-cooling charge-coupled device provides an attractive approach for fluorescence detection from deep sources. This work demonstrates the potential of binocular stereovision theory for three-dimensional positioning in living-animal in vivo imaging.
Song, Jin, Wang, Jin, and Mu: In vivo near-infrared fluorescence three-dimensional positioning system with binocular stereovision



The in vivo optical sensing technique has proved attractive,1–3 with the capability of performing quantitative and qualitative studies at the cellular or molecular level.4–6 Fluorescence is a versatile and useful tool for living-animal in vivo imaging.7 Fluorescence is the emission of light from an excitation source: it occurs when an orbital electron of a molecule, excited to a higher quantum state by some type of energy, relaxes to its ground state by emitting a photon.8 In vivo optical imaging has the advantages of being cost-effective and easy to operate.9 The photon absorption rate of most biological tissues is comparatively low in the near-infrared (NIR) spectral range (650–900 nm), so photons can be detected through living organs.10 So far, many NIR fluorescent probes (such as quantum dots) have been developed for in vivo imaging studies.11–15

Single-view two-dimensional (2-D) imaging systems are the mainstream of in vivo fluorescence imaging equipment.16–19 However, they cannot meet the need for accurate positioning during such studies. Three-dimensional (3-D) positioning can provide more accurate target location information,20 which is of great significance for biological and medical applications: researchers can observe the target's spatial position, analyze target metastasis, and determine the relative positional relationship between the target and the organs. In this paper, an NIR in vivo fluorescence imaging system is demonstrated, and a method for computing the 3-D coordinates of targets based on binocular stereovision theory is presented.

Generally, binocular stereo techniques can be divided into the following parts: image acquisition, camera calibration, feature extraction, image matching, and 3-D positioning.21,22 They have been applied in the fields of computer vision,23 photogrammetry,24,25 and experimental mechanics.26,27 In in vivo fluorescence imaging, the diffusive nature of light in tissue28,29 and the overlapping emission spectra of fluorescent labels pose great challenges for 3-D positioning using binocular stereovision theory. Thus, an algorithm is necessary to extract targets from the multispectral image (MSI) and determine the spot center of biological interest.

Hu et al.30 proposed a technique that combines binocular stereovision and fluorescent imaging for 3-D surface profilometry and deformation measurement, enabling noncontact, full-field measurements of biotissue and biomaterial at the microscale. In this paper, a similar technique is applied, and furthermore a novel system is developed for in vivo 3-D positioning. A rotary platform is used instead of two cameras, to reduce the size and cost of the system. The system can detect fluorescent signals from deep sources rather than only surface information. In addition, it can perform in vivo imaging for NIR applications.

The fluorescence sensing system comprises a high-intensity, narrow-bandwidth excitation source and highly sensitive photon detection. The relatively low quantum yield is the key limitation on the penetration and use of the in vivo optical sensing technique. In this paper, a high-power diode-pumped solid-state laser (671 nm, 2 W) with a narrow spectral bandwidth (10 nm) is used as the excitation source. A deep-cooling charge-coupled device (CCD) with large pixel size (minimum −100°C, pixel size 13 μm) is used to detect the weak fluorescent signals in the NIR spectral range [quantum efficiency (QE) >90% around the NIR]. This system is elaborately optimized and is capable of detecting weak fluorescent signals of deep targets with high sensitivity.

In this paper, an NIR in vivo fluorescence imaging system and a 3-D positioning algorithm based on binocular stereovision theory are presented. Experiments are designed to demonstrate the validity of the algorithm. The results and further data analysis show that this method can be used for 3-D positioning of targets in small animals. The system also shows the ability to detect fluorescent signals from deep sources.




Experimental Setup

The in vivo fluorescence sensing system consists of a darkroom, a high-energy diode-pumped solid-state laser, a liquid-core fiber that splits one beam of light into four, an emission filter wheel, a rotary platform module, a deep-cooling high-sensitivity CCD, and a computer on which the 3-D positioning software runs (see Fig. 1).

Fig. 1

A schematic of the in vivo near-infrared fluorescence three-dimensional positioning system.


Figure 1 illustrates the schematic and key components of the in vivo fluorescence sensing system. The darkroom provides a light-confined environment in which external light is shielded. The high-energy laser works as an excitation source to illuminate the small animal uniformly through the liquid-core fiber. A series of LEDs is used for bright-field illumination. The deep-cooling CCD, focusing lens, and emission filter wheel are used to acquire high-sensitivity multispectral image data in the NIR.

The block diagram of the in vivo fluorescence sensing system is shown in Fig. 2. The liquid-core fiber is coupled to the laser output. Its four outputs are located above the small animal to illuminate it uniformly from four directions. The axis of the optical imaging path intersects the center of the rotating platform at a 45-deg angle. With the assistance of the anesthesia apparatus, the small animal remains very stable while rotating on the accurate rotary platform. The high-sensitivity deep-cooling CCD greatly reduces dark-current noise and, with its large pixel size and high QE around 650–900 nm, is well suited to acquiring very weak fluorescent signals in the NIR.

Fig. 2

Block diagram of the system.


The excitation source is a high-power diode-pumped solid-state laser (MRL-N, 2 W, New Industries Optoelectronics Technology Co., Ltd., Changchun, China) emitting at 671 nm with a narrow spectral bandwidth (10 nm). It is used instead of a xenon lamp because a xenon lamp requires a long warm-up time and shows 10%–50% changes of light intensity under working conditions,31 whereas the designed system needs uniform excitation light for small animals and a solid-state laser source shows only a 0.1% light-intensity change.32 The laser beam is transmitted via a liquid-core fiber (numerical aperture: 0.52). Four outputs of the liquid-core fiber are located above the small animal to illuminate it uniformly from four directions. Liquid-core optical fiber has a high numerical aperture,33 an important parameter because a large numerical aperture gives high coupling efficiency.34,35 Liquid-core optical fiber also has good flexibility and reliability.36

The optical detection unit consists of a deep-cooling CCD, a lens, and a filter wheel. The fluorescence light passes through the lens and filters sequentially and is detected by the CCD. The central element of the detection unit is a 1024×1024 deep-cooling 16-bit CCD camera (DU-934NBRD, Andor Technology, Belfast, UK). The camera is designed for NIR applications37 and has very high QE in the NIR (>90%). QE is a measure of a device's electrical sensitivity to light at each photon energy level;38 high QE means more incident photons are converted to electrons.39 This camera can therefore detect extremely weak NIR fluorescence signals. With deep cooling, the readout noise is extremely low [2.5 e− (50 kHz)]. Dark current is one of the main noise sources in image sensors such as CCDs; it is greatly reduced by deep cooling [0.008 e−/pixel/s (−100°C)].40 In the system, the filter is placed between the lens and the CCD. A customized focusing lens is fabricated to meet the demands, with adjustable focal length and field of view. The filter wheel is fully enclosed and light-tight. It holds a total of 12 filters: 11 band-pass filters and an all-pass piece (K9 glass). The K9 glass is used for bright-field illumination. The center wavelengths of the filters are distributed evenly over the NIR window. The filters were purchased from China-Quantum Co., Ltd., Changchun, China. The full width at half maximum of the filters is around 10 nm (9.2 nm minimum to 11.6 nm maximum). The filters currently in use are 681, 702, 711, 759, 780, 794, 808, 850, 880, 903, and 938 nm. The CCD is connected to a common personal computer via USB 2.0, and image acquisition is done with the Andor SOLIS 4.7.3 software provided by the CCD manufacturer.

The darkroom provides a light-tight experimental environment and fixes the components of the imaging system. The inner surface is coated with black extinction paint with a reflectivity of <0.5%. The possibility of light leakage from outside via the fiber, anesthetic-gas transmission pipes, wires, etc., is greatly reduced by a double-sealed design. When capturing images with this system, the first step is to turn on the LEDs, adjust the field of view, and calibrate the CCD camera. The fluorescent-labeled small animal is placed on the rotary platform. Under LED bright-field illumination, an image of the small animal is captured through the all-pass filter. After that, the LEDs are turned off and the laser is turned on; the filters and the rotary platform are switched in sequence to obtain multispectral fluorescence images from different perspectives. The images are subsequently processed and analyzed with customized software running on the computer.


Target Extraction Based on Multispectral Imaging

In NIR in vivo fluorescence imaging, the target is captured as a large light spot. Therefore, the center of the target spot needs to be determined for 3-D positioning.


Multispectral imaging

By switching between filters, spectroscopic information is obtained in image form.41 At each wavelength, the emission of every pixel in the field of view is recorded. The MSI provides a "data cube" of spectral information of the entire image at each wavelength of interest.42 It can acquire spatial and spectral information of the entire small animal in the NIR range.43 By comparing the spectral differences between different fluorescent probes and background signals, target spots can be extracted. Besides 2-D information, MSI also provides the spectral dimension as a z-axis.44 As the emission information at each specific wavelength is plotted in one coordinate space, a multispectral data cube is formed. In the data cube, f(x,y,z) represents the gray value of the pixel at coordinates (x,y,z). The normalized value of the pixel's grayscale is

f_norm(x,y,z) = [f(x,y,z) − fmin(x,y,z)] / [fmax(x,y,z) − fmin(x,y,z)].  (1)
The spectral curve is reconstructed by plotting f(x,y,z) at each wavelength. Each image contains both spatial (x, y coordinate data) and spectral (z-intensity data) information. fmin(x,y,z) and fmax(x,y,z) represent the minimum and maximum of the sample data, respectively. The spectral curve is the most intuitive way of expressing spectral features, and the cubic spline interpolation method is introduced to reconstruct the curve.
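The normalization and spline reconstruction above can be sketched in a few lines of Python. The data cube here is synthetic (random values at the 11 filter wavelengths) and the array sizes are illustrative assumptions, not the system's actual data:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic multispectral data cube: one 64x64 image per filter wavelength
# (the real system records 1024x1024 frames; sizes here are illustrative)
wavelengths = np.array([681, 702, 711, 759, 780, 794, 808, 850, 880, 903, 938])
rng = np.random.default_rng(0)
cube = rng.random((len(wavelengths), 64, 64))

# Min-max normalization of the gray values over the sample data
f_min, f_max = cube.min(), cube.max()
cube_norm = (cube - f_min) / (f_max - f_min)

# Spectral curve of one pixel, reconstructed by cubic spline interpolation
spectrum = cube_norm[:, 32, 32]
spline = CubicSpline(wavelengths, spectrum)
dense_wl = np.linspace(681, 938, 200)
curve = spline(dense_wl)
```

The interpolating spline passes exactly through the measured band values, so the reconstructed curve agrees with the data at every filter wavelength.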


Multispectral unmixing

When capturing in vivo fluorescent images, sensitivity is restricted by autofluorescence from skin and organs. This limitation makes it difficult to locate the fluorophores of interest accurately. At the same time, it is often necessary to monitor a variety of biological processes simultaneously, using multiple fluorescent probes to label different molecules. Multispectral unmixing can remove autofluorescence and separate multiple fluorophores of interest. The unmixing algorithm assumes that the measured fluorescence spectrum is a superposition of several pure spectra, each multiplied by a weighting factor determined by the local concentration, excitation efficiency, and relative luminance of the emission fluorescence. The linear model is

M = SA + R,  (2)
where M is the measured multispectral data matrix for each pixel (m columns: m pixels; n rows: n wavelengths), S is the pure-spectra matrix of each fluorophore of interest and autofluorescence (k columns: k fluorophores; n rows: n wavelengths), A is the weighting-factor matrix (m columns: m pixels; k rows: k fluorophores), and R is the error matrix. To solve this linear equation, the following conditions must be satisfied: the number of spectral detection channels must be greater than the number of fluorophores in the sample; autofluorescence must also be counted as a fluorophore; and the pure spectra of the fluorophores must be obtained beforehand. The equation is then solved by the least squares method:

A = (S^T S)^(-1) S^T M.  (3)
The pure spectra of fluorophores can be obtained from the spectral library directly or extracted from multispectral fluorescence images.
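The least squares unmixing of M = SA + R can be illustrated with NumPy on synthetic spectra. The matrix sizes below (11 wavelengths, 3 fluorophores including autofluorescence, 5 pixels) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pure-spectra matrix S: n wavelengths x k fluorophores,
# with autofluorescence counted as one of the k columns.
# Note n_wl > k, as required for the least squares solve.
n_wl, k, m = 11, 3, 5
S = rng.random((n_wl, k))

# Ground-truth weighting factors A (k x m pixels), used here to
# synthesize the measured data M = S·A + R with a small error term R
A_true = rng.random((k, m))
R = 1e-4 * rng.standard_normal((n_wl, m))
M = S @ A_true + R

# Least squares estimate of the weighting factors from the measurements
A_est, residuals, rank, sv = np.linalg.lstsq(S, M, rcond=None)
```

Because the measurement error is small, the estimated weights closely match the ground truth; in practice S comes from a spectral library or is extracted from the multispectral images, as described above.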


Target spot center determination

The mathematical model of the fluorescent spot energy distribution is

E(x,y) = Emax exp{−[(x − xc)^2 / ax^2 + (y − yc)^2 / ay^2]},  (4)
where Emax is the maximum light intensity, (xc,yc) are the coordinates of the fluorescent spot energy center, and ax, ay are the major and minor axes of the spot intensity distribution. The image grayscale distribution function of the light spot on the camera imaging plane is

g(u,v) = Emax exp{−[(u − h1·P/h3·P)^2 / au^2 + (v − h2·P/h3·P)^2 / av^2]},  (5)

where h1, h2, and h3 are the row vectors of H, H = MR; M is the internal parameter matrix of the camera; R is the rotation matrix; P is the homogeneous coordinate vector of the spot center; and au, av are the projected spot axes. The pixel on the imaging plane corresponding to the energy center of the spot is the maximum point of the grayscale distribution function.

For a gray image I(i,j), the gray weighted center (x0,y0) is

x0 = Σi Σj [i · W(i,j)] / Σi Σj W(i,j),  y0 = Σi Σj [j · W(i,j)] / Σi Σj W(i,j),  (6)

where W(i,j) are the weights; in general, W(i,j) = I(i,j). This gray gravity method is chosen to process the fluorescence spot after multispectral unmixing.
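The gray weighted center computation can be checked on a synthetic Gaussian spot; the spot parameters below are invented for illustration:

```python
import numpy as np

# Synthetic Gaussian spot on a 64x64 gray image; the true center is
# deliberately placed off-grid (illustrative values, not measured data)
yy, xx = np.mgrid[0:64, 0:64]
xc, yc = 30.4, 25.7
I = np.exp(-((xx - xc) ** 2 / 50.0 + (yy - yc) ** 2 / 80.0))

# Gray gravity method: intensity-weighted centroid with W(i, j) = I(i, j)
W = I
x0 = (xx * W).sum() / W.sum()
y0 = (yy * W).sum() / W.sum()
```

The recovered centroid lands within a small fraction of a pixel of the true center, which is why the gray gravity method gives subpixel spot coordinates for the stereo computation.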


3-D Positioning Based on Binocular Stereovision Theory

A vision measurement system calculates the 3-D position of a target from the image information acquired by the camera. This paper presents a method to obtain the 3-D coordinates of fluorescence targets based on binocular stereovision theory.


Camera calibration

The correspondence between the 3-D spatial position of a point and its 2-D image position is determined by the geometry of the camera; the parameters of this geometric model are called the camera parameters. Camera calibration is the experimental process of determining the internal and external parameters of the camera. In this paper, OpenCV functions45 are used to calculate the internal parameter matrix M based on Zhang's method.46 In a stereovision system, it is also necessary to obtain the relative position of the two cameras (or of a single camera at different perspectives). The distortion parameters D are solved by Brown's method.47


Binocular stereovision

The basic principle of stereovision is to observe the same target from two (or more) points of view, acquiring images from different perspectives; the 3-D information about the target is then solved via the triangulation principle. In this study, the small animal is placed on the rotary platform, and the fixed CCD camera takes photos at different rotation angles. Since only the relative motion between the camera and the platform matters, the process can equivalently be viewed as the CCD camera rotating around a fixed small animal. Thus, an equivalent convergent stereoscopic model is obtained, which can be converted to a parallel stereoscopic model for further computation.48 The rotary platform module, the equivalent convergent stereoscopic model, and the parallel binocular stereovision model are shown in Fig. 3.

Fig. 3

Imaging module using the rotary platform (a), the equivalent convergent stereoscopic model (b), and the parallel binocular stereovision model (c).



3-D positioning

The epipolar constraint plays an important role in stereo matching: it constrains the matching process and greatly improves its efficiency. For the rectified images, the calculated coordinates of the target spot center should have the same v coordinate. After solving for the reprojection matrix Q, the 3-D coordinates of a target can be calculated from its 2-D coordinates (u,v) on the imaging plane and the associated disparity d:

[X, Y, Z, W]^T = Q · [u, v, d, 1]^T.  (7)

The 3-D coordinates of the target are (X/W, Y/W, Z/W) in the left camera coordinate system. When the center coordinates of the target spot in the left camera image are (u1,v1) and those in the right camera image are (u2,v2), the disparity is d = u2 − u1.
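Reprojection through Q can be sketched with made-up rectified-rig parameters. The focal length, baseline, and principal point below are assumptions, and the sign conventions of Q depend on the rectification:

```python
import numpy as np

# Hypothetical rectified rig: focal length f (pixels), baseline T (mm),
# principal point (cx, cy). With the disparity convention d = u2 - u1
# used in the text, a reprojection matrix of this form maps (u, v, d, 1)
# to homogeneous 3-D coordinates.
f, T, cx, cy = 800.0, 40.0, 512.0, 512.0
Q = np.array([
    [1.0, 0.0, 0.0, -cx],
    [0.0, 1.0, 0.0, -cy],
    [0.0, 0.0, 0.0, f],
    [0.0, 0.0, 1.0 / T, 0.0],
])

# Spot centers: same v in both rectified images (epipolar constraint)
u1, v1 = 612.0, 500.0
u2 = 652.0
d = u2 - u1

X, Y, Z, W = Q @ np.array([u1, v1, d, 1.0])
point = np.array([X / W, Y / W, Z / W])  # 3-D position, left camera frame
```

For these numbers the depth comes out as Z/W = f·T/d, the familiar triangulation relation for a parallel stereo pair.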


Experimental Results


Camera Calibration

A calibration checkerboard of suitable size is printed according to the field of view. Under bright-field illumination, the checkerboard is placed on the rotary platform at different angles and positions. After fixing it, a set of images is captured at rotation angles of 0 and 4 deg, ensuring that there is no relative movement between the checkerboard and the rotary platform. OpenCV functions are used to calculate the internal parameter matrix M and the distortion parameters D, and to obtain the rotation matrix R and the translation vector T. The reprojection matrix Q needed for 3-D location is also calculated.


Target Extraction

To test the performance of the target extraction algorithm based on multispectral imaging, three different quantum dots with emission peaks at 720, 770, and 840 nm are chosen. The quantum dots are injected hypodermically into mice at three different positions, all around the dorsum. The fluorescent images are captured by the designed in vivo fluorescent sensing system. The image data are shown in Fig. 4, processed with the methods described above for fluorescent signal unmixing and extraction; the result is shown in Fig. 5, with a pseudocolor imposed for each quantum dot for demonstration. The method thus picks out target signals with different spectral properties well. The gray gravity method is used to calculate the center of the fluorescence spot after multispectral unmixing; the result is shown in Fig. 6.

Fig. 4

Image data of three different quantum dots with three different emission spectra in mice. Each image is captured in a different spectral band: 711 nm (a), 759 nm (b), 780 nm (c), 794 nm (d), 850 nm (e), 880 nm (f), 904 nm (g), and 938 nm (h).


Fig. 5

The results of target extraction. Three different targets are shown in (a), (b), (c), respectively. Pseudo-color is imposed for result demonstration in (d).


Fig. 6

The center coordinates of the fluorescence target spot are calculated after multispectral unmixing using the gray gravity method. The result is shown in (b).



Quantitative Analyses Using Tissue Equivalent Material

To simulate the light absorption and scattering of small-animal tissue, an equivalent material with similar optical properties is selected for a simulation experiment. We select agar gel, prepared as follows. First, 2.4 g agarose is added to 110 ml phosphate-buffered saline and stirred constantly while heating until a clear, bubble-free solution is obtained. The solution is cooled to 70°C, the beaker is shaken gently, and 10.5 ml of 20% fat emulsion is added. The solution is then poured out, cooled, and solidified. The solidified agar gel is cut to the shape shown in Fig. 7(a). A shallow hole is dug in the arc surface for quantum dot injection. Agar-gel slices with thicknesses of 2, 4, and 6 mm are prepared; the slices are used to simulate targets at certain depths by covering the arc surface [see Fig. 7(b)].

Fig. 7

The shape of the tissue-equivalent material for quantitative analyses (a). The slices covering the arc surface simulate targets at certain depths (b).


About 3 μl of quantum dot (QD) solution is injected into the shallow hole on the arc surface. The uncovered agar gel is placed in the system, and multiperspective multispectral images are taken. The 2-, 4-, and 6-mm slices are then placed over the arc surface in turn, and multiperspective multispectral imaging is repeated for each. The results are processed by multispectral unmixing and target spot center determination using the methods described above. Without covering, the spot center coordinates calculated from the two perspectives are (212.8184, 236.4625) and (291.7266, 236.4625). With the 2-, 4-, and 6-mm slices covering the arc surface, the coordinates are (212.5323, 236.7462) and (291.4989, 236.7462); (212.2375, 237.0362) and (291.4989, 237.0362); and (211.6789, 237.5191) and (290.9237, 237.5191), respectively. Using the reprojection matrix Q on these four groups of data, the 3-D coordinates in the left camera coordinate system are (46.83, 13.15, 240.10); (46.83, 13.10, 239.92); (46.82, 13.03, 239.62); and (46.80, 12.93, 239.17), respectively. As can be seen, the accuracy decreases only very slightly as the slice thickness increases. This experiment simulates an in vivo fluorescent target at different depths.


3-D Positioning

After obtaining the 2-D center coordinates of the target spot in both the left and right camera images, its 3-D coordinates in the left camera coordinate system can be obtained from Eq. (7) and the calibration results. To obtain a more intuitive position, a standard checkerboard is placed on the rotary platform to create a reference coordinate system. By selecting feature points, the relationship between the reference coordinate system and the left camera coordinate system is determined, and the target coordinates are converted from the left camera system to the reference system. The 3-D coordinates of the three targets shown in Fig. 4 in the reference coordinate system are (38.3, 48.2, 21.6), (53.5, 65.6, 28.5), and (62.5, 82.7, 26.2), respectively.

To test the accuracy of the system, a rectangular plate with no fluorescence is selected, and four QD solution drops (about 2 μl each) are placed on the four corners of the plate at accurately known distances. The actual distances between the drops are 40 and 80 mm. Using the system for imaging and calculation, the 3-D coordinates of the four drops in the reference coordinate system are (43.3, 49.9, 8.6), (43.5, 73.2, 8.8), (54.9, 49.6, 8.8), and (55.1, 72.9, 8.9). The calculated distances between the drops are 79.86, 39.78, 80.12, and 39.83 mm; this experiment indicates that the system achieves high accuracy.


Performance of Detecting Depth

We tested the deep-target detection capability of the designed in vivo fluorescent sensing system by comparing its results with those of a commercial instrument (Maestro™, CRi, USA). We tested the penetration of Au:CdHgTe quantum dots in muscle and adipose tissue. Au:CdHgTe quantum dots are NIR gold-doped CdHgTe QDs with higher photoluminescence and lower cytotoxicity. In the middle of a 96-well plate, 50 μl of Au:CdHgTe quantum dots (emission peak at 840 nm) was added and covered by the tissue. The results show that the penetration depth is improved in both muscle and adipose tissue by the designed system. Figure 8 shows the penetration of Au:CdHgTe quantum dots in muscle tissue: the images in row (a) are captured with the Maestro™ CRi instrument, and row (b) shows images obtained with our system. The muscle tissue above the quantum dots is 0 mm (I), 20 mm (II), 25 mm (III), and 30 mm (IV) thick, respectively. Figure 9 shows a similar experiment testing penetration in adipose tissue, where the adipose tissue above the quantum dots is 30 mm (I) and 35 mm (II) thick. As can be seen, the difference between the two systems is insignificant when the target is shallow; however, as the covering tissue thickens, the proposed system shows better imaging results. The experiment shows that, for deep targets, the penetration depth of the proposed system is increased by 20% in muscle tissue and 16% in adipose tissue compared with the commercial instrument.

Fig. 8

The penetration of Au:CdHgTe quantum dots in muscle tissue. The images in row (a) are captured with the Maestro™ CRi instrument; row (b) shows images obtained with our system. The muscle tissue above the quantum dots is 0 mm (I), 20 mm (II), 25 mm (III), and 30 mm (IV) thick, respectively.


Fig. 9

The penetration of Au:CdHgTe quantum dots in adipose tissue. The adipose tissue is 30 mm thick in column (I) and 35 mm thick in column (II). Row (a) shows photographs captured with the Maestro™ CRi instrument; the images in row (b) are obtained with our designed imaging system.




In this paper, the development of an NIR in vivo fluorescence imaging system is presented, and an algorithm for computing 3-D coordinates based on binocular stereovision theory is demonstrated. The system includes a deep-cooling CCD, a liquid-core fiber, a high-energy laser source, a rotary platform, and an anesthesia apparatus, realizing a dependable system that detects fluorescence signals from deep sources with high sensitivity. The system provides more accurate target locations by calculating 3-D coordinates. The algorithm comprises image preprocessing, multispectral unmixing, target spot center calculation, camera calibration, stereo calibration, and 3-D positioning. Its validity has been verified by the experiments on the mouse. The designed 3-D positioning will provide more accurate target location information to help researchers analyze target metastasis and determine the relative positional relationship between the target and the organs.

Experimental results on the mouse and on pork meat with NIR quantum dots demonstrated the potential of the designed imaging system for in vivo optical diagnostics from deep sources and accurate position calculation. It is expected that this NIR in vivo fluorescence imaging system will show its advantages in tumor-marker detection, drug tracking, pharmacokinetic studies, and the assessment of tissue response to therapy.


This work was supported by the National Natural Science Foundation of China (31270907, 61106071, 21275129), the National Key Foundation for Exploring Scientific Instruments (2013YQ470781), and the State Key Laboratory of Industrial Control Technology, Zhejiang University, China.



R. Atreyaet al., “In vivo imaging using fluorescent antibodies to tumor necrosis factor predicts therapeutic response in Crohn’s disease,” Nat. Med. 20(3), 313–318 (2014).1078-8956http://dx.doi.org/10.1038/nm.3462Google Scholar


M. Autieroet al., “In vivo tumor detection in small animals by hematoporphyrin-mediated fluorescence imaging,” Photomed. Laser Surg. 28, S97–S103 (2010).PLDHA81549-5418http://dx.doi.org/10.1089/pho.2009.2567Google Scholar


J. H. Ryuet al., “In vivo fluorescence imaging for cancer diagnosis using receptor-targeted epidermal growth factor-based nanoprobe,” Biomaterials 34(36), 9149–9159 (2013).BIMADU0142-9612http://dx.doi.org/10.1016/j.biomaterials.2013.08.026Google Scholar


R. V. PapineniG. FekeS. Orton, “In vivo fluorescence imaging of internally illuminated molecular probes,” Faseb J. 25, 1126.2 (2011).FAJOEC0892-6638Google Scholar


A. F. Steinet al., “Intravascular near infrared fluorescence molecular imaging identifies macrophages in vivo in thrombosis-prone plaques,” Circulation 128(22), A16926 (2013).CIRCAZ0009-7322Google Scholar


P. Chtcheprovet al., “A high-resolution in vivo molecular imaging technique based on x-ray fluorescence,” Med. Phys. 39(6), 3620–3620 (2012).MPHYA60094-2405http://dx.doi.org/10.1118/1.4734698Google Scholar


Y. Ardeshirpouret al., “Using in-vivo fluorescence imaging in personalized cancer diagnostics and therapy, an image and treat paradigm,” Technol. Cancer Res. Treat. 10(6), 549–560 (2011).1533-0346http://dx.doi.org/10.1177/153303461101000605Google Scholar


S. Andersson Engelset al., “In vivo fluorescence imaging for tissue diagnostics,” Phys. Med. Biol. 42(5), 815–824 (1997).PHMBA70031-9155http://dx.doi.org/10.1088/0031-9155/42/5/006Google Scholar


A. Rehemtullaet al., “Rapid and quantitative assessment of cancer treatment response using in vivo bioluminescence imaging,” Neoplasia 2(6), 491–495 (2000).1522-8002http://dx.doi.org/10.1038/sj.neo.7900121Google Scholar


V. V. Tuchin, “Tissue optics: tomography and topography,” Proc. SPIE 3726, 168–198 (1999).PSISDG0277-786Xhttp://dx.doi.org/10.1117/12.341389Google Scholar


J. Nappet al., “Targeted luminescent near-infrared polymer-nanoprobes for in vivo imaging of tumor hypoxia,” Anal. Chem. 83(23), 9039–9046 (2011).ANCHAM0003-2700http://dx.doi.org/10.1021/ac201870bGoogle Scholar


Y.-P. Guet al., “Ultrasmall near-infrared Ag2Se quantum dots with tunable fluorescence for in vivo imaging,” J. Am. Chem. Soc. 134(1), 79–82 (2012).JACSAT0002-7863http://dx.doi.org/10.1021/ja2089553Google Scholar


M. A. Calfonet al., “In vivo near infrared fluorescence (NIRF) intravascular molecular imaging of inflammatory plaque, a multimodal approach to imaging of atherosclerosis,” J. Vis. Exp. 54, e2257 (2011).1940-087Xhttp://dx.doi.org/10.3791/2257Google Scholar


Y. Caoet al., “Near-infrared quantum-dot-based non-invasive in vivo imaging of squamous cell carcinoma U14,” Nanotechnology 21(47), 475104 (2010).NNOTER0957-4484http://dx.doi.org/10.1088/0957-4484/21/47/475104Google Scholar


S. H. Hanet al., “Au:CdHgTe quantum dots for in vivo tumor-targeted multispectral fluorescence imaging,” Anal. Bioanal. Chem. 403(5), 1343–1352 (2012).ABCNBP1618-2642http://dx.doi.org/10.1007/s00216-012-5921-yGoogle Scholar


J. Becet al., “Multispectral fluorescence lifetime imaging system for intravascular diagnostics with ultrasound guidance: in vivo validation in swine arteries,” J. Biophotonics 7(5), 281–285 (2014).JBOIBX1864-063Xhttp://dx.doi.org/10.1002/jbio.v7.5Google Scholar


M. Hassanet al., “Fluorescence lifetime imaging system for in vivo studies,” Mol. Imaging 6(4), 229–236 (2007).MIOMBP1535-3508Google Scholar


Y. Inoueet al., “In vivo fluorescence imaging of the reticuloendothelial system using quantum dots in combination with bioluminescent tumour monitoring,” Eur. J. Nucl. Med. Mol. I 34(12), 2048–2056 (2007).EJNMA61619-7070http://dx.doi.org/10.1007/s00259-007-0583-2Google Scholar


R. N. Razanskyet al., “Near-infrared fluorescence catheter system for two-dimensional intravascular imaging in vivo,” Opt. Express 18(11), 11372–11381 (2010).OPEXFF1094-4087http://dx.doi.org/10.1364/OE.18.011372Google Scholar


X. F. Zhang et al., "Development of a noncontact 3-D fluorescence tomography system for small animal in vivo imaging," Proc. SPIE 7191, 71910D (2009). http://dx.doi.org/10.1117/12.808199


X. L. Bao and M. G. Li, "Defocus and binocular vision based stereo particle pairing method for 3D particle tracking velocimetry," Opt. Laser Eng. 49(5), 623–631 (2011). http://dx.doi.org/10.1016/j.optlaseng.2011.01.015


M. Cao, G. M. Zhang and Y. M. Chen, "Stereo matching of light-spot image points in light pen in binocular stereo visual," Optik 125(3), 1366–1370 (2014). http://dx.doi.org/10.1016/j.ijleo.2013.08.029


H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," in Readings in Computer Vision: Issues, Problems, Principles, and Paradigms, M. A. Fischler and O. Firschein, Eds., pp. 61–62, Morgan Kaufmann, San Francisco (1987).


U. Weidner and W. Forstner, "Towards automatic building extraction from high-resolution digital elevation models," ISPRS J. Photogramm. 50(4), 38–49 (1995). http://dx.doi.org/10.1016/0924-2716(95)98236-S


Z. H. Zhang, "Digital photogrammetry and computer vision," Geomatics Sci. Wuhan Univ. 29(12), 1035–1105 (2004).


P. F. Luo et al., "Accurate measurement of 3-dimensional deformations in deformable and rigid bodies using computer vision," Exp. Mech. 33(2), 123–132 (1993). http://dx.doi.org/10.1007/BF02322488


Z. X. Hu et al., "Study of the performance of different subpixel image correlation methods in 3D digital image correlation," Appl. Opt. 49(21), 4044–4051 (2010). http://dx.doi.org/10.1364/AO.49.004044


M. Tateiba, "Theoretical study on wave propagation and scattering in random media and its application," IEICE Trans. Electron. E93-C(1), 3–8 (2010). http://dx.doi.org/10.1587/transele.E93.C.3


Y. A. Kravtsov, "New effects in wave propagation and scattering in random media," Appl. Opt. 32(15), 2681–2691 (1993). http://dx.doi.org/10.1364/AO.32.002681


Z. X. Hu et al., "Fluorescent stereo microscopy for 3D surface profilometry and deformation mapping," Opt. Express 21(10), 11808–11818 (2013). http://dx.doi.org/10.1364/OE.21.011808


X. Feas et al., "New near ultraviolet laser-induced native fluorescence detection coupled to HPLC to analyse residues of oxolinic acid and flumequine: a comparison with conventional xenon flash lamp," CyTA-J. Food 7(1), 15–21 (2009). http://dx.doi.org/10.1080/11358120902850552


R. Diamant et al., "High-brightness diode pump sources for solid-state and fiber laser pumping across 8xx-9xx nm range," Proc. SPIE 8039, 80390E (2011). http://dx.doi.org/10.1117/12.883831


L. Zhang et al., "Colloidal PbSe quantum dot-solution-filled liquid-core optical fiber for 1.55 μm telecommunication wavelengths," Nanotechnology 25(10), 105704 (2014). http://dx.doi.org/10.1088/0957-4484/25/10/105704


D. C. Chen, "Research of the measuring technique of numerical aperture of optical fiber using near infrared light," Microw. Opt. Technol. Lett. 50(3), 582–584 (2008). http://dx.doi.org/10.1002/(ISSN)1098-2760


D. L. Marks et al., "Study of an ultrahigh-numerical-aperture fiber continuum generation source for optical coherence tomography," Opt. Lett. 27(22), 2010–2012 (2002). http://dx.doi.org/10.1364/OL.27.002010


R. Yang, H. L. Ma and H. P. Jin, "Influencing factors of external fluorescence seeding enhancing stimulated Raman scattering in liquid-core optical fiber," J. Raman Spectrosc. 44(12), 1689–1692 (2013). http://dx.doi.org/10.1002/jrs.v44.12


P. C. Ashok, B. B. Praveen and K. Dholakia, "Near infrared spectroscopic analysis of single malt Scotch whisky on an optofluidic chip," Opt. Express 19(23), 22982–22992 (2011). http://dx.doi.org/10.1364/OE.19.022982


K. Sperlich and H. Stolz, "Quantum efficiency measurements of (EM)CCD cameras: high spectral resolution and temperature dependence," Meas. Sci. Technol. 25(1), 015502 (2014). http://dx.doi.org/10.1088/0957-0233/25/1/015502


S. Nikzad et al., "Delta-doped electron-multiplied CCD with absolute quantum efficiency over 50% in the near to far ultraviolet range for single photon counting applications," Appl. Opt. 51(3), 365–369 (2011). http://dx.doi.org/10.1364/AO.51.000365


Z. J. Cai et al., "Quantification and elimination of the CCD dark current in weak spectrum measurement by modulation and correlation method," Proc. SPIE 7850, 785005 (2010). http://dx.doi.org/10.1117/12.869369


J. Hewett et al., "The application of a compact multispectral imaging system with integrated excitation source to in vivo monitoring of fluorescence during topical photodynamic therapy of superficial skin cancers," Photochem. Photobiol. 73(3), 278–282 (2001). http://dx.doi.org/10.1562/0031-8655(2001)073<0278:TAOACM>2.0.CO;2


M. S. Kim, A. M. Lefcourt and Y. R. Chen, "Multispectral laser-induced fluorescence imaging system for large biological samples," Appl. Opt. 42(19), 3927–3934 (2003). http://dx.doi.org/10.1364/AO.42.003927


B. M. W. de Vries et al., "Multispectral near-infrared fluorescence molecular imaging of matrix metalloproteinases in a human carotid plaque using a matrix-degrading metalloproteinase-sensitive activatable fluorescent probe," Circulation 119(20), E534–E536 (2009). http://dx.doi.org/10.1161/CIRCULATIONAHA.108.821389


M. E. Martin et al., "Development of an advanced hyperspectral imaging (HSI) system with applications for cancer detection," Ann. Biomed. Eng. 34(6), 1061–1068 (2006). http://dx.doi.org/10.1007/s10439-006-9121-9


OpenCV dev team, "Camera Calibration and 3D Reconstruction," OpenCV documentation (21 April 2014).


Z. Y. Zhang, "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). http://dx.doi.org/10.1109/34.888718


D. C. Brown, "Close-range camera calibration," Photogramm. Eng. 37(8), 855–866 (1971).


T. Y. Young, Handbook of Pattern Recognition and Image Processing: Computer Vision, Vol. 2, Academic Press, Orlando, FL (1994).

Biographies of the authors are not available.

Bofan Song, Wei Jin, Ying Wang, Qinhan Jin, Ying Mu, "<italic>In vivo</italic> near-infrared fluorescence three-dimensional positioning system with binocular stereovision," Journal of Biomedical Optics 19(11), 116002 (3 November 2014). http://dx.doi.org/10.1117/1.JBO.19.11.116002


Keywords: in vivo imaging; 3D acquisition; imaging systems; 3D image processing; signal detection
