In this paper, we propose a see-through parallax-barrier-type multi-view display based on a transparent liquid crystal display (LCD). The transparency of the LCD is realized by detaching the backlight unit. The number of views in the proposed system is minimized to enlarge the aperture size of the parallax barrier, which determines the transparency. To compensate for the small number of viewpoints, an eye-tracking method is applied to provide a large number of views and vertical parallax. Through experiments, a prototype see-through autostereoscopic 3D display with a parallax barrier is implemented, and the system parameters of transmittance, crosstalk, and barrier structure perception are evaluated.
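The trade-off between view count and transparency can be sketched with the standard parallax-barrier design equations. The pixel pitch, viewing distance, and viewpoint spacing below are hypothetical example values, not the prototype's actual parameters.

```python
def barrier_design(p, d, e, n):
    """Standard parallax-barrier geometry (thin-barrier approximation).

    p: sub-pixel pitch of the LCD (mm)
    d: viewing distance from the barrier (mm)
    e: spacing between adjacent viewpoints (mm)
    n: number of views
    Returns (panel-to-barrier gap, barrier pitch) in mm.
    """
    g = p * d / e                  # similar triangles: g / d = p / e
    pitch = n * p * d / (d + g)    # barrier pitch, slightly under n * p
    return g, pitch

# Hypothetical example: 0.1 mm sub-pixels, 600 mm viewing distance,
# 65 mm viewpoint spacing, 4 views.
g, pb = barrier_design(0.1, 600.0, 65.0, 4)
```

With one transparent slit per group of n sub-pixels, the open fraction of the barrier scales roughly as 1/n, which is why minimizing the number of views raises the see-through transparency.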
Conventionally, the elemental lenses of the lens array used in integral imaging have spherical surface profiles, so they suffer from intrinsic lens aberrations such as spherical aberration and astigmatism. Aberrations degrade the ability of a lens to focus light to a single point, or to collimate light from a point source. In integral imaging, this results in a loss of image quality of the reconstructed image due to distortions. The viewing characteristics of the integral imaging system, such as viewing angle and image resolution, are also affected by aberrations. We propose the use of a custom-made aspherical lens array which was specifically designed to minimize distortions due to aberrations and hence improve the reconstructed image quality. Ray-optics calculations are used to analyze the aberrations and find the initial lens surface profile. Lens optimization is performed with the aid of numerical simulation software. The designed lens array is compared to a conventional spherical lens array with the same properties. The design, optimization, and fabrication processes are described, and the experiments are presented and compared with the computer simulations.
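As a minimal illustration of the ray-optics analysis, the sketch below traces a meridional ray through a single spherical refracting surface and compares its axis crossing with the paraxial focus. The radius and refractive index are arbitrary example values, not the designed lens array's parameters.

```python
import math

def axis_crossing(h, R, n1=1.0, n2=1.5):
    """Exact trace of a ray parallel to the axis at height h hitting a
    spherical refracting surface (radius R, vertex at z = 0); returns
    the z where the refracted ray crosses the optical axis."""
    z_hit = R - math.sqrt(R**2 - h**2)       # sag of the surface at height h
    t1 = math.asin(h / R)                    # angle of incidence
    t2 = math.asin(n1 * math.sin(t1) / n2)   # Snell's law
    return z_hit + h / math.tan(t1 - t2)     # refracted ray meets the axis

R, n2 = 50.0, 1.5
f_paraxial = n2 * R / (n2 - 1.0)             # paraxial focus: 150 mm here
z_marginal = axis_crossing(10.0, R)          # marginal ray focuses shorter
lsa = f_paraxial - z_marginal                # longitudinal spherical aberration
```

The marginal ray crosses the axis in front of the paraxial focus (positive longitudinal spherical aberration), which is exactly the kind of error an aspherical surface profile is optimized to cancel.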
To capture the three-dimensional (3D) information of microscopic (micro) objects, light field microscopy (LFM) has been studied. A lens array is inserted into a conventional microscope, and the 3D information of a micro object is captured in a single shot. However, since the lens array severely decreases the lateral resolution, integral floating microscopy (IFM) has been proposed. IFM is a modified version of LFM which favors lateral resolution over angular resolution by changing the locations of the specimen and the lens array. In IFM, the specimen should be located at the front focal plane of the objective lens and the lens array at its back focal plane; however, it is hard to place the lens array at the back focal plane because, in general, that plane lies inside the barrel of the objective lens. In this paper, we propose a modified version of integral floating microscopy which can place the lens array at the optimum position. The structure of the whole system is changed, and a relay lens is added to relay the back focal plane outside the barrel. By placing the lens array at the optimum position, the captured information can be maximized, and by changing the focal length of the relay lens, the field-of-view (FOV) mismatch problem can also be mitigated. The relationship between the captured information and the specifications of the system is analyzed, and experiments are presented for verification.
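A rough way to see the FOV trade-off is through the lateral magnification of the added relay: a 4-f relay built from lenses f1 and f2 images the back focal plane outside the barrel with magnification f2/f1, so features at the relayed plane and the usable FOV rescale together. The focal lengths and field size below are hypothetical, not the paper's system specifications.

```python
def relay_scaling(f1, f2, fov):
    """4-f relay of lenses f1 and f2: lateral magnification M = f2 / f1.
    A feature of size s at the objective's back focal plane appears with
    size M * s at the relayed plane, so for a fixed-size lens array the
    captured field scales by 1 / M. fov: field of view at the native plane."""
    M = f2 / f1
    return M, fov / M

# Hypothetical: f1 = 100 mm, f2 = 50 mm relay lenses, 2 mm native field.
M, fov_eff = relay_scaling(100.0, 50.0, 2.0)
```

Changing f2 rescales the relayed image, which is the mechanism by which the FOV mismatch can be mitigated.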
Introduction of adaptive optics technology into astronomy and ophthalmology has made great contributions in these fields, allowing one to recover images blurred by atmospheric turbulence or by aberrations of the eye. A similar adaptive optics improvement in microscopic imaging is also of interest to researchers using various techniques. Current adaptive optics technology typically contains three key elements: a wavefront sensor, a wavefront corrector, and a controller. These hardware elements tend to be bulky, expensive, and limited in resolution, involving, for example, lenslet arrays for sensing or multi-actuator deformable mirrors for correcting. We have previously introduced an alternate approach based on unique capabilities of digital holography, namely direct access to the phase profile of an optical field and the ability to manipulate that phase profile numerically. We have also demonstrated that direct access to and compensation of the phase profile are possible not only with conventional coherent digital holography, but also with a new type of digital holography using incoherent light: self-interference incoherent digital holography (SIDH). SIDH generates a complex (i.e., amplitude plus phase) hologram from one or several interferograms acquired with incoherent light, such as LEDs, lamps, sunlight, or fluorescence. The complex point spread function can be measured using guide star illumination, which allows deterministic deconvolution of the full-field image. We present an experimental demonstration of aberration compensation in holographic fluorescence microscopy using SIDH. Adaptive optics by SIDH provides new tools for improved cellular fluorescence microscopy through intact tissue layers or other types of aberrant media.
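The deterministic deconvolution step can be sketched as a standard Wiener filter applied with the measured PSF. The Gaussian PSF, point-grid object, and regularization constant below are stand-ins for the guide-star-measured complex PSF and real image data, not the paper's actual results.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-4):
    """Deconvolve a full-field image with a measured PSF (Wiener filter)."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)   # regularized inverse filter
    return np.real(np.fft.ifft2(F))

# Stand-in data: a point-grid "object" blurred by a centered Gaussian PSF.
obj = np.zeros((64, 64))
obj[::8, ::8] = 1.0
y, x = np.mgrid[-32:32, -32:32]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()
H = np.fft.fft2(np.fft.ifftshift(psf), s=obj.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(obj) * H))
restored = wiener_deconvolve(blurred, psf)
```

The regularization constant k limits noise amplification at frequencies where the PSF's spectrum is weak; with a measured complex PSF, the same filter applies directly to the complex hologram field.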
Fluorescence microscopy is an indispensable imaging tool in modern biomedical research. Holography is well known to have many interesting and useful imaging capabilities, but the requirement of coherent illumination has all but precluded holography as a means of fluorescence imaging, which is inherently incoherent. Recent developments in digital holography, however, including self-interference incoherent digital holography (SIDH), provide highly effective and versatile capabilities for 3D holographic imaging with incoherent light that can remove the barrier between fluorescence and holography. Recent progress in holographic fluorescence microscopy is presented.
We propose a one-shot dual-dimension microscope which captures 2D and 3D information simultaneously based on light field microscopy. By inserting a beam splitter into a relayed light field microscopy system, simultaneous capture of both 2D and 3D information is possible. Two digital cameras are synchronized and simultaneously capture the 2D and 3D information, respectively. We also discuss how to present the 2D and 3D information together efficiently, and how to enhance the 3D depth image quality using the high-resolution 2D image information.
We proposed a glasses-free randot stereotest using a multiview display system. We designed a four-view parallax barrier system and proposed the use of a random-dot multigram as a set of view images for the glasses-free randot stereotest. The glasses-free randot stereotest can be used to verify the effect of glasses in a stereopsis experience. Furthermore, the proposed system is convertible between two-view and four-view structures so that the motion parallax effect could be verified within the system. We discussed the design principles and the method used to generate images in detail and implemented a glasses-free randot stereotest system with a liquid crystal display panel and a customized parallax barrier. We also developed graphical user interfaces and a method for their calibration for practical usage. We performed experiments with five adult subjects with normal vision. The experimental results show that the proposed system provides a stereopsis experience to the subjects and is consistent with the glasses-type randot stereotest and the Frisby–Davis test. The implemented system is free from monocular cues and provides binocular disparity only. The crosstalk of the system is about 6.42% for four-view and 4.17% for two-view, the time required for one measurement is less than 20 s, and the minimum angular disparity that the system can provide is about 23 arc sec.
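The minimum angular disparity reported above follows from simple geometry: an on-screen disparity of size Δ, seen from distance d, subtends roughly Δ/d radians. The dot pitch and viewing distance in the sketch below are hypothetical values, not the prototype's calibration.

```python
import math

ARCSEC_PER_RAD = 180 * 3600 / math.pi   # ≈ 206264.8 arcsec per radian

def angular_disparity(delta_mm, distance_mm):
    """Angular disparity (arcsec) subtended by an on-screen disparity of
    delta_mm viewed from distance_mm."""
    return math.atan2(delta_mm, distance_mm) * ARCSEC_PER_RAD

# Hypothetical: 0.03 mm effective dot pitch at 600 mm viewing distance.
theta = angular_disparity(0.03, 600.0)   # ≈ 10.3 arcsec per pixel of disparity
```

The finest measurable step of the stereotest is set by the smallest disparity increment the panel can render at the chosen viewing distance.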
In this paper, we propose a glasses-free random-dot stereoacuity test using a multi-view display system with a liquid crystal display panel and a parallax barrier. We generate random-dot base images with different disparities, and the multi-view system presents the resulting random-dot stereotest images to the patient. The proposed method offers not only binocular disparity but also motion parallax. We implement a four-view parallax barrier system with a 5K liquid crystal display monitor and generate the random-dot base images for the system. For practical usage, we also develop a graphical user interface for the stereoacuity test which includes a pixel-level personal calibration function.
Our objective is to construct real-time pickup and display in an integral imaging system with a handheld light field camera. A micro lens array and a high-frame-rate charge-coupled device (CCD) are used to implement the handheld light field camera, and a simple lens array and a liquid crystal (LC) display panel are used to reconstruct three-dimensional (3D) images in real time. The handheld light field camera is implemented by adding the micro lens array on the CCD sensor, and a main lens mounted on the CCD sensor is used to capture the scene. To generate the elemental image in real time, a pixel mapping algorithm is applied. With this algorithm, not only can the pseudoscopic problem be solved, but the user can also change the depth plane of the displayed 3D images in real time. For real-time, high-quality 3D video generation, a high-resolution, high-frame-rate CCD and LC display panel are used in the proposed system. Experimental and simulation results are presented to verify the proposed system. As a result, 3D images are captured and reconstructed in real time through the integral imaging system.
We propose a real-time 3D capturing-and-visualization conversion method for light field microscopy. We implement a light field microscopy system using a conventional optical microscope and a micro lens array, and visualize the 3D information from light field microscopy with an integral imaging system. We analyze the light field capturing method and how to convert the 3D information from light field microscopy into elemental images with corrected depth information for integral imaging in real time. The experimental setup and result images are presented to verify the proposed method.
We propose a novel real-time pickup and display integral imaging system using only a lens array and a high-speed charge-coupled device (CCD). A simple lens array and a high-speed CCD capture the 3D information of the object, and a commercial liquid crystal (LC) display panel shows the elemental image in real time. The reconstructed image is real and orthographic, so the observer can touch the 3D image. Furthermore, our system is free from the pseudoscopic problem thanks to a recent pixel mapping algorithm. This algorithm, based on an image interweaving process, can also change the depth plane of the displayed 3D images in real time. C++ programming is used for real-time capturing, image processing, and display. For real-time, high-quality 3D video generation, a high-resolution, high-frame-rate CCD (AVT Prosilica GX2300C) and LC display panel (IBM 22-inch 3840×2400) are used in the proposed system. Simulations and experiments are presented to verify the proposed system. We expect that our research can serve as a basic technology for real-time 3D broadcasting and interactive 3D applications.
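The pixel mapping algorithm itself is not detailed in the abstract; a common interweaving-style remapping for removing the pseudoscopic problem is to rotate each elemental image by 180° about its own center, which the hypothetical sketch below implements (the full algorithm would additionally shift the displayed depth plane).

```python
import numpy as np

def po_convert(elemental, k):
    """Pseudoscopic-to-orthoscopic conversion: rotate each k×k elemental
    image by 180° about its own center (a standard pixel-remapping step)."""
    H, W = elemental.shape[:2]
    out = np.empty_like(elemental)
    for i in range(0, H, k):
        for j in range(0, W, k):
            out[i:i + k, j:j + k] = elemental[i:i + k, j:j + k][::-1, ::-1]
    return out

demo = np.arange(16).reshape(4, 4)   # a 2×2 grid of 2×2 elemental images
flipped = po_convert(demo, 2)
```

The operation is an involution: applying it twice returns the original elemental image array, which makes it cheap enough for per-frame real-time use.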
We propose a three-dimensional floating display which uses a concave cylindrical mirror (CCM), wedge prisms, and a digital micro-mirror device (DMD). Wedge prisms tilt the direction of the projected images by a specific angle from the incident direction. In our system, the wedge prisms are rotated to project images in all directions around the cylindrical mirror. Projected images from the DMD projector are reflected and distorted by the CCM simultaneously, so we generate inversely distorted images that cancel the distortion and display the original images. As the wedge prisms rotate, the tilt angle in the longitudinal plane of the CCM rotates, which means that images from the DMD can be projected in any horizontal direction and viewers can see 3D objects with horizontal parallax from any horizontal direction. A further explanation of the proposed system is provided, and experimental results are also presented.
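The inverse-distortion step can be sketched as a backward image warp: given a model fwd of how the mirror maps a display pixel to an output position, each display pixel samples the original image at the location the mirror will send it to, so the reflection reassembles the original. The distortion model here is a placeholder argument, not the CCM's actual mapping.

```python
import numpy as np

def predistort(img, fwd):
    """Build the pre-distorted image to display: pixel (x, y) shows the
    original-image sample at fwd(x, y), the position the mirror maps
    (x, y) to, so the reflected result approximates img."""
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            u, v = fwd(x, y)                 # mirror's forward mapping
            ui, vi = int(round(u)), int(round(v))
            if 0 <= ui < w and 0 <= vi < h:  # nearest-neighbour sampling
                out[y, x] = img[vi, ui]
    return out

demo = np.arange(9).reshape(3, 3)
same = predistort(demo, lambda x, y: (x, y))  # identity model: no change
```

In practice fwd would come from the CCM's reflection geometry; bilinear interpolation would replace the nearest-neighbour sampling for display-quality output.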
Three-dimensional (3D) display has attracted considerable attention in recent years because of developments in display technology. Various methods for realizing 3D display have been proposed; among them, multi-view display is a practical one to implement before more ambitious 3D displays mature. In a multi-view display system based on an autostereoscopic display, the views are split: the view images are projected to pre-defined positions from the same display device, so users located at the correct positions see the corresponding images. Although the multi-view display technique has been studied by many research groups, the fundamental importance of sound accompanying the display has so far received little attention and has not been examined in detail. The purpose of this paper is to realize a multi-view display system with directional sound, which allows each individual observer to experience directional sound in a multi-view display environment. An explanation and experimental results of the proposed system are provided.