This PDF file contains the front matter associated with SPIE Proceedings Volume 11040, including the Title Page, Copyright information, Table of Contents, and Author and Conference Committee lists.
Despite great advances, the potential of augmented reality to fundamentally transform the way people use computers is partially hindered by the size and weight of AR headsets. In waveguide-based devices, the light engine constitutes a significant portion of the total volume and weight. In recent years, dielectric metasurfaces have been used to demonstrate various high-performance optical elements, such as blazed gratings and wide-field-of-view lenses, with small thicknesses, high efficiencies, and little stray light. Here, we report our work on the design of a compact light engine based on multi-metasurface optics with a wide field of view, integrated with three monochrome μ-LED displays for red, green, and blue. The metasurfaces image the μ-LEDs onto the prism or grating couplers. This design avoids an important shortcoming shared by μ-LEDs and metasurface lenses, namely that each works well only at a single wavelength. As an example, we present a design for 532 nm with over 3000 resolved angular points in an 8-mm-diameter field of view and a total volume of less than 0.65 cc (<2 cc for the three wavelengths). Limited by the total-internal-reflection region inside a waveguide with a refractive index of 1.78, the light engine can produce an image with over 1500×1500 points over a field of view slightly larger than 85°×85° in air. To the best of our knowledge, this is the first proposal and demonstration of such a system, and it therefore opens the path toward exploring the potential of metasurface diffractive optics for compact AR headsets with enhanced optical capabilities.
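As a back-of-envelope check of the quoted figures, the angular sampling and the waveguide's critical angle follow directly from the numbers in the abstract (a sketch; only the quoted values are from the paper):

```python
import math

n_wg = 1.78          # waveguide refractive index (from the abstract)
points = 1500        # resolved points per axis in air (from the abstract)
fov_air_deg = 85.0   # field of view per axis in air (from the abstract)

# Angular sampling implied by 1500 points over 85 degrees
res_deg = fov_air_deg / points
print(f"angular resolution: {res_deg:.4f} deg/point "
      f"({res_deg * 60:.2f} arcmin/point)")

# TIR bound: guided rays must exceed the critical angle inside the glass
theta_c = math.degrees(math.asin(1.0 / n_wg))
print(f"critical angle in n = {n_wg} waveguide: {theta_c:.1f} deg")
```

This yields roughly 3.4 arcmin per point, a few times coarser than 1-arcmin foveal acuity, and a critical angle near 34°, which bounds the usable propagation-angle range inside the guide.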
Conventional depth map sensors have a limited depth range, or they must sacrifice depth accuracy for a larger working range. To overcome these problems, we propose a co-axial depth map sensor with an extended depth range based on controlled aberrations. The sensor measures depth by projecting a near-infrared astigmatic pattern onto the test scene and measuring the contrast change of the reflected pattern image in the tangential and sagittal directions. By adding a tunable lens to the projection optics, the sensor achieves extended depth measurement without loss of depth accuracy or depth map resolution.
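A minimal sketch of the underlying cue, assuming depth is read out as a normalized difference between tangential and sagittal pattern contrast (the window size and gradient-based contrast measure here are illustrative choices, not the authors'):

```python
import numpy as np

def astigmatic_depth_cue(img: np.ndarray, win: int = 16) -> np.ndarray:
    """Per-window normalized tangential/sagittal contrast difference.

    An astigmatic projector focuses the pattern's vertical and horizontal
    detail at two different depths, so (Ct - Cs) / (Ct + Cs) varies
    monotonically with depth between the two focal planes.
    """
    img = img.astype(float)
    gx = np.abs(np.diff(img, axis=1))   # horizontal detail (sagittal)
    gy = np.abs(np.diff(img, axis=0))   # vertical detail (tangential)
    h, w = img.shape
    cue = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            ct = gy[i * win:(i + 1) * win, j * win:(j + 1) * win].mean()
            cs = gx[i * win:(i + 1) * win, j * win:(j + 1) * win].mean()
            cue[i, j] = (ct - cs) / (ct + cs + 1e-9)
    return cue  # map to metric depth with a per-device calibration curve
```

Sweeping the tunable lens shifts both astigmatic focal planes, which is what lets the same cue cover an extended depth range.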
We developed a novel retinal projection concept for augmented reality (AR) glasses that combines integrated optics and holography. Our thin, lens-free concept overcomes limitations of current AR devices such as bulky optics and a limited field of view. The integrated circuit is transparent and guides visible wavelengths using Si3N4 as the waveguide core material. This work presents a detailed description of the optical principles behind the concept, including the self-focusing effect. Furthermore, we present the design of the first building blocks of the optical integrated circuit at a visible wavelength (λ = 532 nm): single-mode waveguides, bent waveguides, cross-talk, grating couplers, and multimode interference (MMI) splitters. Numerical simulation results for each component are presented. A prototype combining these building blocks in a 1024-waveguide array is designed to provide a future experimental proof of concept of our retinal projection scheme. In addition to this prototype, test structures are included on a photolithography mask to experimentally validate the simulations of each optical building block in future work. The next steps of development will include densifying the integrated optical architecture using serial coupling effects and multiple waveguide layers.
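For context on the single-mode condition such building blocks must satisfy, here is a hedged slab-waveguide estimate at λ = 532 nm; the Si3N4 and SiO2 indices, and the slab approximation itself, are assumptions rather than values from the paper:

```python
import math

lam = 532e-9    # wavelength (from the abstract)
n_core = 2.0    # Si3N4 near 532 nm (assumed value)
n_clad = 1.46   # SiO2 cladding (assumed value)

# Symmetric slab model: with half-width a = w/2, the guide is single-mode
# when V = (2*pi/lam) * a * NA < pi/2, i.e. w < lam / (2 * NA).
na = math.sqrt(n_core**2 - n_clad**2)
w_max = lam / (2 * na)
print(f"NA = {na:.2f}, max single-mode slab width ~ {w_max * 1e9:.0f} nm")
```

The high index contrast implies core dimensions of only a couple hundred nanometers, consistent with the need to characterize bent waveguides, cross-talk, and couplers at these scales.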
We propose a retinal-projection-based near-eye display that achieves an ultra-large field of view, vision correction, and occlusion. Our solution is highlighted by a contact lens combo, a transparent organic light-emitting diode panel, and a twisted nematic liquid crystal panel. Its design rules are set forth in detail, followed by results and discussion regarding the field of view, angular resolution, modulation transfer function, contrast ratio, distortion, and simulated imaging.
This paper presents the Zerotrope, an improvement on the classic phenakistiscope and zoetrope devices that creates a new 360-degree 3D display through the addition of a single ultra-realistic full-color hologram. The Zerotrope is built from a single zero-degree transplane hologram mounted on a disc rotating at constant speed. This hologram displays a series of 3D characters, arranged radially around the center of the disc, showing the sequential phases of an animation. When a stroboscopic lamp synchronized with the rotation illuminates the hologram, the recorded characters are animated as in a stop-motion movie. The operation of the Zerotrope is successfully demonstrated and exhibits the effect of holographic reality (HR) without the need for special glasses or other viewing aids.
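The synchronization requirement behind the stop-motion effect is a one-line relation; here is a sketch with illustrative values (the frame count and rotation rate are not from the paper):

```python
def strobe_rate_hz(frames_on_disc: int, disc_rev_per_s: float) -> float:
    """Flash once per frame passage: each flash freezes the disc just as
    the next recorded character reaches the viewing position."""
    return frames_on_disc * disc_rev_per_s

# Example: 18 characters arranged radially, disc at 2 rev/s
# -> lamp must fire 36 times per second (values illustrative).
print(strobe_rate_hz(18, 2.0), "Hz")
```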
Head-up displays offer ease-of-use and safety advantages over traditional head-down displays when implemented in aircraft and vehicles. Unfortunately, with the traditional head-up display projection method, the size of the image is limited by the size of the projection optics. In many vehicular systems, the size requirements for a large-field-of-view head-up display exceed the space available for these projection optics. Thus, an alternative approach is needed to present a large-field-of-view image to the user. By using holographic optical elements affixed to waveguides, it becomes possible to reduce the size of the projection system while producing a comparatively large image. Additionally, modulating the diffraction efficiency of some of the holograms in the system presents an expanded viewing eyebox to the viewer. This presentation will discuss our work to demonstrate a magnified far-field image with in-line two-dimensional eyebox expansion. It will explore recording geometries and configurations and will conclude by discussing challenges for future implementation.
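One standard way to reason about the required efficiency modulation is the uniform-extraction rule for pupil replication; this is a textbook argument, not necessarily the authors' recording schedule:

```python
def uniform_outcoupling_efficiencies(n_bounces: int) -> list[float]:
    """For N successive interactions with out-coupling holograms, equal
    extracted power per bounce requires the k-th hologram (0-indexed)
    to diffract a fraction 1 / (N - k) of the light reaching it."""
    return [1.0 / (n_bounces - k) for k in range(n_bounces)]

# Five eyebox replications -> efficiencies 20%, 25%, 33%, 50%, 100%
print([f"{e:.2f}" for e in uniform_outcoupling_efficiencies(5)])
```

The last interaction must extract everything that remains, which is why the efficiency has to ramp up along the waveguide.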
An innovative optical element is proposed that can rapidly switch the optical power of a system among multiple foci. The switchable multifocal element consists of a custom-designed freeform lens offering multiple discrete foci and a programmable high-speed liquid crystal shutter (LCS). The freeform lens is divided into patterned zones through which multiple distinct foci are produced. The LCS consists of patterned zones corresponding to those of the freeform lens, which can be programmably switched on and off. By combining the multifocal freeform lens and the LCS in a time-multiplexed fashion, a switchable multifocal element with high speed, a large aperture, and a large range of tunable power was achieved. The element also meets the other requirements of an ideal tunable optical element, such as low-voltage control, robustness, and compactness. A proof-of-concept two-focal head-mounted display was designed to demonstrate one application of the new switchable multifocal element. The design provides a FOV of 40 degrees and an angular resolution of 1 arcminute in visual space within an 8 mm by 8 mm exit pupil.
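The time-multiplexing budget behind the "high-speed" requirement can be sketched as follows; the 60 Hz flicker-free target per focal plane is an assumption, since the abstract does not quote a refresh rate:

```python
foci = 2            # proof-of-concept two-focal HMD (from the abstract)
per_focus_hz = 60   # flicker-free target per focal plane (assumed)

# The LCS opens only the zones of the freeform lens that form the
# currently addressed focus, so it must settle within one time slot.
lcs_switch_hz = foci * per_focus_hz
print(f"LCS switching rate: {lcs_switch_hz} Hz "
      f"({1000 / lcs_switch_hz:.1f} ms per focal state)")
```

Scaling to more foci multiplies the required shutter rate accordingly, so the LCS speed sets the practical limit on the number of discrete foci.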
We present a concept for virtual reality (VR) headsets inspired by the design of the human eye itself. By using a rotatable display system that resembles a mechanical copy of the eye, we achieve high resolution at the foveal spot and lower resolution in the periphery while maintaining a large field of view. This solution enables fast and accurate retinal eye tracking by observing the blind spot on the fovea centralis. The vergence-accommodation conflict can potentially be solved by integrating an off-the-shelf tunable lens.
In this paper, the concept design of adding a 3D imaging system to commercially available see-through AR glasses is outlined. The 3D imaging is implemented by projecting a structured infrared (λ = 1550 nm) light pattern of dots onto the scene in front of the user. The light projector and detector are adjacent to each other on the device frame. The structured light is produced using a diffractive optical element. To equip this 3D imaging system with lateral sweeping without adding a complex rotating scanner, two right-angle prisms are used such that the chord face of each prism is parallel to the other. Given a certain gap between the prisms, the angular trajectory of the structured light pattern can be manipulated, enabling high-quality illumination of the scene in directions other than normal to the aperture of the illuminator. Computer algorithms can be used to calculate the position of each reflected dot given the field of view of the camera. The material of the prisms is a topic under investigation. While one of the prisms has a fixed position, the other is moved linearly away from it (in the z direction) by a linear actuator. This linear motion enables a variable gap between the two prisms and allows the scene to be scanned over a range of angles that depends on the prisms' material properties and the detector's field of view.
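Since the prism material is stated to be under investigation, a quick Snell's-law check shows how the refractive index sets the critical angle at the hypotenuse, and hence whether a 45° internal ray crosses the gap in the ray-optics limit; the candidate materials and their approximate indices at 1550 nm below are illustrative, not choices from the paper:

```python
import math

# Critical angle at the glass-air hypotenuse for candidate indices.
# In the ray-optics limit, a 45-degree internal ray crosses the gap
# only if 45 deg is below the critical angle.
for name, n in [("MgF2", 1.37), ("fused silica", 1.44), ("BK7", 1.50)]:
    theta_c = math.degrees(math.asin(1.0 / n))
    crosses = "transmits" if theta_c > 45.0 else "TIRs"
    print(f"{name:12s} n = {n:.2f}  critical angle = {theta_c:.1f} deg "
          f"-> 45 deg ray {crosses}")
```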
A quasi-super multi-view head-mounted display (quasi-SMV HMD) is proposed to simplify the optical system and the image processing required for SMV HMDs. The proposed technique uses a pair of LED arrays and a flat-panel display (FPD) operating at ~60 Hz. The LED arrays generate multiple viewpoints for each eye in a time-division multiplexing manner. Left and right images are displayed on the FPD, which is vibrated slightly in synchronization with the viewpoint generation. Thus, DOF-enhanced left and right images are displayed at variable depth positions. Preliminary experimental results are presented.
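The timing constraint behind the time-division viewpoint generation, as a sketch; the viewpoint count, and the assumption that viewpoints share a single panel frame rather than alternating across frames, are illustrative (the abstract quotes only the ~60 Hz panel):

```python
fpd_hz = 60      # flat-panel refresh quoted in the abstract
viewpoints = 4   # viewpoints per eye (illustrative, not from the paper)

# If each FPD frame is shared among the viewpoints, each LED position
# fires for only a fraction of the frame, in sync with the slight
# panel vibration that sets the displayed depth.
led_slot_ms = 1000 / (fpd_hz * viewpoints)
print(f"LED on-time budget per viewpoint: {led_slot_ms:.2f} ms")
```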
Commercial head-mounted displays currently support a limited number of accommodative states, which leads to the vergence-accommodation conflict. In this work, we present a new head-mounted display architecture supporting 15 focal planes over a wide depth of field (20 cm to optical infinity) in real time to alleviate the vergence-accommodation conflict. Our system employs a low-resolution vertical scanning backlight, a display panel (e.g., a liquid crystal panel), and a focus-tunable lens. We demonstrate a compact prototype and verify its performance through experimental results.
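A back-of-envelope check of the focal-plane spacing; uniform spacing in diopters is an assumption (the abstract gives only the range and the plane count), but it is the usual choice because the eye's depth of focus is roughly constant on a dioptric scale:

```python
near_m = 0.20   # nearest focal plane, 20 cm (from the abstract)
planes = 15     # number of focal planes (from the abstract)

# Optical infinity corresponds to 0 D; 20 cm corresponds to 5 D.
max_d = 1.0 / near_m
step_d = max_d / (planes - 1)
print(f"range 0..{max_d:.1f} D, adjacent-plane spacing {step_d:.2f} D")
```

The resulting ~0.36 D spacing is comparable to the eye's typical ±0.3 D depth of focus, consistent with 15 planes supporting a near-continuous accommodation response.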
We report a polarization-multiplexed additive light field display for near-eye applications. A polarization-sensitive Pancharatnam-Berry phase lens is implemented to generate two focal depths simultaneously. A spatial polarization modulator is then used to control the polarization state of each pixel and direct the two images to their designated focal planes. Based on this design, an additive light field display system is constructed. The vergence-accommodation conflict is suppressed successfully without increasing space or time complexity.
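The polarization routing can be summarized in a few lines; the ±P behavior of a Pancharatnam-Berry lens under the two circular polarizations is standard, while the power value below is illustrative:

```python
def pb_lens_power(base_power_d: float, handedness: str) -> float:
    """A Pancharatnam-Berry phase lens adds +P for one circular
    polarization and -P for the other, so a single element yields
    two focal depths simultaneously."""
    return base_power_d if handedness == "LCP" else -base_power_d

# The spatial polarization modulator sets each pixel's handedness,
# steering that pixel to the near or far focal plane (value illustrative).
for pol in ("LCP", "RCP"):
    print(pol, "->", pb_lens_power(0.5, pol), "D")
```

Because both focal planes exist at the same instant, no time multiplexing is needed, which matches the abstract's claim of suppressing the conflict without added time complexity.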