The optical requirements of fully immersive head-mounted AR devices are inherently determined by the human visual system, whose etendue is large. As a consequence, the requirements for fully immersive head-mounted AR devices exceed those of almost any high-end optical system. Two promising solutions for achieving the large etendue, and their challenges, are discussed. Head-mounted augmented-reality devices have been developed for decades, mostly for applications in aircraft and in combination with a heavy and bulky helmet. Established head-up displays for automotive applications typically utilize similar techniques. More recently, the vision has emerged of eyeglasses with integrated augmentation that offer a large field of view and are unobtrusively wearable all day. There appears to be no simple solution that reaches these functional performance requirements: some known technical solution paths seem to be dead ends, while others offer promising perspectives, albeit with severe limitations. As an alternative, unobtrusively all-day-wearable devices with a significantly smaller field of view are already possible.
An extension of paraxial theory to systems with a single plane of symmetry is provided. This parabasal model is based on the evaluation of a differential region around the reference ray, which is defined by the center of the object and the center of the stop. To include freeform surfaces in this model, the local curvatures at the intersection point of the reference ray with the surface are evaluated. As an application, a generalized Scheimpflug principle is presented. The validity of the derived formulas is tested for highly tilted surfaces and found to be in good agreement with exact ray-tracing results. The analytical expressions are used to provide a first-order layout design of a planar imaging system.
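As a point of reference for the generalized version, the classical Scheimpflug condition (object plane, lens plane, and image plane intersect in a common line) can be checked numerically with thin-lens imaging. The focal length, object distance, and tilt below are arbitrary illustrative values, not parameters from the paper.

```python
# Numerical check of the classical Scheimpflug condition for a thin lens:
# the tilted object line, the lens plane, and the image line meet in one point
# (lines here, since we work in the meridional cross-section).

def image_point(z_o, y_o, f):
    """Thin-lens imaging of a point at z_o < 0 in front of the lens at z = 0."""
    s_o = -z_o
    s_i = 1.0 / (1.0 / f - 1.0 / s_o)   # 1/s_o + 1/s_i = 1/f
    m = -s_i / s_o                       # transverse magnification
    return s_i, m * y_o

f = 50.0    # focal length in mm (assumed)
s0 = 100.0  # axial object distance in mm (assumed)
k = 0.5     # slope of the tilted object line y = k * (z + s0) (assumed)

pts = [image_point(-s0 + t, k * t, f) for t in (-20.0, 0.0, 20.0)]

# All image points are collinear ...
(z1, y1), (z2, y2), (z3, y3) = pts
slope12 = (y2 - y1) / (z2 - z1)
slope23 = (y3 - y2) / (z3 - z2)
assert abs(slope12 - slope23) < 1e-9

# ... and the image line crosses the lens plane (z = 0) at the same height
# as the object line, i.e. all three planes share a common intersection line.
y_img_at_lens = y1 - slope12 * z1
y_obj_at_lens = k * s0
assert abs(y_img_at_lens - y_obj_at_lens) < 1e-9
```

The generalized principle of the paper replaces the thin lens by a tilted freeform system; this sketch only illustrates the planar special case that it reduces to.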
In this paper we present chromatic confocal distance sensors for parallelized evaluation at several lateral positions. The multi-point measurements are performed using either one- or two-dimensional detector arrays. The first sensor combines the concepts of confocal matrix sensing and snapshot hyperspectral imaging to image a two-dimensional array of laterally separated points in a single shot. In contrast to chromatic confocal matrix sensors that use an RGB detector, our system works independently of the spectral reflectivity of the surface under test and requires no object-specific calibration. Our discussion of this sensor principle is supported by experimental results. The second sensor is a multipoint line sensor aimed at high-speed applications with frame rates of several thousand frames per second. To reach this evaluation speed, a one-dimensional detector is employed. We use spectral multiplexing to transfer the information from different measurement points through a single fiber and evaluate the spectral distribution with a conventional spectrometer. The working principle of the second sensor type is demonstrated for the example of a three-point sensor.
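The wavelength-to-distance decoding underlying such chromatic confocal sensors can be sketched with an idealized diffractive lens, whose focal length scales inversely with wavelength. The design wavelength, focal length, and probed distance below are illustrative assumptions, not values from the paper.

```python
# Chromatic depth coding with an idealized diffractive lens:
# f(lam) = f0 * lam0 / lam. The wavelength that is focused exactly on the
# surface returns with the maximum confocal signal, so the detected peak
# wavelength encodes the surface distance.

lam0 = 550e-9  # design wavelength in m (assumed)
f0 = 20e-3     # focal length at lam0 in m (assumed)

def focal_length(lam):
    """Focal distance of the diffractive lens at wavelength lam."""
    return f0 * lam0 / lam

def wavelength_in_focus(z):
    """Invert f(lam) = z: the wavelength whose focus lies at distance z."""
    return lam0 * f0 / z

# A surface at z = 19 mm is probed; the peak wavelength decodes its position.
lam_peak = wavelength_in_focus(19e-3)
z_decoded = focal_length(lam_peak)
assert abs(z_decoded - 19e-3) < 1e-12
```

In the real sensor the confocal pinhole (or fiber) performs the spectral selection, and the spectrometer reads out the peak; the mapping itself is calibrated, not computed from this idealized formula.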
Dispersion causes the focal lengths of refractive and diffractive optical elements to vary with wavelength. In our contribution we show how this effect can be used for chromatic encoding and decoding of optical signals. We specifically discuss how these concepts can be applied to implement systems for the growing fields of hyperspectral imaging and chromatic distance coding. Refractive systems as well as hybrid combinations of diffractive and refractive elements are used to create specific chromatic aberrations in the sensors. Our design approach enables tailoring the sensor properties to the measurement problem and assists designers in finding optimized solutions for industrial applications. The focus of our research is on parallelized imaging systems that cover extended objects. Compared with point sensors, such systems promise reduced image-acquisition times and increased overall performance. Concepts for three-dimensional profilometry with chromatic confocal sensor systems as well as spectrally resolved imaging of object scenes are discussed.
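A minimal sketch of such chromatic tailoring, assuming a simple Cauchy glass model and an idealized diffractive element whose power grows linearly with wavelength (both the glass coefficients and the power values are illustrative, not from the paper):

```python
# Hybrid refractive/diffractive doublet: the diffractive power increases
# with wavelength, opposite to the behavior of a normal refractive lens,
# so the split of optical power between the two parts tunes the sign and
# slope of the chromatic focal shift.

def n_glass(lam_um, A=1.5046, B=0.00420):
    # simple Cauchy dispersion model, roughly BK7-like (assumed)
    return A + B / lam_um**2

lam0 = 0.55  # design wavelength in micrometres

def hybrid_focal(lam_um, phi_r0, phi_d0):
    """Focal length (mm) of a thin refractive/diffractive combination."""
    phi_r = phi_r0 * (n_glass(lam_um) - 1) / (n_glass(lam0) - 1)
    phi_d = phi_d0 * lam_um / lam0   # diffractive power scales with lambda
    return 1.0 / (phi_r + phi_d)

wavelengths = (0.45, 0.55, 0.65)
# purely refractive lens: focus moves away from the lens with wavelength
f_ref = [hybrid_focal(l, 0.02, 0.0) for l in wavelengths]
# purely diffractive lens of the same nominal power: shift reverses sign
f_dif = [hybrid_focal(l, 0.0, 0.02) for l in wavelengths]
assert f_ref[0] < f_ref[1] < f_ref[2]
assert f_dif[0] > f_dif[1] > f_dif[2]
```

Mixing the two powers between these extremes lets a designer enlarge, suppress, or invert the chromatic focal shift, which is the degree of freedom exploited for chromatic distance coding.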
A microlens suitable for integration with photonic elements on the same substrate is presented. It is fabricated using standard planar technologies such as UV lithography, ICP-CVD, and Deep Reactive Ion Etching. To achieve three-dimensional optical functionality with 2D structuring methods, the refractive index is varied in the vertical direction during the layer deposition process. In the horizontal direction, parallel to the substrate, the shape of the etched side walls determines the focus. This procedure allows independent control of light propagation in two perpendicular directions using planar technologies. To demonstrate the potential of the technology, optical elements for the collimation of fiber-based light sources are presented.
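The horizontal focusing can be estimated to first order by treating the etched, cylindrically curved side wall as a plano-convex cylinder lens. The refractive index and sidewall radius below are illustrative assumptions, not measured values from the paper.

```python
# First-order estimate of the in-plane focal length of an etched sidewall
# acting as a plano-convex cylinder lens: f = R / (n - 1).

n = 1.95    # refractive index of the deposited layer (assumed)
R = 50e-6   # sidewall radius of curvature, 50 micrometres (assumed)

f_horizontal = R / (n - 1.0)   # ~52.6 micrometres
assert f_horizontal > 0
```

The vertical direction is governed independently by the deposited index gradient, which is what makes the two-directional control described above possible.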