Segmented telescopes are a possible approach to enable large-aperture space telescopes for the direct imaging and spectroscopy of habitable worlds. However, the increased complexity of their aperture geometry, due to the central obstruction, support structures, and segment gaps, makes high-contrast imaging very challenging. The High-contrast imager for Complex Aperture Telescopes (HiCAT) testbed was designed to study and develop solutions for such telescope pupils using wavefront control and coronagraphic starlight suppression. The testbed design has the flexibility to enable studies of increasing complexity in telescope aperture geometry, starting with off-axis telescopes, then on-axis telescopes with a central obstruction and support structures (e.g., the Wide Field Infrared Survey Telescope, WFIRST), up to on-axis segmented telescopes, including various concepts for a Large UV, Optical, IR telescope (LUVOIR). In the past year, HiCAT has made significant hardware and software updates in order to accelerate the development of the project. In addition to completely overhauling the software that runs the testbed, we have completed several hardware upgrades, including the second and third deformable mirrors, and the first custom Apodized Pupil Lyot Coronagraph (APLC) optimized for the HiCAT aperture, which is similar to one of the possible geometries considered for LUVOIR. The testbed also includes several external metrology features for rapid replacement of parts, in particular the ability to test multiple apodizers readily, as well as an active tip-tilt control system to compensate for local vibration and air turbulence in the enclosure. On the software and operations side, the software infrastructure enables 24/7 automated experiments that include routine calibration tasks and high-contrast experiments. In this communication we present an overview and status update of the project, on both the hardware and software sides, and describe the results obtained with APLC wavefront control.
Sensors operating in the millimeter wave region of the electromagnetic spectrum provide valuable situational awareness in degraded visual environments (DVE), aiding navigation of rotorcraft and fixed-wing aircraft. Due to their relatively long wavelength, millimeter waves can pass through many types of visual obscurants, including smoke, fog, dust, and blowing sand, with low attenuation. Developed to take advantage of these capabilities, our millimeter wave imager employs a unique, enabling receiver architecture based on distributed aperture arrays and optical upconversion. We have reported previously on the operation and performance of our passive millimeter wave imager, including field test results in DVE and other representative environments, as well as extensive flight testing on an H-1 rotorcraft. Herein we discuss efforts to improve RF and optical component hardware integration, with the goal of increasing manufacturability and reducing the c-SWaP of the system. These outcomes will allow us to increase aperture sizes and channel counts, thereby providing increased receiver sensitivity and overall improved image quality. These developments will in turn open up new application areas for the passive millimeter wave technology, as well as better serve existing ones.
This work presents updates to the coronagraph and telescope components of the Segmented Aperture Interferometric Nulling Testbed (SAINT). The project pairs an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC) towards demonstrating capabilities for the future space observatories needed to directly detect and characterize a significant sample of Earth-sized worlds around nearby stars in the quest for identifying those which may be habitable and possibly harbor life. Efforts to improve the VNC wavefront control optics and mechanisms towards repeating narrowband results are described. A narrative is provided for the design of new optical components aimed at enabling broadband performance. Initial work with the hardware and software interface for controlling the segmented telescope mirror is also presented.
This paper discusses the development of a millimeter-wave (mm-wave) receiver module used in a sparse array passive imaging system. The use of liquid crystal polymer (LCP) technology and low-power InP low noise amplifiers (LNAs) enables the integration of the digital circuitry along with the RF components onto a single substrate, significantly improving the size, weight, power, and cost (SWaP-C) of the mm-wave receiver module compared to previous iterations of the module. Relative to previous-generation modules, the operating frequency has also been pushed from 77 GHz to 95 GHz in order to improve the resolution of the images captured by the sparse array imaging system.
Degraded visual environments (DVE) create dangerous conditions for aircraft pilots due to loss of situational awareness and/or ground reference, which can result in accidents during navigation or landing. Imaging in millimeter wave (mmW) spectral bands offers the ability to maintain the pilot's situational awareness despite DVE with a "see-through" imaging modality. Millimeter waves exhibit low atmospheric attenuation as well as low scattering loss from airborne particulates, e.g., blowing sand, dust, fog, and other visual obscurants. As such, Phase Sensitive Innovations (PSI) has developed a passive, real-time mmW imager to mitigate brownout dangers for rotorcraft. The imager consists of a distributed aperture array with conversion of detected mmW signals to optical frequencies for processing and image formation. Recently we performed operationally representative flight testing of our sensor while imaging various natural and manmade objects. Here we present imagery collected during these tests, as it confirms the performance of the sensor technology and illustrates phenomenology encountered in the mmW spectrum.
The transmission characteristics of millimeter waves (mmWs) make them suitable for many applications in defense and security, from airport preflight scanning to penetrating degraded visual environments such as brownout or heavy fog. While the cold sky provides sufficient illumination for these images to be taken passively in outdoor scenarios, this utility comes at a cost: the diffraction limit of the longer wavelengths involved leads to lower resolution imagery compared to the visible or IR regimes, and the low power levels inherent to passive imagery leave the data more susceptible to degradation by noise. Recent techniques leveraging optical upconversion have shown significant promise, but are still subject to fundamental limits in resolution and signal-to-noise ratio. To address these issues we have applied techniques developed for visible and IR imagery to decrease noise and increase resolution in mmW imagery. We have developed these techniques into fieldable software, making use of GPU platforms for real-time operation of computationally complex image processing algorithms. We present data from a passive, 77 GHz, distributed-aperture imaging platform captured during field tests at full video rate. These videos demonstrate the increase in situational awareness that can be gained by applying computational techniques in real time, without requiring changes to the detection hardware.
In this presentation we will discuss the performance and limitations of our 220 channel video rate passive millimeter wave imaging system based on a distributed aperture with optical upconversion architecture. We will cover our efforts to reduce the cost, size, weight, and power (CSWaP) requirements of our next generation imager. To this end, we have developed custom integrated circuit silicon-germanium (SiGe) low noise amplifiers that have been designed to efficiently couple with our high performance lithium niobate upconversion modules. We have also developed millimeter wave packaging and components in multilayer liquid crystal polymer (LCP) substrates which greatly improve the manufacturability of the upconversion modules. These structures include antennas, substrate integrated waveguides, filters, and substrates for InP and SiGe mmW amplifiers.
KEYWORDS: Imaging systems, Upconversion, Extremely high frequency, Antennas, Sensors, Image processing, Near field optics, Cameras, Optical scanning systems, Control systems
Passive imaging using millimeter waves (mmWs) has many advantages and applications in the defense and security markets. All terrestrial bodies emit mmW radiation, and these wavelengths are able to penetrate smoke, fog/clouds/marine layers, and even clothing. One primary obstacle to imaging in this spectrum is that longer wavelengths require larger apertures to achieve the resolutions desired for many applications. Accordingly, lens-based focal plane systems and scanning systems tend to require large aperture optics, which increase the size and weight of such systems beyond what can be supported by many applications. To overcome this limitation, a distributed aperture detection scheme is used in which the effective aperture size can be increased without the associated volumetric increase in imager size. This distributed aperture system is realized through conversion of the received mmW energy into sidebands on an optical carrier. This conversion serves, in essence, to scale the mmW sparse aperture array signals onto a complementary optical array. The sidebands are subsequently stripped from the optical carrier and recombined to provide a real-time snapshot of the mmW signal. Using this technique, we have constructed a real-time, video-rate imager operating at 75 GHz. A distributed aperture consisting of 220 upconversion channels is used to realize 2.5k pixels with passive sensitivity. Details of the construction and operation of this imager as well as field testing results will be presented herein.
A technique is described for displaying polarization information from passive millimeter-wave (mmW) sensors. This technique uses the hue of an image to display the polarization information and the lightness of an image to provide the unpolarized information. The fusion of both images is done in such a way that minimal information is lost from the unpolarized image while adding polarization information within a single image. The technique is applied to experimental imagery collected in a desert environment with two orthogonal linear polarization states of light and the results are discussed. Several objects such as footprints, ground textures, tire tracks, and shrubs display strong polarization features that are clearly visible with this technique, while materials with low polarization signatures such as metal are also clearly visible in the same image.
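As a rough illustration of the fusion described above, the sketch below maps two orthogonal linear polarization channels into a single color image, with lightness carrying the unpolarized (total) intensity and hue carrying the polarization signal. The specific mapping (the hue range and the use of the degree of polarization as saturation) is an assumption made for illustration, not the paper's exact algorithm.

```python
# Hedged sketch of hue/lightness fusion for passive mmW polarimetric imagery.
# Assumptions (not from the paper): hue encodes the normalized difference of the
# two orthogonal linear polarization channels, lightness encodes total intensity,
# and saturation scales with the degree of linear polarization.
import numpy as np
import colorsys

def fuse_polarization(img_v, img_h):
    """Fuse vertically and horizontally polarized mmW images into one RGB image."""
    s0 = img_v + img_h                        # unpolarized (total) intensity
    s1 = img_v - img_h                        # linear polarization difference
    dolp = np.abs(s1) / np.maximum(s0, 1e-9)  # degree of linear polarization (0..1)

    lightness = s0 / s0.max()                 # unpolarized info -> lightness
    # Map the signed polarization difference onto a hue range.
    hue = 0.33 + 0.33 * (s1 / np.maximum(np.abs(s1).max(), 1e-9))

    rgb = np.zeros(img_v.shape + (3,))
    for i in range(img_v.shape[0]):
        for j in range(img_v.shape[1]):
            rgb[i, j] = colorsys.hls_to_rgb(hue[i, j], lightness[i, j], dolp[i, j])
    return rgb

# Example: two synthetic 64x64 channels with one strongly V-polarized stripe.
v = np.random.rand(64, 64); h = np.random.rand(64, 64)
v[20:30, :] += 1.0
fused = fuse_polarization(v, h)
```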
A passive millimeter-wave sensor based on optical up-conversion that is sensitive to the polarization state of incident radiation is described. This system up-converts incident millimeter-wave radiation to an optical frequency and then recreates the polarization state of the millimeter-wave radiation in the optical signal. A division of time approach is then used to extract the Stokes information from the signal using optical techniques. Results are shown which verify the feasibility of this approach and demonstrate the ability to control the phase of the signal to enable the measurement of Stokes information.
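For context, a division-of-time polarimeter recovers the Stokes vector from intensities measured sequentially behind different analyzer states. The sketch below uses the textbook six-state set (H, V, ±45°, right/left circular); the actual analyzer states and phase-control scheme used in the sensor described above may differ.

```python
# Hedged sketch of division-of-time full-Stokes recovery from sequential
# intensity measurements, using the standard six analyzer states.
import numpy as np

def stokes_from_measurements(i_h, i_v, i_p45, i_m45, i_rcp, i_lcp):
    """Return the Stokes vector (S0, S1, S2, S3) from six analyzer intensities."""
    s0 = i_h + i_v          # total intensity
    s1 = i_h - i_v          # horizontal vs vertical linear
    s2 = i_p45 - i_m45      # +45 deg vs -45 deg linear
    s3 = i_rcp - i_lcp      # right vs left circular
    return np.array([s0, s1, s2, s3])

# Example: a beam that is fully +45-degree linearly polarized with unit power.
print(stokes_from_measurements(0.5, 0.5, 1.0, 0.0, 0.5, 0.5))  # -> [1, 0, 1, 0]
```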
In this paper we present an end-to-end system model for a passive millimeter wave (mmW) imaging system based on an optical up-conversion process. Due to the complicated nature of the system, accurate and efficient modeling is extremely challenging. To this end, we establish a mathematical model that bridges all component and subsystem models together to complete the system performance evaluation. The subsystem models, derived through theoretical simulation and experimental measurement, provide accurate input for the overall system performance evaluation. The developed tools have been used to validate our first-ever demonstrated passive mmW imager at 35 GHz, and excellent agreement has been achieved between the simulation and experimental results.
The demand for all-weather, day-night imaging systems has been spurred by calls for persistent surveillance in security and defense applications, and for increased safety in military aviation, such as carrier landings in fog and helicopter landings in sand and dust. Meeting these demands requires systems that offer robust imaging capabilities. Whereas visible and infrared systems can provide high resolution imagery in a small package, they are hindered by atmospheric obscurants such as cloud cover, fog, smoke, rain, sand, and dust storms. Millimeter wavelengths, on the other hand, are not, and passive millimeter wave imaging may be one method to reduce, or perhaps even eliminate, the impact of low visibility atmospheric conditions. In this paper we examine the scattering from rotorcraft-induced dust clouds using Sandblaster dust particle density data. We examine the effect of Mie scattering as a function of particle size and operating wavelength and conclude that W-band operation yields the highest resolution imaging while still maintaining "see-through" imaging capability.
Beam steering is an enabling technology for establishment of ad hoc communication links, directed energy for infrared countermeasures, and other in-theater defense applications. The development of nonmechanical beam steering techniques is driven by requirements for low size, weight, and power, and high slew rate, among others. The predominant beam steering technology currently in use relies on gimbal mounts, which are relatively large, heavy, and slow, and furthermore create drag on the airframes to which they are mounted. Nonmechanical techniques for beam steering are currently being introduced or refined, such as those based on liquid crystal spatial light modulators; however, drawbacks inherent to some of these approaches include narrow field of regard, low speed operation, and low optical efficiency. An attractive method that we explore is based on optical phased arrays, which has the potential to overcome the aforementioned issues associated with other mechanical and nonmechanical beam steering techniques. The optical array phase locks a number of coherent optical emitters in addition to applying arbitrary phase profiles across the array, thereby synthesizing beam shapes that can be steered and utilized for a diverse range of applications.
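To make the phase-profile idea concrete, the following sketch computes the array factor of a small linear array when a linear phase ramp is applied across the emitters; the wavelength, pitch, and element count are illustrative assumptions, not parameters of the system described above.

```python
# Hedged sketch of beam steering with an optical phased array: a linear phase
# ramp across N coherent emitters shifts the far-field beam peak.
import numpy as np

lam = 1.55e-6                                     # assumed wavelength (m)
d = 4e-6                                          # assumed element pitch (m)
n = np.arange(32)                                 # 32-element linear array
theta = np.radians(np.linspace(-10, 10, 2001))    # observation angles

def far_field(steer_deg):
    """Array factor magnitude for a linear phase ramp steering to steer_deg."""
    k = 2 * np.pi / lam
    phases = -k * d * n * np.sin(np.radians(steer_deg))  # applied phase profile
    field = np.exp(1j * (k * d * np.outer(np.sin(theta), n) + phases)).sum(axis=1)
    return np.abs(field) / len(n)

for s in (0.0, 2.0, 5.0):
    peak = np.degrees(theta[np.argmax(far_field(s))])
    print(f"commanded {s:4.1f} deg -> beam peak near {peak:5.2f} deg")
```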
Currently, brownout is the single largest contributor to military rotary-wing losses. Millimeter-wave radiation penetrates these dust clouds effectively, thus millimeter-wave imaging could provide pilots with valuable situational awareness during hover, takeoff, and landing operations. Herein, we detail efforts towards a passive, video-rate imager for use as a brownout mitigation tool. The imager presented herein uses a distributed-aperture, optically-upconverted architecture that provides real-time, video-rate imagery with minimal size and weight. Specifically, we detail phenomenology measurements in brownout environments, show developments in enabling component technologies, and present results from a 30-element aperiodic array imager that has recently been fabricated.
A new technique for improvised explosive device (IED) creation uses an explosive device buried in foam and covered in a layer of dirt. These devices are difficult to detect visually; however, their material characteristics make them detectable by passive millimeter-wave (pmmW) sensors. Results are presented from a test using a mock IED and an outdoor set-up consisting of two mock IEDs on a dirt background. The results show that the mock IEDs produce a millimeter-wave signature which is distinguishable from the surrounding background. Simulations based on the measured data are presented and a design for a future vehicle-mounted sensor is shown.
The polarization properties of radiation can contain additional information beyond what is available with only an intensity measurement. A full-Stokes polarimeter is capable of measuring the four Stokes parameters, which completely characterize the polarization of detected radiation. A division of time full-Stokes polarimeter often uses a rotating polarizing element to measure all four Stokes parameters, and this rotation can introduce artifacts due to wobbling. In this paper a system is proposed which uses an electrically controlled phase bias instead of a rotating element to create a full-Stokes polarimeter for a millimeter-wave system which utilizes optical up-conversion.
We report on our initial results of passive, real-time imaging in the Q-band using a distributed aperture and optical upconversion. The basis of operation is collection of incident mmW radiation by the distributed aperture, as embodied by an array of horn antennas, which is then amplified and upconverted to optical frequencies using commercially available electro-optic modulators. The non-linear mixing of the modulators creates sidebands containing the mmW signal with both amplitude and phase preserved. These signals are relaunched in the optical domain with a homothetic mapping of the antenna array. The optical carrier is stripped via dielectric stack filters and imagery is synthesized from the sidebands using the Fourier transform properties of a simple lens. This imagery is collected using a standard near-infrared camera with post-processing to enhance the signal of interest and reduce noise. Details of operation and sample imagery are presented herein.
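The image-formation step can be sketched numerically: the relaunched sidebands carry the mmW amplitude and phase at scaled antenna positions, and the lens performs a spatial Fourier transform of that field. In the toy example below an FFT stands in for the lens; the sparse array geometry and the single point-source scene are illustrative assumptions, not the hardware described above.

```python
# Hedged sketch of sparse-aperture image synthesis: sample a plane wave from one
# off-axis point source at the antenna positions, then let a 2-D FFT play the
# role of the Fourier-transforming lens.
import numpy as np

rng = np.random.default_rng(0)
grid = 128
aperture = np.zeros((grid, grid), complex)

# Assumed sparse array: 30 antennas at random positions inside the aperture.
ant = rng.integers(grid // 4, 3 * grid // 4, size=(30, 2))

# Signal from a single off-axis point source: a plane wave across the array.
kx, ky = 0.15, -0.08                                   # source direction (cycles/sample)
for (ix, iy) in ant:
    aperture[ix, iy] = np.exp(2j * np.pi * (kx * ix + ky * iy))

image = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2   # "lens" = Fourier transform
peak = np.unravel_index(image.argmax(), image.shape)
print("point source reconstructed near pixel", peak)
```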
KEYWORDS: Imaging systems, Calibration, Sensors, Temperature metrology, Data modeling, Visualization, Antennas, Polarization, Absorption, Data acquisition
The unique ability of the millimeter-wave portion of the spectrum to penetrate typical visual obscurants has resulted in a wide range of possible applications for imagers in this spectrum. Of particular interest to the military community are imagers that can operate effectively in Degraded Visual Environments (DVEs) experienced by helicopter pilots when landing in dry, dusty environments, otherwise known as "brownout." One of the first steps to developing operational requirements for imagers in this spectrum is to develop a quantitative understanding of the phenomenology that governs imaging in these environments. While preliminary studies have been done in this area, quantitative, calibrated measurements of typical targets and of the degradation of target contrasts due to brownout conditions are not available. To this end, we will present results from calibrated, empirical measurements of typical targets of interest to helicopter pilots made in a representative desert environment. In addition, real-time measurements of target contrast reduction due to brownout conditions generated by helicopter downwash will be shown. These data were acquired using a W-band, dual-polarization radiometric scanner using optical-upconversion detectors.
Millimeter-wave (mmW) imaging is presently a subject of considerable interest due to the ability of mmW radiation to penetrate obscurants while concurrently exhibiting low atmospheric absorption loss in particular segments of the spectrum, including near 35 and 94 GHz. As a result, mmW imaging affords an opportunity to see through certain levels of fog, rain, cloud cover, dust, and blowing sand, providing for situational awareness where visible and infrared detectors are unable to perform. On the other hand, due to the relatively long wavelength of the radiation, achieving sufficient resolution entails large aperture sizes, which furthermore leads to volumetric scaling of the imaging platform when using conventional refractive optics. Alternatively, distributed aperture imaging can achieve comparable resolution in an essentially two-dimensional form factor by use of a number of smaller subapertures through which the image is interferometrically synthesized. The novelty of our approach lies in the optical upconversion of the mmW radiation as sidebands on carrier laser beams using electro-optic modulators. These sidebands are subsequently stripped from the carrier using narrow passband optical filters and a spatial Fourier transform is performed by means of a simple lens to synthesize the image, which is then viewed using a standard near-infrared focal plane array (FPA). Consequently, the optical configuration of the back-end processor represents a major design concern for the imaging system. As such, in this paper we discuss the optical configuration along with some of the design challenges and present preliminary imaging data validating the system performance.
Passive imaging using millimeter waves (mmWs) has many advantages and applications in the defense and security markets. All terrestrial bodies emit mmW radiation and these wavelengths are able to penetrate smoke, blowing dust or sand, fog/clouds/marine layers, and even clothing. One primary obstacle to imaging in this spectrum is that longer wavelengths require larger apertures to achieve the resolutions typically desired in surveillance applications. As a result, lens-based focal plane systems tend to require large aperture optics, which severely limit the minimum achievable volume and weight of such systems. To overcome this limitation, a distributed aperture detection scheme is used in which the effective aperture size can be increased without the associated volumetric increase in imager size. However, such systems typically require high frequency (~30-300 GHz) signal routing and down conversion as well as large correlator banks. Herein, we describe an alternate approach to distributed aperture mmW imaging using optical upconversion of the mmW signal onto an optical carrier. This conversion serves, in essence, to scale the mmW sparse aperture array signals onto a complementary optical array. The optical sidebands are subsequently stripped from the optical carrier and optically recombined to provide a real-time snapshot of the mmW signal. In this paper, the design tradeoffs of resolution, bandwidth, number of elements, and field of view inherent in this type of system will be discussed. We also will present the performance of a 30 element distributed aperture proof of concept imaging system operating at 35 GHz.
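One of the tradeoffs referenced above, resolution versus effective aperture size, follows directly from the diffraction limit; the short sketch below evaluates θ ≈ 1.22 λ/D at 35 and 94 GHz for a few illustrative aperture diameters (the values are not the system's specifications).

```python
# Hedged sketch of the basic resolution/aperture tradeoff behind distributed-
# aperture mmW imaging: diffraction-limited angular resolution ~ 1.22*lambda/D.
import numpy as np

c = 3e8
for f_ghz in (35, 94):
    lam = c / (f_ghz * 1e9)
    for d_m in (0.25, 0.5, 1.0):                 # illustrative effective aperture diameters
        res_mrad = 1.22 * lam / d_m * 1e3
        print(f"{f_ghz} GHz, D = {d_m:.2f} m -> ~{res_mrad:.1f} mrad resolution")
```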
Grayscale lithography is an extension of the conventional binary lithographic process for realization of arbitrary three-dimensional features in photoresist materials, with applications especially in micro-optics fabrication. The grayscale photomask possesses a spatially varying transmission that modulates the exposure dose received in the photoresist. By using a low contrast photoresist, such as those based on diazonaphthoquinone (DNQ), the material is only partially removed during development in proportion to the local exposure dose received. In this way, an arbitrary surface topography can be sculpted in the photoresist material. It is common practice in grayscale lithography to encode the transmission levels of the photomask by using the photoresist contrast curve to determine the exposure dose required for a given photoresist thickness at each lateral point in the pattern. This technique is adequate when the surface topography is slowly varying and the photoresist film is thin. However, it is inaccurate when these conditions are not met, because the technique essentially represents a one-dimensional approximation to the lithographic process where the isotropy of the development and the diffractive imaging of the photomask are neglected. Currently we are applying grayscale lithography to the fabrication of a fiber-to-waveguide coupler based on the parabolic reflector, where the efficiency of the device is quite sensitive to fabrication errors in the coupler geometry. In this case the thin photoresist and slowly varying topography conditions are not met, and we turn to more comprehensive process models to determine the appropriate transmission levels to encode in the photomask. We demonstrate that the photomask can be optimized, based on simulation of the lithography process, to produce the required three-dimensional photoresist pattern.
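The "common practice" described above, the one-dimensional contrast-curve approximation, can be sketched as a simple inversion of a measured dose-to-thickness curve; the curve values and target topography below are invented for illustration, and, as the abstract notes, this approximation breaks down for thick resist and rapidly varying features.

```python
# Hedged sketch of contrast-curve-based dose assignment for grayscale lithography.
# The dose/thickness pairs are made-up numbers standing in for a measured curve.
import numpy as np

# Assumed measured contrast curve: dose (mJ/cm^2) -> remaining thickness (um).
dose_pts = np.array([0, 50, 100, 150, 200, 250, 300])
thick_pts = np.array([10.0, 9.5, 7.5, 5.0, 2.5, 0.8, 0.0])

def dose_for_thickness(target_um):
    """Invert the (monotonically decreasing) contrast curve by interpolation."""
    return np.interp(target_um, thick_pts[::-1], dose_pts[::-1])

# Target topography: a shallow parabolic depression in a 10-um resist film.
x = np.linspace(-1, 1, 11)
target = 10.0 - 4.0 * (1 - x**2)                 # desired remaining thickness (um)
doses = dose_for_thickness(target)               # dose to encode at each lateral point
print(np.round(doses, 1))
```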
For some time, the micro-optics and photonics fields have relied on fabrication processes and technology borrowed from the well-established silicon integrated circuit industry. However, new fabrication methodologies must be developed for greater flexibility in the machining of micro-optic devices. To this end, we have explored grayscale lithography as an enabler for the realization of such devices. This process delivers the ability to sculpt materials arbitrarily in three dimensions, thus providing the flexibility to realize optical surfaces to shape, transform, and redirect the propagation of light efficiently. This has opened the door for new classes of optical devices. As such, we present a fiber-to-waveguide coupling structure utilizing a smoothly contoured lensing surface in the device layer of a silicon-on-insulator (SOI) wafer, fabricated using grayscale lithography. The structure collects light incident normally to the wafer from a single-mode optical fiber plugged through the back surface and turns the light into the plane of the device layer, focusing it into a single-mode waveguide. The basis of operation is total internal reflection, and the device therefore has the potential advantages of providing a large bandwidth, low polarization sensitivity, high efficiency, and small footprint. The structure was optimized with a simulated annealing algorithm in conjunction with two-dimensional finite-difference time-domain (FDTD) simulation accelerated on the graphics processing unit (GPU), and achieves a theoretical efficiency of approximately seventy percent, including losses due to Fresnel reflection from the oxide/silicon interface. Initial fabrication results validate the principle of operation. We discuss the grayscale fabrication process as well as the through-wafer etch for mechanical stabilization and alignment of the optical fiber to the coupling structure. Refinement of the through-wafer etch process for high etch rate and appropriate sidewall taper is addressed.
Silicon photonics is an area of active research and commercial interest due in part to its leveraging of the existing mature fabrication processes and infrastructure of the CMOS integrated circuit industry. Its suitability for use at the telecom wavelengths, low cost, and compact devices enhance the value of silicon for photonics applications. One critical issue that continues to be investigated is the efficient coupling of optical signals between the outside world and the photonic chip, which is hampered by the large optical mode mismatch between the glass fiber and high index contrast silicon waveguide. We introduce a new device that enables efficient coupling from the fiber to single mode silicon waveguide called the vertical J-coupler, so named in reference to its parabolic shape. Grayscale lithography is used to fabricate the three-dimensional topology of the coupler, enabled by the high energy beam sensitive (HEBS) glass grayscale photomask. The principle of operation is total internal reflection, which is inherently polarization insensitive and broadband. Electro-magnetic simulations validate the efficient operation of the device while experimental results demonstrate its successful operation in coupling light into the silicon waveguide.
Under the DARPA COMP-I (Compressive Optical MONTAGE Photography Initiative) program, the goal of this project is to significantly reduce the volume and form factor of infrared imaging systems without loss of resolution. The approach taken is to use an array of small lenses with extremely short focal lengths rather than the conventional approach of a single aperture lens system with large diameter and focal length. The array of lenses creates multiple copies of the scene on a single focal plane detector array, which are then used to reconstruct an image with resolution comparable to or higher than that of the conventional imaging system. This is achieved by a computational method known as super-resolution reconstruction. Work at the University of Delaware towards this end includes participation in the design and optimization of the optical system along with fabrication of some of the optical elements. Grayscale lithography using a high-energy beam sensitive (HEBS) glass photomask and proportional dry etch pattern transfer are the key techniques enabling the fabrication process. In this paper we will discuss the design of the imaging system while focusing on the fabrication aspects of the project.
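As a stand-in for the reconstruction step, the sketch below performs idealized shift-and-add super-resolution: low-resolution copies of the scene taken at known sub-pixel offsets are interleaved back onto a finer grid. The noise-free sampling, exact registration, and regular lenslet offsets are simplifying assumptions; the program's actual reconstruction algorithm is more sophisticated.

```python
# Hedged sketch of shift-and-add super-resolution from multiple low-resolution
# copies of a scene, under idealized (noise-free, exactly registered) conditions.
import numpy as np

def lensed_copy(scene, factor, dx, dy):
    """Simulate one lenslet's low-resolution copy taken at sub-pixel offset (dx, dy)."""
    shifted = np.roll(scene, (-dx, -dy), axis=(0, 1))
    return shifted[::factor, ::factor]

def shift_and_add(copies, factor, shape):
    """Interleave the registered low-resolution copies onto a high-resolution grid."""
    hi = np.zeros(shape)
    for (dx, dy), lo in copies:
        hi[dx::factor, dy::factor] = lo          # place each copy at its offset
    return hi

rng = np.random.default_rng(1)
scene = rng.random((64, 64))                     # "true" high-resolution scene
factor = 2
copies = [((dx, dy), lensed_copy(scene, factor, dx, dy))
          for dx in range(factor) for dy in range(factor)]
recon = shift_and_add(copies, factor, scene.shape)
print("max reconstruction error:", float(np.abs(recon - scene).max()))  # ~0 in this ideal case
```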
KEYWORDS: Photonic crystals, Etching, Silicon, Gallium nitride, Waveguides, Near field scanning optical microscopy, Near field, Near field optics, Nanolithography, Electron beam lithography
In this paper we demonstrate the design, fabrication, and characterization of a near-field photonic crystal nano-probe. By exploiting the ability of photonic crystals to strongly confine and guide light, we are able to produce optical spot sizes that are well below the diffraction limit. This offers in particular the advantage of higher resolution as compared to conventional optical probing techniques, while retaining the desirable features of speed, non-invasiveness, reliability, and low cost. Such a device has applications in scanning near-field optical microscopy (SNOM), nanolithography, high density optical data storage, and many other technologies. We describe the implementation of a photonic crystal device in the silicon-on-insulator (SOI) platform, as well as discuss progress being made in gallium-based alloys to further reduce the wavelength of operation and therefore the probe spot size.
KEYWORDS: Near field scanning optical microscopy, Optical microscopy, Near field optics, Microscopy, Photonic crystals, Applied physics, Near field, Nanolithography, Waveguides, Optical storage
In this paper we present the implementation, optimization, and commercialization of an ultra-high resolution nano-probe for near-field scanning optical microscopy/spectroscopy (SNOM), nanolithography, and high density optical data storage. The theme underlying this effort is the ability to examine, write, and/or read ultra-fine feature sizes using near-field based nano-probes. The reason for pursuing such research lies in the opportunities it offers for extending the applications of conventional optical microscopy into the nanometer-scale domain. Furthermore, near-field optical imaging preserves the inherent polarizing, non-invasive, spectroscopic, and high temporal resolving capabilities of conventional microscopy, which are absent from other high-resolution techniques.
Microprocessor performance is now limited by the poor delay and bandwidth performance of the on-chip global wiring layers. Although relatively few in number, the global metal wires have proven to be the primary cause of performance limitations, effectively leading to a premature saturation of Moore's Law scaling in future silicon generations. Building upon device-, circuit-, system-, and architectural-level models, a framework for performance evaluation of global wires is developed, aimed at quantifying the major challenges faced by intra-chip global communications over the span of six technology generations. This paper reviews the status of possible intra-chip optical interconnect solutions in which the silicon chip's global metal wiring layers are replaced with a high-density guided-wave or free-space optical interconnection fabric. The overall goal is to provide a scalable approach that is compatible with established silicon chip fabrication and packaging technology, and which can extend the reach of Moore's Law for many generations to come. To achieve the required densities, the integrated sources are envisioned to be modulators that are optically powered by off-chip sources. Structures for coupling dense modulator arrays to optical power sources and to free-space or guided-wave optical global fabrics are analyzed. Results of proof-of-concept experiments, which demonstrate the potential benefits of ultra-high-density optical interconnection fabrics for intra-chip global communications, are presented.
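To illustrate the scaling problem described above, a simple Elmore-style estimate shows that the delay of an unrepeatered distributed RC wire grows roughly quadratically with length; the material and geometry numbers in the sketch are generic textbook-style assumptions, not the paper's model parameters.

```python
# Hedged sketch of why global wires limit performance: delay of an unrepeatered
# distributed RC line ~ 0.38 * R_total * C_total, growing quadratically with length.
rho = 2.2e-8          # assumed Cu resistivity (ohm*m), size effects ignored
cap_per_m = 2e-10     # assumed wire capacitance per unit length (F/m)
width = 100e-9        # assumed wire width (m)
thickness = 200e-9    # assumed wire thickness (m)

def global_wire_delay(length_m):
    """Elmore-style delay estimate for an unrepeatered distributed RC wire."""
    r_total = rho * length_m / (width * thickness)
    c_total = cap_per_m * length_m
    return 0.38 * r_total * c_total

for l_mm in (1, 5, 10, 20):
    print(f"{l_mm:2d} mm global wire -> ~{global_wire_delay(l_mm * 1e-3) * 1e9:.2f} ns")
```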
This paper focuses on the integration of InGaAsSb photodetectors along with micro-optics in order to realize a prototype system that can achieve a stronger response during atmospheric profiling and spectroscopy measurements. The integration of the detector was executed using a novel conductive-adhesive-based flip-chip integration process. The design, fabrication, and integration of the constituent technologies and experimental results from their characterization are presented.
Micro-optics offers the ability to realize massively parallel, surface-normal interconnects at the chip scale. In this context, we investigate the integration of a 10-Gbytes/s, 850-nm vertical-cavity surface-emitting laser (VCSEL) with a 2×2 array of continuous surface profile, diffractive optical elements to demonstrate a prototype system that incorporates 3-D, highly dense, parallel optical interconnects. The integration is achieved using a novel conductive polymer-based flip-chip process, which is implemented using conventional fabrication techniques. We present experimental results from the design, fabrication, integration, and characterization of the prototype system.
For highest efficiency optical devices, it is desirable to form continuously graded device features. We describe a technique to produce such features through the fabrication of a continuous-tone grayscale mask and subsequent grayscale photolithography. The design of the mask fabrication process is outlined, including the high-energy-beam-sensitive (HEBS) glass electron-beam exposure response characterization, and the generation of an exposure profile with inherent proximity effect correction. Application of the process is demonstrated through fabrication of smooth-facet retroreflectors, with features that are not possible to produce either by a grayscale process that employs discrete gray levels or by anisotropic wet etch techniques.
The 2-2.5μm region of the electromagnetic spectrum is of particular importance for the non-invasive monitoring of blood glucose using absorption spectroscopy, since it can provide the strongest signature as compared to other water transmission windows. Currently available spectroscopy systems for this application require high-gain and low-noise detectors in order to achieve sufficient signal-to-noise ratio measurements. In this context, we are investigating the integration of micro-optics along with InGaAsSb/AlGaAsSb avalanche photodetectors in order to demonstrate high-fill factor, high quantum efficiency and eventually the ability to evaluate the blood glucose concentration with high accuracy. Also, using the bandgap engineering options afforded by the quaternary antimonide structures, the spectral response of the detector can be tuned over this wavelength range. In this paper, we present the design, fabrication and integration of the multi-chip modules, the constituent technologies required to realize them and experimental results from their characterization.
While recent advances in optical integrated circuits and photonic crystal devices have been impressive, there presently exists an unsatisfied need for an efficient means of coupling into these systems from the outside world. To this end, we have developed writing techniques for continuous-tone grayscale masks in high-energy beam sensitive (HEBS) glass, which we subsequently employ in the fabrication of tapered coupling devices. These devices demonstrate efficient coupling of free-space and fiber signals into waveguides fabricated on silicon-on-insulator substrates. This approach significantly reduces losses as compared to standard butt-coupling and end-fire coupling methods, in addition to being inherently broadband. In this paper, we discuss grayscale mask process development, fabrication techniques for the coupling devices, and characterization of device performance.
Diffractive optical elements (DOEs) offer the ability to boost fill factors of high-speed (field-of-view limited) near-infrared detectors. In this context, we have investigated the design and fabrication of a system that involves integration of DOEs with avalanche photodetectors (APDs). These APDs are implemented in the antimonide material system for operation around a 2.1-μm wavelength. Consequently, such systems could be used to reduce the required threshold power at free-space photonic receivers. To this end, we present the design and fabrication technologies for the DOEs, APDs, and their integration using polymer-based flip-chip interconnections.
In this paper we present the development of several new and novel fabrication methods for the realization of two-dimensional photonic crystal devices in silicon slab waveguides. We begin by presenting a process for the fabrication of high fill-factor devices in silicon-on-insulator wafers. Next, we present a grayscale fabrication process for the realization of three-dimensional silicon structures, such as tapered horn couplers. We then present the fabrication of suspended silicon slabs using a co-polymer process based on direct write electron beam lithography and silicon sputtering. And lastly, we conclude by presenting an alternate method for realizing PhC devices in a silicon slab based on a combination of wet and dry etching processes in bulk silicon wafers.