Delivery of large volumes of data from low-Earth orbit to ground is challenging due to the short link durations associated with direct-to-Earth links. The short ranges that are typical for such links enable high data rates with small terminals. While the data rate for radio-frequency links is typically limited by available spectrum, optical links do not have such limitations. However, to date, demonstrations of optical links from low-Earth orbit to ground have been limited to ~10 to ~1000 Mbps. We describe plans for NASA’s TeraByte InfraRed Delivery (TBIRD) system, which will demonstrate a direct-to-Earth optical communication link from a CubeSat in low-Earth orbit at burst rates up to 200 Gbps. Such a link is capable of delivering >50 terabytes per day from a small spacecraft to a single small ground terminal.
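The >50 terabytes-per-day figure can be sanity-checked with back-of-envelope arithmetic. The burst rate comes from the abstract; the usable contact time per pass and the number of passes per day over a single ground site are illustrative assumptions, not mission parameters:

```python
# Back-of-envelope check of the TBIRD daily-volume claim.
# Assumed values (not from the abstract): ~5 min of usable link time
# per pass and ~7 passes/day over a single ground terminal.
burst_rate_bps = 200e9            # 200 Gbps burst rate (from the abstract)
contact_s = 5 * 60                # assumed usable link time per pass, seconds
passes_per_day = 7                # assumed passes over one ground site

bits_per_pass = burst_rate_bps * contact_s
tb_per_day = bits_per_pass * passes_per_day / 8 / 1e12  # terabytes/day
print(f"{tb_per_day:.1f} TB/day")  # 52.5 TB/day, consistent with >50 TB
```

Under these assumptions, a few minutes of 200 Gbps burst transmission per pass is sufficient to exceed the stated daily volume.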
In recent years, NASA has been developing a scalable, modular space terminal architecture to provide low-cost laser communications for a wide range of near-Earth applications. This development forms the basis for two upcoming demonstration missions. The Integrated Low-Earth Orbit Laser Communications Relay Demonstration User Modem and Amplifier Optical Communications Terminal (ILLUMA-T) will develop a user terminal for platforms in low-Earth orbit; this terminal will be installed on the International Space Station and will demonstrate relay laser communications via NASA’s Laser Communications Relay Demonstration (LCRD) in geosynchronous orbit. The Orion EM-2 Optical Communication Demonstration (O2O) will develop a terminal that will fly on the first manned launch of the Orion Crew Exploration Vehicle and provide direct-to-Earth laser communications from lunar ranges. We describe the objectives and link architectures of these two missions, which aim to demonstrate the operational utility of laser communications for manned exploration in cislunar space.
In a multiplexed image, multiple fields-of-view (FoVs) are superimposed onto a common focal plane. The attendant gain in sensor FoV provides a new degree of freedom in the design of an imaging system, allowing for performance tradeoffs not available in traditional optical designs. We explore design choices relating to a shift-encoded optically multiplexed imaging system and discuss their performance implications. Unlike in a traditional imaging system, a single multiplexed image has a fundamental ambiguity regarding the location of objects in the image. We present a system that can shift each FoV independently to break this ambiguity and compare it to other potential disambiguation techniques. We then discuss the optical, mechanical, and encoding design choices of a shift-encoding midwave infrared imaging system that multiplexes six 15×15 deg FoVs onto a single one-megapixel focal plane. Using this sensor, we demonstrate a computationally demultiplexed wide-FoV video.
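The fundamental ambiguity of a single multiplexed frame can be illustrated with a toy numerical example (an assumed, highly simplified model in which multiplexing is an ideal per-pixel sum of two FoVs): two entirely different scene pairs produce exactly the same recorded image, so no algorithm can separate them from one frame alone.

```python
import numpy as np

# Toy illustration of the single-frame ambiguity: model the multiplexed
# image as the per-pixel sum of two scenes. Two different scene pairs
# yield an identical frame, so one frame cannot disambiguate the FoVs.
rng = np.random.default_rng(0)
a = rng.uniform(size=(4, 4))          # scene seen through FoV 1
b = rng.uniform(size=(4, 4))          # scene seen through FoV 2
delta = rng.uniform(size=(4, 4))      # arbitrary perturbation

frame_1 = a + b                       # scenes (a, b) superimposed
frame_2 = (a + delta) + (b - delta)   # a different pair, identical sum
print(np.allclose(frame_1, frame_2))  # True
```

Shifting each FoV independently between frames, as described above, breaks this degeneracy by making the two hypotheses produce different measurements.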
We describe the optical design and characterization testing of an optically multiplexed imaging system operating in the 3.4 to 5 micron waveband. The optical design uses a division-of-aperture method to overlay six images on a single focal plane and produce a 90 by 15 degree, 6-megapixel field of view. Image disambiguation is achieved through image shifting enabled by piezo-actuated mirrors in the multiplexing assembly. This paper provides an overview of the optical design, including focal plane selection, image resolution and distortion, pupil imaging, and aperture division geometry. A method of applying one- and two-point non-uniformity correction using radiometric test data is proposed. Sensor-level, per-channel image quality and sensitivity tests, including MTF, 3D noise, and NEdT, are presented to validate the design assumptions.
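Two-point non-uniformity correction of the kind mentioned above fits a per-pixel gain and offset from two uniform radiometric references. The following is a minimal sketch under an assumed linear detector model; the function and variable names are illustrative, not from the paper:

```python
import numpy as np

# Sketch of two-point non-uniformity correction (NUC), assuming each
# pixel responds linearly: raw = gain_true * signal + offset_true.
def two_point_nuc(frame, cold_ref, hot_ref, cold_level, hot_level):
    """Per-pixel gain/offset correction from two flat-field references."""
    gain = (hot_level - cold_level) / (hot_ref - cold_ref)
    offset = cold_level - gain * cold_ref
    return gain * frame + offset

# Simulate a detector with per-pixel gain/offset non-uniformity.
rng = np.random.default_rng(0)
gain_true = rng.uniform(0.8, 1.2, (4, 4))
offset_true = rng.uniform(-5, 5, (4, 4))
cold, hot = 100.0, 400.0                   # known reference levels, counts
cold_ref = gain_true * cold + offset_true  # flat field at the cold level
hot_ref = gain_true * hot + offset_true    # flat field at the hot level

raw = gain_true * 250.0 + offset_true      # uniform 250-count scene
corrected = two_point_nuc(raw, cold_ref, hot_ref, cold, hot)
print(np.allclose(corrected, 250.0))       # True: fixed-pattern noise removed
```

A one-point correction is the special case that fixes only the offset, assuming the per-pixel gains are already uniform.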
Optically multiplexed imagers overcome the tradeoff between field of view and resolution by superimposing images from multiple fields of view onto a single focal plane. In this paper, we consider the implications of independently shifting each field of view at a rate exceeding the frame rate of the focal plane array and with a precision that can exceed the pixel pitch. A sequence of shifts enables the reconstruction of the underlying scene, with the number of frames required growing inversely with the number of multiplexed images. As a result, measurements from a sufficiently fast sampling sensor can be processed to yield a low-distortion image with more pixels than the original focal plane array, a wider field of view than the original optical design, and an aspect ratio different from that of the original lens. This technique can also enable the collection of low-distortion, wide field of view videos. A sequence of sub-pixel spatial shifts extends this capability to allow the recovery of a wide field of view scene at sub-pixel resolution. To realize this sensor concept, a novel and compact divided-aperture multiplexed sensor, capable of rapidly and precisely shifting its fields of view, was prototyped. Using this sensor, we recover twenty-four megapixel images from a four-megapixel focal plane and show the feasibility of simultaneous de-multiplexing and super-resolution.
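The reconstruction from a sequence of shifts can be sketched as a linear inverse problem. Below is a toy one-dimensional version under assumed simplifications (two multiplexed FoVs, an ideal one-pixel shift with a known zero-valued sample entering at the edge); the real system handles many FoVs, two dimensions, and sub-pixel shifts, but the algebraic structure is the same:

```python
import numpy as np

# Toy 1-D sketch of shift-based demultiplexing: scenes a and b are summed
# onto one detector line; a second frame in which b is shifted by one pixel
# (with a known zero entering at the edge) makes the joint system invertible.
rng = np.random.default_rng(1)
n = 8
a = rng.uniform(size=n)              # scene behind FoV 1
b = rng.uniform(size=n)              # scene behind FoV 2

I = np.eye(n)
S = np.eye(n, k=-1)                  # one-pixel shift, zero fill at the edge
frame0 = a + b                       # both FoVs unshifted
frame1 = a + S @ b                   # FoV 2 shifted between frames

# Stack the unknowns x = [a; b] and solve the 2n x 2n linear system
#   [I I] [a]   [frame0]
#   [I S] [b] = [frame1]
A = np.block([[I, I], [I, S]])
x = np.linalg.solve(A, np.concatenate([frame0, frame1]))
print(np.allclose(x[:n], a) and np.allclose(x[n:], b))  # True
```

With sub-pixel shifts, the shift operator maps a finer scene grid onto the coarser detector grid, which is what allows the recovered image to contain more pixels than the focal plane array.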
Volume holographic imaging (VHI) utilizes the Bragg selectivity of volume holograms to achieve 3D optical slicing. As in most 3D imaging systems, the depth resolution of VHI degrades quadratically with increasing object distance. We have devised an imaging scheme that takes advantage of the superior lateral resolution of VHI and a priori surface information about the object to build a profilometer that can resolve 50 μm features at a working distance of ≈50 cm. We discuss the scheme and present experimentally measured surface profiles of MEMS devices.