This paper develops exact equations for the bias and random noise in the signal estimates from both non-frame-transfer CCDs and frame-transfer CCDs having overclock rows that are used to estimate and eliminate the bias due to image smear. The paper also reports numerical estimates of signal, bias, and random noise obtained from computer simulation of the charge generation, charge transfer, and signal processing steps these CCDs employ to obtain signal estimates. The theoretical predictions of the exact equations are checked against the simulation results and found to be in close agreement.
Using the analytical formulas, the magnitude of the smear bias in the signal estimate for a non-frame-transfer CCD is compared to the magnitude of the true minimum signal for CCD operating parameters similar to those in a unique Earth remote sensing application. The bias error due to smear is found to be large: 1.3 times the magnitude of the true signal. For the same operating conditions, the total random noise is compared for this CCD and one having overclock rows to eliminate the bias. The random noise of the frame-transfer CCD with overclock rows is only 1 electron greater than that of the non-frame-transfer CCD. Thus, when the frame-transfer CCD with overclock rows is used to eliminate the bias, the additional random noise incurred is minimal compared to the error from the eliminated bias.
We have manufactured rad-hard InGaAs photodiodes using our proprietary Dual-Depletion Region (DDR) technology with bandwidths exceeding 10 GHz. The devices demonstrate high reliability and superior RF performance, making them ideal for deployment in space for applications such as LIDAR and optical intersatellite links. The responsivity at 1064 nm is >0.45 A/W with an optical return loss of 40 dB. The photodiodes have broad wavelength coverage from 800 nm to 1700 nm, and thus can be used at several wavelengths such as 850 nm, 1064 nm, 1310 nm, 1550 nm, and 1620 nm. The InGaAs photodiodes exhibit very low Polarization Dependent Loss (PDL) of 0.05 dB typical to 0.1 dB maximum. The typical failures-in-time (FIT) values of these photodiodes are 0.011 and 15.384 at 25°C and 75°C, respectively, where FIT is defined as the number of failures per billion device-hours of operation. The photodiodes have been subjected to radiation testing, including 50 krad gamma (Co-60) and protons at a fluence of 3 × 10¹¹ p/cm², and have passed typical qualification levels of random vibration.
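The FIT figures above translate directly into mean time between failures, since MTBF = 10⁹ device-hours / FIT. A minimal sketch of the conversion (the helper name and usage are ours, not from the abstract):

```python
def fit_to_mtbf_hours(fit: float) -> float:
    """Convert a FIT rate (failures per 1e9 device-hours) to MTBF in hours."""
    return 1e9 / fit

# Quoted typical FIT values for the photodiodes:
mtbf_25c = fit_to_mtbf_hours(0.011)   # at 25 °C
mtbf_75c = fit_to_mtbf_hours(15.384)  # at 75 °C
```

At the quoted 75°C rate of 15.384 FIT, this gives an MTBF of roughly 65 million device-hours.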
The unique ability to record high-fidelity photon position and arrival-time (X, Y, T) information is advantageous for high-speed recording devices in some important time-dependent applications. For microchannel plate sensors, our most commonly used readout configuration is the cross delay line anode. We have achieved resolutions of <25 μm in tests over 65 mm × 65 mm (>2.5k × 2.5k resolution elements) with excellent linearity for random photon rates of >500 kHz, while time tagging events using the MCP output signal to better than 100 ps. Open-face and sealed-tube microchannel plate cross delay line detectors of this kind have been built and used for observation of flare stars, orbital satellites and space debris with the GALEX satellite, time-resolved imaging of the Crab Pulsar with a telescope as small as 1 m, biological fluorescence imaging, and synchrotron diagnostics. To achieve better efficiency, higher counting rates and extended lifetime, we are now developing cross strip anode readouts. These have already demonstrated 5 μm resolution at <10× lower gain than the cross delay line schemes, and high-speed electronics for the cross strip are currently in development.
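For a delay-line anode of the kind described, the event coordinate along one axis follows from the difference in signal arrival times at the two ends of the line. A simplified 1-D decoding sketch under idealized assumptions (constant total propagation delay, no dispersion; all names are ours):

```python
def decode_position(t_left_ns: float, t_right_ns: float,
                    total_delay_ns: float, length_mm: float) -> float:
    """Decode a 1-D event position on a delay-line anode from the signal
    arrival times at its two ends. Ideally t_left + t_right equals the
    constant end-to-end delay of the line."""
    frac = (t_right_ns - t_left_ns) / total_delay_ns  # ranges -1 .. +1
    return 0.5 * length_mm * (1.0 + frac)

# An event at the anode center arrives at both ends simultaneously:
center = decode_position(5.0, 5.0, 10.0, 65.0)  # → 32.5 mm
```

In a real detector the two-dimensional position comes from two such crossed delay lines, and the timing precision of the electronics sets the spatial resolution.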
ITT Industries Space Systems Division and Eastman Kodak Company have developed a scalable, data- and power-efficient imaging spectrometer system with a digitally tunable optical filter capability, which enables the rapid selection of high-quality user-defined optical spectral band(s) of interest. The system utilizes a custom-designed, high-contrast diffractive MEMS device with 50 independent spectral switches at the image plane of a double-pass dispersive/de-dispersive
spectrometer. The custom MEMS device is based on grating electromechanical system (GEMS) display technology, which provides very high image contrast (2000:1), fast optical switching speeds (< 100 ns), and a large active area with a very high fill factor. The system enables the selection of arbitrary, narrow or wide spectral bands of interest across the visible spectrum with a sampling resolution of 5 nm, without any moving mechanical parts. The
resulting optical filter quality and performance are comparable to those of conventional fixed-band dichroic filters used in current remote sensing systems. The brassboard systems are designed for rapid transition to space-based, electro-optical (EO) remote sensing missions that utilize large-format linear TDI scanning sensors and large-format area staring arrays in the visible band. This technology addresses future EO system requirements for rapidly selecting and utilizing a high-quality imaging optical bandpass of interest. The system concept provides a >20X scan rate advantage over conventional hyperspectral imagers as a result of its compatibility with TDI scanning. The image quality is comparable to that of current MSI and HSI systems.
High-finesse Fabry-Perot interferometers are useful tools for high-resolution spectroscopy and narrow-band filtering. Rugged, reliable, narrow-band tunable filters are of special interest for remote sensing from ground, airborne and space-based platforms. This report discusses the results of a numerical analysis and experimental study of such a filter, with a design based on the unique features of the photonic crystal interferometer (PCI). An optimal choice of the PCI components and their parameters allows a sub-angstrom bandpass to be achieved, combined with a broad tunability range (up to 10 nm) in the visible spectral region. The prototype PCI system is tunable over a 1 nm range with a bandpass near 0.02 nm and an acceptance angle of about 1 degree. Detailed consideration is given to the imaging characteristics of such an interferometer and their dependence upon the quality of the individual components (i.e., the mirror substrates, dielectric layers, etc.). In summary, we present the results of the theoretical analysis and experimental study of the spectral and imaging characteristics of a high-finesse PCI, and their impairments due to deviations from the required parameters of the optical elements.
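The quoted sub-angstrom bandpass is consistent with the standard Fabry-Perot relation FWHM ≈ FSR / finesse, where the free spectral range is FSR ≈ λ²/(2nd) for a cavity of optical thickness nd. A quick sketch (illustrative numbers only, not the actual PCI parameters):

```python
def fp_bandpass_nm(wavelength_nm: float, gap_nm: float,
                   finesse: float, n: float = 1.0) -> float:
    """Approximate Fabry-Perot FWHM bandpass as FSR / finesse, with
    free spectral range FSR ≈ λ² / (2 n d) for a gap d of index n."""
    fsr = wavelength_nm ** 2 / (2.0 * n * gap_nm)
    return fsr / finesse

# Example: a 62.5 µm air gap with a finesse of 100 at 500 nm
bw = fp_bandpass_nm(500.0, 62500.0, 100.0)  # → 0.02 nm
```

This illustrates why a 0.02 nm bandpass demands both a high finesse and a relatively long cavity, and hence very high quality mirror substrates and coatings.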
To meet evolving ballistic missile threats, advanced seekers will include a multi-modal imaging capability in which a passive single- or multi-band infrared focal plane array (FPA) shares a common aperture with an active laser radar (LADAR) receiver - likely a photon-counting LADAR receiver that can resolve photon times of arrival with sub-nanosecond resolution. The overall success of such a system will depend upon its photon detection efficiency and its sensitivity to upset by spurious detection events. In the past, to perform photon counting functions, it has generally been necessary to operate near-infrared (NIR) avalanche photodiode (APD) FPAs in Geiger mode; Linear Mode APDs could not provide enough proportional gain with sufficiently low noise to make the photocurrent from a single photon detectable using existing amplifier technology. However, recent improvements in APDs, sub-micron CMOS technology, and concomitant amplifier designs have made Linear Mode single-photon-counting APDs (SPADs) possible. We analyze the potential benefits of a LADAR receiver based on Linear Mode SPADs, which include: 1) the ability to obtain range information from more than one object in a pixel's instantaneous field of view (IFOV), 2) a lower false alarm rate, 3) the ability to detect targets behind debris, 4) an advantage in the endgame, when stronger reflected signals allow dark current rejection via thresholding, and 5) the ability to record signal intensity, which can be used to increase kill efficiency. As expected, multiple laser shots of the same scene improve the target detection probability.
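The endgame thresholding argument in point 4 can be made concrete with Poisson counting statistics: raising the count threshold suppresses dark-count false alarms while barely affecting detection of a strong return. A sketch with hypothetical rates (the numbers are ours, not from the paper):

```python
import math

def poisson_tail(mean: float, k: int) -> float:
    """P[N >= k] for a Poisson-distributed photoelectron count with the
    given mean: one minus the sum of the first k Poisson terms."""
    return 1.0 - sum(math.exp(-mean) * mean ** i / math.factorial(i)
                     for i in range(k))

# Hypothetical means per range gate, with a detection threshold of 2 counts:
p_false_alarm = poisson_tail(0.1, 2)  # dark counts only: well under 1%
p_detection = poisson_tail(5.0, 2)    # strong endgame return: above 95%
```

With only dark counts present the threshold is almost never crossed, while a strong reflected signal crosses it with high probability, which is exactly the rejection behavior the abstract describes.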
Neptec has developed a vision system for the capture of non-cooperative objects on orbit. This system uses an active TriDAR sensor and a model-based tracking algorithm to provide 6-degree-of-freedom pose information in real time from mid-range to docking. The system was selected for the Hubble Robotic Vehicle De-orbit Module (HRVDM) mission and for a Detailed Test Objective (DTO) mission to fly on the Space Shuttle.
TriDAR (triangulation + LIDAR) technology takes a novel approach to 3D sensing by combining triangulation and Time-of-Flight (ToF) active ranging techniques in the same optical path. This approach exploits the complementary nature of these sensing technologies. Real-time tracking of target objects is accomplished using 3D model-based tracking algorithms developed at Neptec in partnership with the Canadian Space Agency (CSA). The system provides 6-degree-of-freedom pose estimation and incorporates search capabilities to initiate and recover tracking. Pose estimation is performed using an innovative approach that is faster than traditional techniques. This performance allows the algorithms to operate in real time on the TriDAR's flight-certified embedded processor.
This paper presents results from simulation and lab testing demonstrating that the system's performance meets the requirements of a complete tracking system for on-orbit autonomous rendezvous and docking.
Researchers at the Michigan Aerospace Corporation have developed accurate and robust 3-D algorithms for pose determination (position and orientation) of satellites as part of an ongoing effort supporting autonomous rendezvous, docking and space situational awareness activities. 3-D range data from a LAser Detection And Ranging (LADAR) sensor is the expected input; however, the approach is unique in that the algorithms are designed to be sensor independent. Parameterized inputs allow the algorithms to be readily adapted to any sensor of opportunity. The cornerstone of our approach is the ability to simulate realistic range data that may be tailored to the specifications of any sensor. We modified an open-source raytracing package to produce point cloud information from which high-fidelity simulated range images are generated. The assumptions made in our experimentation are as follows: 1) we have access to a CAD model of the target, including information about the surface scattering and reflection characteristics of the components; 2) the satellite of interest may appear at any 3-D attitude; 3) the target is not necessarily rigid, but does have a limited number of configurations; and 4) the target is not obscured in any way and is the only object in the field of view of the sensor. Our pose estimation approach then involves rendering a large number of exemplars (100k to 5M), extracting 2-D (silhouette- and projection-based) and 3-D (surface-based) features, and then training ensembles of decision trees to predict: a) the 4-D region on the unit hypersphere into which the unit quaternion [QX, QY, QZ, QW] representing the vehicle's attitude points, and b) the components of that unit quaternion. Results have been quite promising, and the tools and simulation environment developed for this application may also be applied to non-cooperative spacecraft operations, Autonomous Hazard Detection and Avoidance (AHDA) for landing craft, terrain mapping, vehicle guidance, path planning and obstacle avoidance.
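As a sketch of step a), a unit quaternion can be assigned to a hypersphere region by nearest-prototype matching, remembering that q and -q encode the same rotation. This is our illustrative reconstruction of the region-labeling idea, not the authors' decision-tree predictor:

```python
import math

def normalize(q):
    """Scale a quaternion (QX, QY, QZ, QW) to unit length."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def nearest_region(q, prototypes):
    """Assign a quaternion to the nearest region prototype on the 4-D unit
    hypersphere. Since q and -q represent the same rotation, similarity is
    measured by the absolute value of the 4-D dot product."""
    q = normalize(q)
    dots = [abs(sum(a * b for a, b in zip(q, normalize(p))))
            for p in prototypes]
    return max(range(len(prototypes)), key=lambda i: dots[i])
```

In the paper's setting the decision trees predict this region label (and then the quaternion components) from 2-D and 3-D features rather than from the quaternion itself, which is only known at training time.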
NASA's initiative for space exploration will require the development of robotic servicing and unmanned resupply of permanent space-borne facilities. An enabling technology for accomplishing these goals is a sensor system capable of supporting Rendezvous, Proximity Operations and Docking (RPOD) missions.
Marshall Space Flight Center (MSFC) conducted an experiment whose objective was to characterize sensor systems for potential use in RPOD scenarios. The MSFC experiment integrated candidate sensors with the Small Air Sled (SAS) on the air-bearing floor of the MSFC Flight Robotics Lab.
Advanced Optical Systems Inc. (AOS) has developed several different sensor technologies for Automated Rendezvous and Docking (AR&D). For the MSFC experiment, we applied AOS ULTOR advanced correlation technology as an AR&D sensor. The ULTOR system applied Automatic Target Recognition (ATR) algorithms to provide six-degrees-of-freedom (6DOF) information for target position and attitude. In addition, ULTOR provided a data-link interface to the SAS for closed loop guidance and navigation commands.
Navigational data from the ULTOR system were collected during the experiment and compared to data from an MSFC truth sensor to assess position and attitude estimation accuracy. These data are presented, along with videos recording the progression of the SAS under ULTOR control to the target.
Structured light illumination (SLI) refers to a technique for acquiring 3-D surface scans through triangulation between a camera and a projector. Because traditional SLI systems use multiple patterns projected sequentially in time, SLI is not typically associated with applications involving moving surfaces. To address this problem, the authors have introduced a technique referred to as composite pattern projection, which combines a set of standard SLI patterns into a single, continuously projected pattern such that depth can be recovered from a single captured image. As such, composite patterns can be used for tracking moving objects in 3-D space. The drawback of composite patterns, though, is the added computational complexity associated with demodulating the captured image to extract the component SLI patterns. In this paper, we introduce a means of achieving real-time pattern demodulation through the use of optical correlators, with demonstrated results achieving a processing rate of over 100 frames per second.
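The depth recovery underlying any SLI system is plain camera-projector triangulation: once the pattern is demodulated and correspondences are known, depth follows from the classic baseline relation. A generic sketch of that final step (not the authors' pipeline; the simplified pinhole geometry and names are ours):

```python
def depth_from_disparity(baseline_mm: float, focal_px: float,
                         disparity_px: float) -> float:
    """Classic triangulation: depth = baseline * focal length / disparity,
    where disparity is the camera-projector correspondence offset in pixels
    for a rectified pinhole geometry."""
    return baseline_mm * focal_px / disparity_px

# A 100 mm camera-projector baseline, 1000 px focal length, 50 px offset:
z = depth_from_disparity(100.0, 1000.0, 50.0)  # → 2000.0 mm
```

The computational burden the paper targets is not this division but the demodulation that produces the correspondences, which is why an optical correlator is attractive.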
The Space Shuttle Program requires on-orbit inspection of the thermal protection system which covers the Orbiter spacecraft, including the critical leading-edge surfaces. A scannerless ladar system mounted on a 50-foot boom extension of the robotic arm provides this capability. This paper describes the sensor and ground processing system, which were developed by Sandia National Laboratories to meet the requirements of the Return to Flight mission in July of 2005. Mission operations for this sensor system are also reviewed.
NASA contracted Neptec to provide the Laser Camera System (LCS), a 3D scanning laser sensor, for the on-orbit inspection of the Space Shuttle's Thermal Protection System (TPS) on the return-to-flight mission STS-114. The scanner was mounted on the boom extension to the Shuttle Remote Manipulator System (SRMS). Neptec's LCS was selected due to its close-range accuracy, large scanning volume and immunity to the harsh ambient lighting of space.
The crew of STS-114 successfully used the LCS to inspect and measure damage to Discovery's TPS in July 2005. The crew also inspected the external-tank (ET) doors to ensure that they were fully closed. Neptec staff performed operational support and real-time detailed analysis of the scanned features using analysis workstations at the Mission Control Center (MCC) in Houston. This paper provides a summary of the on-orbit scanning activities and a description of the detailed analysis results.
LIDAR-based systems measure the time-of-flight of a laser pulse from the source onto the scene and back to the sensor, building a wide field-of-view 3D raster image; however, because this is a scanning process, problems arise from motion inside the scene over the duration of the scan. By illuminating the entire scene simultaneously with a broad laser pulse, a 2D camera equipped with a high-speed shutter can measure the time-of-flight over the entire field of view (FOV), thereby recording an instantaneous snapshot of the entire scene. However, spreading the laser reduces the range. What is required, then, is a programmable system that can track multiple regions of interest by varying the field of regard to (1) a single direction, (2) the entire FOV, or (3) intermediate views of interest as required by the evolving scene environment. In this project, the investigators intend to add this variable illumination capability to existing instantaneous ranging hardware by using a liquid crystal spatial light modulator (SLM) beam steering system that adaptively varies the (single or multiple) beam intensity profiles and pointing directions. For autonomous satellite rendezvous, docking, and inspection, the system can perform long-range sensing with a narrow FOV while being able to expand the FOV as the target object approaches the sensor. To this end, in a previous paper we analyzed the performance of a commercially available TOF sensor (3DVSystems' Zmini) in terms of depth sensitivity versus target range and albedo. In this paper, we analyze the laser system specifications versus range and field of view when beam steering is performed by means of a Boulder Nonlinear Systems phase-only liquid crystal SLM. Experimental results show that the adjustable laser beam FOV largely compensates for the reduced reflected image grayscale from objects at long range, and they demonstrate the feasibility of extending range with the projection from the SLM.
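The ranging principle shared by both the scanned and flash approaches above is simply r = c·Δt/2, with Δt the round-trip time of the pulse. A minimal sketch:

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_range_m(round_trip_s: float) -> float:
    """Range from round-trip time of flight: r = c * t / 2.
    The factor of 1/2 accounts for the out-and-back path."""
    return 0.5 * C_M_PER_S * round_trip_s

# A ~6.67 ns round trip corresponds to a 1 m target range:
r = tof_range_m(2.0 / C_M_PER_S)  # → 1.0 m
```

The sub-nanosecond shutter and timing requirements of flash TOF cameras follow directly from this relation: 1 ns of timing error corresponds to about 15 cm of range error.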
Autonomous rendezvous and docking has become more prominent in the wake of the DART mission. In support of AR&D, NASA and companies such as ours have been developing sensors to measure distance, bearing and pose to target spacecraft. We are developing a suite of such sensors. The sensors include the Advanced Video Guidance Sensor (AVGS), the ULTOR video processor, and the Wide Angle Lidar for Direction and Distance (WALDD). AVGS is a laser-based video sensor that images retro-reflecting targets and extracts six-degree-of-freedom information. WALDD is a staring lidar system that provides range and bearing information using retro-reflecting targets. ULTOR is a video processor that can extract six-degree-of-freedom information from spacecraft that lack special targets. We will give an overview of the three sensors, their development, and their capabilities.
SUMO, the Spacecraft for the Universal Modification of Orbits, is a DARPA-sponsored spacecraft designed to provide orbital repositioning services to geosynchronous satellites. Such services may be needed to facilitate changing the geostationary slot of a satellite, to allow a satellite to be used until the propellant is expended instead of reserving propellant for a retirement burn, or to rescue a satellite stranded in geosynchronous transfer orbit due to a launch failure. Notably, SUMO is being designed to be compatible with the current geosynchronous satellite catalog, which implies that it does not require the customer spacecraft to have special docking fixtures, optical guides, or cooperative communications or pose sensors. In addition, the final approach and grapple will be performed autonomously. SUMO is being designed and built by the Naval Center for Space Technology, a division of the U.S. Naval Research Laboratory in Washington, DC. The nature of the SUMO concept mission leads to significant challenges in onboard spacecraft autonomy. Also, because research and development in machine vision, trajectory planning, and automation algorithms for SUMO is being pursued in parallel with flight software development, there are considerable challenges in prototyping and testing algorithms in situ and in transitioning these algorithms from laboratory form into software suitable for flight. This paper discusses these challenges, outlining the current SUMO design from the standpoint of flight algorithms and software. In particular, the design of the SUMO phase 1 laboratory demonstration software is described in detail. The proposed flight-like software architecture is also described.
An overview of the sensor assembly for an upcoming responsive space demonstration is provided. The "top down" methodology establishing the design baselines, including the sensor array selection, is described in the context of a responsive space payload. The detailed design is then presented. Finally, new data products obtained with the engineering and flight model assemblies for the imager are analyzed and discussed.
A method is derived from Kepler's laws of motion that allows the determination of slant range for orbiting targets given monocular, angles-only measurements. The method is shown to work without knowledge of three parameters: the universal gravitational constant, the mass of the central body, and the time scale. Monte Carlo trials with noisy data sets, however, show that the method is much more sensitive to measurement noise than competing methods that require knowledge of these parameters.
A near-space, high-altitude balloon mission (BalloonWinds) is planned to demonstrate the performance of a direct detection wind LIDAR instrument. The program is a NOAA-funded initiative to demonstrate direct detection, fringe-imaging Doppler Wind LIDAR (Light Detection and Ranging) technology. BalloonWinds will involve a series of high-altitude missions (~30 km), each lasting 8-10 hours, scheduled for launch in 2006 to validate wind LIDAR technology from a near-space platform. With the promise of responsive, affordable launch vehicles and near-space platforms, there is an opportunity to demonstrate launch-on-demand capability of low-cost instruments that can provide regional or global wind data. It has been well established that direct measurement of winds would improve weather forecasting accuracy and hurricane landfall prediction and would provide benefits to government agencies and the public at large. An overview of the BalloonWinds instrument design and near-space flight plan is presented in this paper, as well as a concept design for a low-cost, 6-12 month space mission. Instrument performance simulations are used to demonstrate the feasibility and effectiveness of the low-cost approach for global wind sounding compared to traditional mission concepts.
The Air Force Research Laboratory's Space Vehicles Directorate (AFRL/VS) and the Department of Defense Space Test Program (STP) are two organizations that have partnered on more than 85 missions since 1968 to develop, launch, and operate Research and Development, Test and Evaluation (RDT&E) space missions. As valuable as these missions have been to the follow-on generation of operational systems, they are consistently under-funded and forced to execute on excessively ambitious development schedules. Due to these constraints, space mission development teams that serve the RDT&E community are faced with a number of unique technical and programmatic challenges. AFRL and STP have taken various approaches throughout the mission lifecycle to accelerate their development schedules without inflating cost or sacrificing system reliability. In the areas of test and operations, they currently employ one of two strategies. Historically, they have sought to avoid the added cost and complexity associated with coupled development schedules and have segregated the spacecraft development and test effort from the ground operations system development and test effort. However, because these efforts have far more in common than they have differences, they have more recently attempted to pursue parallel I&T and operations development and readiness efforts. This paper seeks to compare and contrast the "decoupled test and operations" approach, used by such missions as C/NOFS and Coriolis, with the "coupled test and operations" approach, adopted by the XSS-11 and TacSat-2 missions.
One of the most costly components of the on-orbit operation of a spacecraft is the personnel who execute the mission. Historically, for Air Force Research Laboratory (AFRL) and Department of Defense Space Test Program (STP) research and development, test and evaluation (RDT&E) space missions, a team of fifteen personnel maintains 24-hour coverage for the three-week Launch and Early Operations (L/EO) phase of the mission and four one-week L/EO rehearsals. During the Nominal Operations phase of the mission, 2.5 "man-days" of support are necessary each day that the spacecraft remains on orbit, as well as during the two week-long nominal operations rehearsals. Therefore, the mission-dedicated personnel contribution to the cost of a one-year mission is more than eleven man-years, and this does not include the personnel who actually operate the antennas at the various remote ground facilities or develop and maintain the mission-specific or shared-use ground network, hardware, and software. In the low-budget RDT&E world, hardware, software, or Concept of Operations (CONOPS) developments that significantly reduce the necessary operations personnel investment can mean the difference between a mission that does or does not survive. This paper explores the CONOPS and suite of tools that the TacSat-2 program has put together to achieve maximum mission effectiveness at minimum manpower cost.