High-fidelity simulation of active tracking systems requires integrating optical, imaging, control, and structural models of the payload with representative target and engagement models. The dependence of the actively illuminated target return on beam-control-system pointing commands particularly motivates an integrated simulation. Simulations of tracking and pointing systems have been developed to support algorithm refinement and performance prediction for several acquisition, tracking, and pointing experiments. The simulation includes high-fidelity two-axis control-system models, 2D imaging-sensor models, and 3D target geometry and reflectivity models. Key issues addressed include the effects on residual pointing error of illuminator jitter coupling into the track-error estimates, speckle, target-reflectivity variations, and control-system interactions. The simulation has been implemented with commercial PC-class hardware and signal-processing tools, using databases of target-specific geometry and reflectivity maps as a function of engagement timeline. The simulation approach makes it particularly easy for control-system and sensor-system engineers to integrate discipline-specific models into a system simulation.
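The coupling described above — illuminator jitter leaking into the track-error estimate of a closed pointing loop — can be illustrated with a toy discrete-time simulation. This is a minimal sketch, not the abstract's simulation: the two-axis PI loop, the Gaussian illumination model, the gains, and the 10% jitter-leakage factor are all illustrative assumptions.

```python
import numpy as np

def simulate_track_loop(n_steps=2000, dt=1e-3, kp=40.0, ki=400.0,
                        beam_sigma=1.0, jitter_rms=0.05, seed=0):
    """Toy two-axis closed-loop tracker. The track-error estimate is
    derived from the actively illuminated target return, so illuminator
    jitter couples into the measurement (the dependence the text notes)."""
    rng = np.random.default_rng(seed)
    los = np.zeros(2)                 # line-of-sight pointing state (two axes)
    integ = np.zeros(2)               # PI integrator state
    target = np.array([1.0, -0.5])    # fixed target direction (arbitrary units)
    errs = []
    for _ in range(n_steps):
        illum = los + jitter_rms * rng.standard_normal(2)   # jittered illuminator
        # return strength falls off as the beam walks off the target
        gain = np.exp(-np.sum((illum - target) ** 2) / (2 * beam_sigma ** 2))
        # assumed 10% of illuminator jitter leaks into the error estimate
        meas = (target - los) + 0.1 * (illum - los)
        integ += meas * dt
        los += (kp * meas + ki * integ) * dt * gain          # PI correction
        errs.append(np.linalg.norm(target - los))
    return np.array(errs)

residual = simulate_track_loop()      # residual pointing error vs. time step
```

Even in this crude form, the jitter-limited floor of `residual` shows how illuminator motion bounds the achievable pointing error.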
Requirements for image resolution can be used to set upper limits on the allowable line-of-sight (LOS) motion of an acquisition, tracking, and pointing (ATP) system. Image resolution matters both for image-based tracking algorithms and for the typical ancillary requirement of gathering target-phenomenology data. During the system design phase of an ATP platform, base-motion-disturbance details such as total rms power and its spectral distribution may not be known for primary disturbance sources such as gimbals, cooling systems, and steering mirrors. In this case, setting upper limits on allowable LOS jitter is an important criterion in the trade-study analyses for these components. The effect of jitter is frequency dependent and can be partitioned into regimes based on the image sample rate of the system. Image-resolution requirements for the High Altitude Balloon Experiment are used to set allowable LOS motion for random, sinusoidal, and linear disturbances. Three frequency regimes are identified, each with a different allowable-motion amplitude. This top-level systems methodology can be applied to many imaging applications, such as estimating the blur induced by wind loading of ground-based telescopes.
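The frequency dependence of jitter-induced blur can be sketched numerically: a sinusoidal LOS disturbance well below the frame rate moves little within one exposure, while one well above it contributes its full rms as smear. The exposure time, amplitude, and phase-averaging below are illustrative assumptions, not the paper's numbers.

```python
import numpy as np

def within_frame_rms(freq_hz, amp, t_exp, n=1000):
    """RMS LOS excursion about the frame-average centroid during one
    exposure, for a sinusoidal disturbance amp*sin(2*pi*f*t + phi),
    averaged over starting phase phi."""
    rms2 = 0.0
    phases = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
    t = np.linspace(0.0, t_exp, n)
    for phi in phases:
        x = amp * np.sin(2.0 * np.pi * freq_hz * t + phi)
        rms2 += np.mean((x - x.mean()) ** 2)   # smear about the frame centroid
    return np.sqrt(rms2 / len(phases))

t_exp = 1e-2                                   # assumed 10 ms exposure (100 Hz)
low  = within_frame_rms(1.0,    1.0, t_exp)    # f << frame rate: little blur
high = within_frame_rms(1000.0, 1.0, t_exp)    # f >> frame rate: ~amp/sqrt(2)
```

The low-frequency residue appears instead as frame-to-frame motion that the tracker can follow, which is why allowable-motion limits differ by regime.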
The High Altitude Balloon Experiment (HABE) is developing technologies important to resolving critical acquisition, tracking, and pointing issues. An introduction to the technology papers in this session is given, along with an update on the progress of the experiment. Contributions in the areas of controls, sensors, optics, systems analysis, and mission planning are presented, along with progress in system and subsystem testing.
The mission of the High-Altitude Balloon Experiment (HABE) is to acquire supporting data, validate enabling technologies, and resolve critical acquisition, tracking, and pointing (ATP) and fire-control issues in support of future space-based precision pointing experiments. The use of high-altitude balloons offers a relatively low-cost, low-vibration test platform, a recoverable and reusable payload, worldwide launch capability, and a 'near-space' emulation of the operational scenarios of future space systems. The HABE platform design is based on several previous spacecraft designs and includes coarse gimbal pointing, infrared and visible passive tracking, active fine tracking, internal auto-alignment and boresighting, and precision line-of-sight (LOS) stabilization functions. A broad overview of the HABE balloon and payload system is presented, and the similarities and differences between high-altitude balloon and spacecraft design approaches are discussed. The special design features and operational conditions for ATP experiments aboard high-altitude balloon platforms are reviewed, with HABE used as a design reference.
Extensive research has shown that including target aspect-angle measurements from an optical sensor can significantly improve the performance of radar tracking systems. Integrating sequences of target imagery with the kinematic information involves sets of image-processing and sensor-data-fusion algorithms. A workstation has been developed to expedite the analysis of the algorithms and to integrate the image processing with selectable extended-state tracker modules. This workstation can access analog video imagery from a video optical disk controlled by a PC, segment the target in the image, and perform target identification and aspect-angle estimation using a database of target models which span the range of possible aspects. The angle information is then 'fused' with kinematic data to augment the tracker state estimator. The workstation is implemented with a powerful visual user interface in a UNIX/X-Windows environment and includes a wide array of image- and signal-processing algorithms. Interactive modifications of processing sequences and 'what if' analyses are easily conducted. The workstation provides a consistent user interface across a variety of applications. This system has also been used to implement phase retrieval and related image-recovery algorithms.
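One common way such 'fusion' is realized — sketched here as an assumption, since the abstract does not specify the tracker's internals — is an extended Kalman filter whose constant-velocity state is updated both by kinematic position measurements and by an optical aspect angle, treated as a measurement of the heading atan2(vy, vx). All model parameters below (`dt`, noise variances) are illustrative.

```python
import numpy as np

def ekf_with_aspect(z_pos, z_ang, dt=0.1, r_pos=1.0, r_ang=0.01):
    """Constant-velocity EKF on state [x, y, vx, vy]; each step fuses a
    position fix and an aspect-angle measurement of the heading."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt        # constant-velocity dynamics
    Q = 1e-3 * np.eye(4)
    x = np.zeros(4); P = 10.0 * np.eye(4)
    Hp = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])
    for zp, za in zip(z_pos, z_ang):
        x = F @ x; P = F @ P @ F.T + Q           # predict
        S = Hp @ P @ Hp.T + r_pos * np.eye(2)    # linear position update
        K = P @ Hp.T @ np.linalg.inv(S)
        x = x + K @ (zp - Hp @ x); P = (np.eye(4) - K @ Hp) @ P
        vx, vy = x[2], x[3]
        if vx**2 + vy**2 > 1e-6:                 # nonlinear aspect-angle update
            Ha = np.array([[0, 0, -vy, vx]]) / (vx**2 + vy**2)  # Jacobian
            s = Ha @ P @ Ha.T + r_ang
            K = (P @ Ha.T) / s
            resid = (za - np.arctan2(vy, vx) + np.pi) % (2 * np.pi) - np.pi
            x = x + (K * resid).ravel(); P = (np.eye(4) - K @ Ha) @ P
    return x

# hypothetical target: speed 5 at a 30-degree heading, noise-free measurements
t = np.arange(50) * 0.1
h = np.deg2rad(30.0)
z_pos = 5.0 * t[:, None] * np.array([np.cos(h), np.sin(h)])
est = ekf_with_aspect(z_pos, np.full(50, h))
```

The direct angle update is what lets aspect information sharpen the velocity-direction estimate beyond what position fixes alone provide.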
High-resolution space-based imaging applications are limited by the difficulty of placing large monolithic mirrors in space and by technology limitations on the diameter achievable in monolithic mirrors. Multiple-mirror imaging systems can overcome these limitations but require precise alignment-error sensing and correcting schemes to maintain all elements in phase. When a wide field of view is desired, the complexity increases substantially, since significant error terms become functions of field angle. Approaches that reduce the complexity of the error-sensing and correcting schemes are thus of great interest. By sampling selected spatial frequencies representative of both the individual subapertures and the errors between subapertures, all error terms except absolute piston can be measured. A technique which places a nonredundant mask in the compacted pupil plane of a phased-array imager and senses the selected spatial-frequency magnitude and phase in the focal plane has been analyzed. This technique can reduce complexity in the local error-sensing system while accounting for all tilt, geometry, magnification, and relative piston errors.
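The core principle — that the phase of the focal-plane image's Fourier component at a subaperture-pair baseline frequency encodes the relative piston between that pair — can be checked in a minimal one-dimensional sketch. The aperture sizes, baseline, and piston value below are arbitrary assumptions, and the full nonredundant-mask geometry of the analyzed technique is not modeled.

```python
import numpy as np

n = 1024
x = np.arange(n)
# two 1-D subapertures (width 63 samples) separated by a 256-sample baseline
ap1 = (np.abs(x - 300) < 32).astype(complex)
ap2 = (np.abs(x - 556) < 32).astype(complex)
piston = 0.7                              # relative piston phase (radians)
pupil = ap1 + ap2 * np.exp(1j * piston)

psf = np.abs(np.fft.fft(pupil)) ** 2      # focal-plane intensity
otf = np.fft.ifft(psf)                    # = autocorrelation of the pupil
baseline = 556 - 300                      # pair separation in samples
measured = np.angle(otf[baseline])        # phase at the baseline frequency
```

Because the two-aperture baseline is nonredundant, only that pair contributes at the sampled frequency, so `measured` recovers the relative piston unambiguously (modulo 2*pi); absolute piston, as the text notes, remains unobservable.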