In June 1994, the U.S. Navy and U.S. Marine Corps restructured the F/A-18D (RC) ATARS program. McDonnell Douglas Aerospace (MDA) was selected as the prime contractor responsible for overall program management and integration of the ATARS with the F/A-18D (RC) weapon system. Loral Fairchild Systems (LFS) was selected as the subcontractor responsible for ATARS subsystem development and integration. The first part of the restructured program, called Element One, was a ten-month risk-reduction effort ending in April 1995. The primary objective of Element One was to resolve the critical ATARS anomalies found during previous flight testing. To resolve these anomalies, MDA and LFS concurred that additional flight test data were required. In November and December 1994, eight flights were flown with a high-fidelity instrumentation system to gather the additional data needed to evaluate the anomalies. Using these data, along with previously collected data, design 'fixes' were implemented to correct several of the system anomalies. In April and May 1995, approximately four flight tests were conducted to evaluate the performance of these design 'fixes.' A two-month flight test period was planned for July-August 1995 to complete this evaluation. This presentation describes the testing and results through May 1995 of the Element One flight test program.
Recon/Optical, Inc. (ROI) has pioneered the electro-optical (E-O) generation of framing sensors with the CA-260, a KS-87 form/fit camera with a wafer-scale focal plane array (FPA) containing a patented, on-chip, forward motion compensation (FMC) architecture. The technology has now matured to the point where production E-O framing cameras are form/fit replacements for their former film counterparts. During this interim production phase, flight demonstrations and tests are continuing to prove that E-O framing produces high-quality imagery; is robust to various platforms and mission tactics; is interoperable with existing and planned C3I architectures; is affordable and available; and meets the war-fighter's needs. This paper discusses flight test results of the CA-260 E-O framing sensor flown in the F-14A TARPS during September 1994. This demonstration provided some unique imagery permitting a comparison of low-light-level, in-flight FMC-on versus FMC-off performance. A first-level comparison of the resulting imagery, based upon predicted FMC performance and post-processing numerical analysis, is presented. The results indicate that the patented FMC architecture performed as predicted and that, for low-light conditions resulting in limited-SNR images, on-chip FMC can provide a significant image quality improvement over post-processing alternatives.
The BREVEL ground control station is a member of a family of stations developed by Matra Cap Systemes for unmanned air vehicles (UAVs). Missions include reconnaissance, electronic warfare, strike, meteorology, and nuclear, biological, and chemical (NBC) sensing. After a functional description, an overview of the hardware and software architecture is given. The hardware description covers computer equipment such as workstations and interfaces, and shelter integration.
A new workstation for airborne and satellite multisensor (electro-optic, infrared, radar, film) photo interpretation has been developed. After a functional description, an overview of the hardware and software architecture is given. The software has been designed with object-oriented technology, reusing existing software primarily dedicated to civil remote sensing.
Matra Cap Systèmes developed a transportable station capable of tracking remote sensing satellites and acquiring and processing the SPOT telemetry. The full station is C-130 transportable, easy to assemble, and can be used for near-real-time data acquisition in disaster monitoring or where conventional station coverage is lacking. The station consists of an Antenna Subsystem set on a trailer, connected to a Shelter Subsystem housing a Processing Subsystem and the indoor control and receiving equipment of the Antenna. The Processing Subsystem offers acquisition control, satellite programming, and quick processing and nominal processing of satellite data up to standard levels.
In 1994, a series of lightning-started wildfires burned over a gross area of approximately 300,000 acres of forest land on the Payette National Forest in central Idaho. This fire complex was one of the most significant fire events within the United States in 1994 and required millions of dollars and thousands of people to manage. Airborne reconnaissance using a variety of sensors, digital satellite natural resource data, Global Positioning System (GPS) equipment, and geographic information systems (GIS) were used in support of fire suppression actions. This paper reviews one facet of the Payette National Forest wildfire suppression effort of 1994.
Digital cameras are a recent development in electronic imaging that provide a unique capability to acquire high-resolution digital imagery in near real time. The USDA Forest Service Nationwide Forestry Applications Program has recently evaluated natural color and color infrared digital camera systems as a remote sensing tool for collecting resource information. Digital cameras are well suited for small projects and complement the use of other remote sensing systems to perform environmental monitoring, sample surveys and accuracy assessments, and updates of geographic information system (GIS) databases.
This paper reviews the use of LED recording head assemblies (RHAs) for film annotation in aerial reconnaissance cameras and discusses code matrix block readers (CMBRs). Annotation of video imagery is also covered.
Multispectral imagery has been used by the geoscientific communities for more than 25 years. Problems have arisen, however, from what appears to be a growing disconnect between digital image processing and traditional interpretation. Mainly, the problems relate to the use of algorithms -- of which there are thousands to choose from -- without proper consideration of interpretation foundations. The unfortunate results suggest that current instructional efforts, as dynamic and sophisticated as they may be, are geared toward processing techniques but lack interpretive foundations. Current trends suggest this may be changing, but more work needs to be devoted to joining processing and interpretation in the context of data utilization. This is particularly true when one considers the products that are emerging as industry 'standards,' whereby multispectral scientists must have a firm understanding of linear analytical strategies if their eventual outputs are to have sufficient reliability and validity.
Video imaging under low-light-level conditions necessitates a light amplification device to overcome the inherent lack of sensitivity of today's CCD video cameras. Traditionally, the microchannel plate image intensifier developed for military night vision goggles has been employed in front of a CCD camera to achieve low-light-level video imaging. Until recently, the variety of these devices was limited to two types, the Gen II and Gen III. These offered good sensitivity, spectral response, and usable resolution. Recent developments have greatly increased the variety of spectral responses and quantum efficiencies and have improved resolution within the Gen II and Gen III types. These new intensifiers, coupled to today's CCD cameras and accompanying electronic controls, have made possible low-light-level video imaging previously unattainable.
The Zeiss KS-153A aerial reconnaissance framing camera complements satellite, mapping, and remote sensor data with imagery that is geometrically correct. KS-153A imagery is in a format for tactical 3-D mapping, targeting, and high-resolution intelligence data collection. The system is based upon rugged microprocessor technology that allows a wide variety of mission parameters. Geometrically correct horizon-to-horizon photography, multi-spectral mine detection, stand-off photography, NIRS nine high speed, and very low altitude anti-terrorist surveillance are KS-153A capabilities that have been proven in tests and actual missions. Civilian use of the KS-153A has ranged from measuring flood levels to forest infestations. These are everyday tasks for the KS-153A throughout the world. Zeiss optics have superb spectral response and resolution. Surprisingly effective haze penetration was shown on a day when the pilot himself could not see the terrain. Tests with CCD arrays have also produced outstanding results. This superb spectral response can be used for camouflage detection in wartime, or for effective environmental monitoring in peacetime, with its ability to detect subtle changes in the signature of vegetation, calling attention to man-induced stress such as disease, drought, and pollution. One serious man-induced problem in many parts of the world deserves even more attention in these times: the locating and safe removal of mines. The KS-153A is currently configured with four different optics. High-acuity horizon-to-horizon Pentalens and Multi-spectral Lens (MUC) modules have been added to the basic KS-153A with Trilens and Telelens. This modular concept meets nearly all of today's airborne reconnaissance requirements. Modern recce programs, for example the German Air Force Recce Tornado (GAF Recce), have selected the KS-153A.
By simply adding additional focal-length lens assemblies to an existing KS-153A configuration, the user can instantly and economically adapt the system to a different mission requirement. An Nd:YAG laser range finder, MIL-STD-782 data annotation, and GPS-based flight management are off-the-shelf Zeiss products that can be used with the KS-153A. Aircraft wiring or control system modifications are generally not required when the KS-153A is installed in modern aircraft such as the F-14, F-15, and F-16.
Advanced manned and unmanned Recce Systems are using electro-optical (EO) sensors for real-time image acquisition and for their imagery enhancement capabilities, at the cost of some resolution. These sensors have eliminated film and use all-solid-state components. But have they really? They have pushed a media problem out of the sensor and into the tape recorder. When the ATARS requirements were written, and during the period it was being developed, there seemed to be no alternative to tape recorders. But today, with the proliferation of telecommunication electronics, memory chip development has accelerated. Solid state video recorders for avionics applications have become a reality, as they have for space system applications. This move to EO sensors for real-time benefits has not yet reaped the benefit of computerized mission planning with its associated solid state data transfer capability. As previously reported, Recce Cycle shortening capability has existed for several years but is still not fully implemented. As elsewhere, application of new technologies to our historic problems can produce low-risk solutions for Recce Systems. Applications of solid state memory technology to Recce Systems are presented in this paper.
Emerging trends in airborne reconnaissance equipment are establishing the need for an on-board management capability to control the equipment. The increasing flexibility of sensor, storage, and communication subsystems drives the need for a data management function to select the optimum combination of subsystem modes needed to meet an overall system operational requirement. Common flexible interfaces, both on-board and off-board, permit in-flight adaptation of the system configuration to meet changing data requirements. The ability of communication systems to adapt to jamming environments also drives the need for automatic data management.
The errors introduced into reconstructed RECCE imagery by ATARS DPCM compression are compared to those introduced by the more modern DCT-based JPEG compression algorithm. For storage applications in which uncompressed sensor data is available, JPEG provides better mean-square-error performance while also providing more flexibility in the selection of compressed data rates. When ATARS DPCM compression has already been performed, lossless encoding techniques may be applied to the DPCM deltas to achieve further compression without introducing additional errors. The abilities of several lossless compression algorithms, including Huffman, Lempel-Ziv, Lempel-Ziv-Welch, and Rice encoding, to provide this additional compression of ATARS DPCM deltas are compared. It is shown that the amount of noise in the original imagery significantly affects these comparisons.
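To illustrate the idea of lossless coding of DPCM deltas (a sketch only, not the paper's implementation or data), the following Python fragment forms first-order DPCM deltas for a synthetic smooth scan line and compresses them with zlib, a DEFLATE (LZ77 + Huffman) coder from the same Lempel-Ziv/Huffman family compared in the paper. Smooth imagery produces small deltas clustered near zero, which is exactly what entropy coders exploit:

```python
import zlib

def dpcm_deltas(samples):
    """First-order DPCM: each delta is the difference from the previous sample."""
    prev = 0
    deltas = []
    for s in samples:
        deltas.append((s - prev) & 0xFF)  # wrap each delta to one byte
        prev = s
    return bytes(deltas)

# Synthetic, slowly varying scan line (values 100-119, period 32).
line = [int(100 + 20 * (i % 32) / 32) for i in range(4096)]
deltas = dpcm_deltas(line)

# The delta stream is highly compressible because most deltas repeat.
print(len(deltas), len(zlib.compress(deltas)))
```

Adding random noise to `line` would shrink the compression gain, mirroring the paper's observation that noise in the original imagery significantly affects the comparisons.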
Modeling of visible band line scan EO sensors in the flight test scenario involves the computation of ground resolved distance (GRD) based on the minimum resolvable delta-radiance (MRDR) criteria for visual perception of tribar targets on an output display device. MRDR is driven by the basic sensor parameters of noise equivalent delta-radiance (NEDR) and modulation transfer function (MTF). Therefore, the modeling process can provide a sequence of steps to reveal the linkage from the basic sensor parameters that produce NEDR and MTF, to expected lab measurements, and then finally to expected GRD performance in the flight test scenario. The MRDR vs GRD curves reveal the EO sensor sensitivity to parameters such as sun elevation, target reflectivity, visibility, and aircraft ground velocity. GRD is not a single fixed value but rather exhibits variability based on both observer statistics and flight test scenario factors.
Possible uses of photoelectric structures with memory (PESM) in airborne reconnaissance are discussed. It is shown that basic PESM properties and operational abilities can be used to solve widespread image processing tasks such as image contouring, correlation, moving object selection, object dynamics determination, and so on. These and other operating modes of the PESM can be realized in real time without any computer processing. Being an integrated, highly intelligent, low-cost videosensor-and-processor device, the PESM is capable of executing many tasks in total and local surveillance missions. PESM application to unmanned and short-lifetime target observation is an especially good prospect.
This paper describes the operating characteristics of a line-scan system in both two-dimensional and stereoscopic arrangements. Details are given on algorithms that have been developed to extract range information, and results are presented indicating the accuracies obtained in all three coordinate axes. The application of this stereoscopic system to both semi-autonomous and autonomous mobile platform arrangements is discussed, and the advantages and disadvantages of incorporating such a system are identified.
Laser communication between high-flying aircraft, such as high-altitude unmanned aerial vehicles, offers the potential to transfer extremely large amounts of information faster and with a much smaller package than is possible using current radio frequency and microwave technologies. This can be especially important in transferring time-sensitive reconnaissance information, because the value of the data can deteriorate rapidly with time. BMDO has funded a number of technology efforts through the U.S. Army Space and Strategic Defense Command to reduce the risks associated with laser communications. One of these efforts, at ThermoTrex Corporation in San Diego, California, is now being carried forward toward an advanced technology demonstration. The program leads to the demonstration of high-data-rate communications of 270 Mbps (megabits per second) to 1.2 Gbps (gigabits per second) between high-altitude aircraft and between a satellite and the ground. The laser communications terminals incorporate atomic line filter technology for background light rejection during acquisition, reactionless Roto-Lok offset cable drive gimbals for fast slewing and high-accuracy pointing, and direct digital modulation of semiconductor diode lasers detected with low-noise avalanche photodiodes. We present results of a 42 km, 1.2 Gbps laser communications demonstration performed at the NASA/JPL Table Mountain facility in Wrightwood, California; a 10 km, 1.2 Gbps demonstration at NRAD in San Diego, California; and preliminary results of a 150 km, 1.2 Gbps demonstration between the islands of Maui and Hawaii.
Itek is developing a low-cost, small, compact, lightweight electro-optical/infrared (EO/IR) dual-band reconnaissance sensor for tactical as well as high-altitude standoff applications. This sensor is a derivative of SYERS, an operational U.S. Government system developed by Itek. The DB-110's direct viewing reflective optic sensor is inertially stabilized to provide high-resolution imagery when operating in severe vibration environments. Its visible silicon CCD focal plane with TDI (time delay integration) and high quantum efficiency indium antimonide (InSb) IR focal plane provide continuous ground coverage, spot coverage, and stereo coverage in both bands simultaneously or individually, over a wide range of operational conditions. The sensor's small size and its light weight permit pod installation or in-board installation in various high-performance aircraft and unmanned aerial vehicles. The DB-110 sensor system, currently under development, includes a reconnaissance management system (RMS) and is intended to interface with various airborne digital tape or solid state recorders and digital datalinks for real-time transmission to ground stations. The first unit is scheduled to be ready for flight testing in the spring of 1996.
The advantage of panchromatic imaging at wavelengths between 1.1 and 2.5 micrometers [short-wave infrared (SWIR)] over that at 0.5 to 1.0 micrometers [visible and near wave infrared (NWIR)] is shown by analysis and experiment in this paper. At long ranges and under low-visibility conditions, the signal-to-noise ratio and image quality in the SWIR are significantly better than in the NWIR and visible spectral bands. This effect can be utilized to great advantage in airborne reconnaissance to extend the range of coverage and to improve the interpretability of the product. Such improvements apply to ground-based and spaceborne systems as well. Other system benefits are derived by utilizing the SWIR in place of the NWIR wavelength region. Stabilization requirements can be relaxed; larger optical fabrication, alignment, environmental, and boundary-layer wavefront errors can be tolerated; and less degradation occurs due to atmospheric turbulence and dispersion error. SWIR systems can be fabricated with some of the same optical materials available in the NWIR and visible systems. All these effects, which together comprise the SWIR Advantage, lead to a simpler, less expensive, and more capable imaging system.
The surveillance of large areas with a high resolution is limited by the CCD technology, the read-out electronics, and the high data-collection and storage rates. An imaging concept that includes an optical camera and a survey strategy is proposed for remote landing-site certification. A one-meter diameter telescope performs a diffraction-limited imaging over its field of view, imaging a 0.25 by 0.25 m area on the (Martian) surface onto a 7 by 7 micrometer pixel. With the sensor at 350 km above the 10 by 10 km site, this pushbroom imaging configuration incorporates only five sensor passes over the site. The mission time is decreased by nearly 100% over the previously proposed concept for the site certification imaging.
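The geometry quoted above can be cross-checked with a short sketch (the mid-visible wavelength is an assumption not stated in the abstract): mapping a 0.25 m ground patch onto a 7 micrometer pixel from 350 km implies roughly a 9.8 m focal length, and the Rayleigh diffraction limit of the 1 m aperture projected to the surface is about 0.23 m, consistent with diffraction-limited sampling at 0.25 m:

```python
# Numbers taken from the abstract; the wavelength is an assumed value.
altitude_m = 350e3        # sensor altitude above the site
gsd_m = 0.25              # ground patch imaged onto one pixel
pixel_m = 7e-6            # detector pixel pitch
aperture_m = 1.0          # telescope diameter
wavelength_m = 0.55e-6    # assumed mid-visible wavelength

# Focal length implied by the pixel-to-ground mapping:
# pixel / focal_length = gsd / altitude.
focal_length_m = altitude_m * pixel_m / gsd_m

# Rayleigh diffraction limit projected onto the surface.
diffraction_gsd_m = 1.22 * wavelength_m / aperture_m * altitude_m

print(focal_length_m, diffraction_gsd_m)
```

The two numbers show the design is sampled close to its diffraction limit, which is what makes the one-meter telescope necessary for this site-certification task.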
A push-broom imaging camera with time expansion, selected for its ability to generate images with high resolution and high radiometric signal, is described for accurate site certification from space. High-resolution imaging requires a sensor with an increased dwell time to generate a high radiometric signal. This may be accomplished by pointing the camera at each pixel for a longer interval of time than that available due to the sensor motion in the push-broom imaging configuration; this is referred to as push-broom imaging with time expansion. The camera with time expansion may be applicable to any remote sensing imaging problem that simultaneously requires high spatial resolution and a high level of radiometric signal. For surveying a Martian landing site, it is necessitated by imaging from an autonomous orbiting sensor whose speed is determined by its orbit and the planet's mass.
We propose a new technique for remote sensing: photon-counting laser mapping. Microchannel plate detectors with crossed delay-line (MCP/CDL) readout combine high position accuracy and sub-nanosecond photon timing, at event rates of 10⁶ detected photons per second and more. A mapping system would combine an MCP/CDL detector with a fast-pulse, high-repetition-rate laser illuminator. The system would map solid targets with exceptional range and cross-range resolution. The resulting images would be intrinsically three-dimensional, without resorting to multiple viewing angles, so that objects of identical albedo could be discriminated. For a detector time resolution and pulse width of order 10⁻¹⁰ seconds, the in-range resolution would be a few centimeters, allowing the discrimination of surfaces by their textures. Images could be taken at night, at illumination levels up to full moonlight, from ground, airborne, or space platforms. We discuss signal-to-noise as a function of laser flux and background level.
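The quoted in-range resolution follows directly from round-trip pulse timing: a timing uncertainty of Δt maps to a range uncertainty of cΔt/2, since the light travels out and back. A minimal check of the figure:

```python
C = 2.998e8   # speed of light, m/s
dt = 1e-10    # detector time resolution / pulse width, s (10^-10 s)

# Round-trip ranging: the factor of 2 accounts for the out-and-back path.
range_resolution_m = C * dt / 2
print(range_resolution_m)  # on the order of 0.015 m, i.e. about 1.5 cm
```

This matches the abstract's "few centimeters" claim for a 10⁻¹⁰ second timing resolution.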