A Naval career, especially one that includes a Washington tour, conditions one to "shift gears" on a moment's notice. With experience, you become increasingly confident that you will rarely be caught off guard. Yet, although I have been aware of this engagement for some time, I must admit to a certain degree of amazement at being here, addressing so large a group, brought together by a mutual interest in aerial reconnaissance. Perhaps the vacuity which has beset this important field is coming to an end.
The F-5 family of tactical fighters is presently being utilized by 20 countries throughout the world as lightweight, highly maneuverable fighters. In many cases, these user countries must have the capability of performing multiple missions, including reconnaissance, with the same aircraft without any appreciable penalty in performance, maintainability, or cost. The purpose of this paper is to present design development information on reconnaissance capabilities of the F-5 family of tactical aircraft. Comments are included on external recon pod versus internal recon installations and their effect on the F-5 aircraft with its very slender fuselage and small cross sectional area.
In pursuing the development of airborne tactical reconnaissance pods for Navy programs, the main areas addressed were the tactical operational requirements and limitations imposed by the aircraft interface. These requirements and limitations established the design philosophy and sensor selection for these programs. The paper describes how various sensors, in concert with avionic systems, can be utilized by existing aircraft to collect multisensor reconnaissance information in an operational environment. The impact of design tradeoffs when interfacing an existing aircraft, as opposed to developing a new sensor platform, is discussed. Flight test results of a multisensor reconnaissance pod system installed on a tactical aircraft are presented. Problem areas relating to sensors and pods are presented, as well as approaches to overcome these problems.
Grumman efforts in the design and study of tactical reconnaissance systems, and particularly studies of the application of pods for such reconnaissance missions, have been conducted over the past several years, as shown in Figure 1. Most of this effort has been done with corporate funds supplemented by some funded studies from the U. S. Navy.
Remotely Piloted Vehicles (RPV's) are versatile aerial platforms which can be piloted by radio link from a remote control point. The latest model features a removable modular nose designed to carry a variety of payloads for reconnaissance, electronic warfare and strike missions. This paper describes the interface of three sensor systems to RPV's: the KS-120 camera, the Perkin-Elmer KA-98 prototype laser line scanner, and the prototype Philco Ford laser target designator. The interface to the modular nose YBGM-34C multi-mission RPV is discussed along with the interfaces to the BGM-34B and AQM-34M RPV's. A brief history of reconnaissance RPV's is given.
An overall system configuration has been developed which effectively integrates the Navy Optical Reconnaissance Pod System into the single place A-7E attack aircraft. The objective of this effort was to provide a comprehensive reconnaissance capability yet minimize the change impact on the aircraft. The reconnaissance pod is wing mounted and provides the aircraft with the capability of obtaining high resolution aerial photographs over a broad range of aircraft speed and altitude conditions. Existing aircraft subsystems such as the central digital computer, the head up display, and associated cockpit controls are used to provide both manual and automatic modes of sensor operation. A TV system is used in conjunction with the head up display to provide complete viewfinder capability with display of steering commands, target anticipation cues, and sensor status to the pilot for workload reduction. Flight performance data to fulfill sensor operation and film annotation requirements is derived from the navigation system and is transmitted to the pod via a serial digital data channel. System operation utilizes currently established cockpit control and display philosophy and techniques.
The non-uniform and fluctuating air density field surrounding a high performance aircraft can significantly affect the performance of visual and infrared imaging sensors. This paper addresses each of the classic types of flow field phenomena and develops theory for incoming optical wavefronts. Parametrics are presented for several flight conditions, and guidelines are discussed for minimizing sensor performance degradation due to these effects.
High resolution cameras rarely perform as expected in aircraft, often due to vibration problems. Proper understanding and use of classical dynamics for the whole system, from optical axis to aircraft, is necessary to quiet vibration enough for good photography. Shock mounting is different from quieting. The statement, "We get better results without isolators," is actually, "We get better results without shock mounts." Linear spring-mass-viscously damped systems are best, and if used, their performance can be easily predicted as shown. Passive vibration isolation systems will handle most problems. (Consider the automobile, the most successful example.) What is needed is a very low frequency suspension with good viscous damping in all directions that isolates and damps all six degrees of freedom, three rotational and three linear.
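The behavior of such a linear spring-mass-viscously-damped isolator can be sketched with the classical single-degree-of-freedom transmissibility formula (an illustrative sketch, not taken from the paper; the damping ratio and frequency values below are arbitrary assumptions):

```python
import math

def transmissibility(freq_ratio, damping_ratio):
    """Absolute transmissibility of a linear spring-mass-damper isolator.

    freq_ratio    r = f / f_n (excitation frequency over natural frequency)
    damping_ratio zeta (fraction of critical viscous damping)
    """
    r, z = freq_ratio, damping_ratio
    num = 1.0 + (2.0 * z * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * z * r) ** 2
    return math.sqrt(num / den)

# Isolation (T < 1) begins only above r = sqrt(2); this is why a very
# low natural frequency suspension is wanted: it pushes the aircraft's
# vibration spectrum well into the isolation region.
for r in (0.5, 1.0, math.sqrt(2), 4.0):
    print(f"r = {r:4.2f}  T = {transmissibility(r, 0.2):6.3f}")
```

Note that at r = sqrt(2) the transmissibility is exactly 1 regardless of damping, which is what makes the suspension's natural frequency, not its damping, the first design choice.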
Tactical aerial photographic reconnaissance requires the acquisition of high acuity photography under adverse environmental conditions. In cameras with medium to long focal length lenses, large focus errors resulting from uncontrolled pressure and temperature variations have been experienced, with loss of image sharpness and resolution. An autofocusing system has been developed at CAI that uses a photoconductive cell to detect image contrast as a measure of "out-of-focus." The system was built into an 18-inch focal length f/4 panoramic camera, where it demonstrated the ability to maintain the image focus within 0.001 inch of the film plane for a large range of varying temperatures and pressures. Inflight test data showed improved photointerpretation; ground target detection was increased by 30 percent. This automatic focusing method results in significant improvements in sensor acuity through near optimum focus under all conditions of transient wavefront quality of the optics and the imaging environment.
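The contrast-maximizing principle behind such an autofocus loop can be illustrated with a simple sweep-and-peak search (a hypothetical sketch, not CAI's implementation; the contrast metric here is plain intensity variance, and the defocus model is invented):

```python
def contrast(pixels):
    """Image contrast metric: variance of pixel intensities. A
    photoconductive cell sensing image contrast behaves analogously:
    its output peaks when the image is sharpest."""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def best_focus(positions, image_at):
    """Sweep candidate focus positions and keep the one whose image
    shows maximum contrast, i.e. minimum defocus."""
    return max(positions, key=lambda pos: contrast(image_at(pos)))

# Hypothetical defocus model: blur washes out a bar-target image,
# with true focus at position 0.0.
def image_at(pos):
    amplitude = 1.0 / (1.0 + 10.0 * abs(pos))
    return [amplitude * (1 if i % 2 else -1) for i in range(32)]

positions = [i / 1000.0 for i in range(-5, 6)]   # +/- 0.005 inch sweep
print(best_focus(positions, image_at))            # -> 0.0
```

A real closed-loop system would dither the cell about the current focus position rather than sweep the whole range, but the figure of merit, peak image contrast, is the same.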
Technological and operational improvements have steadily reduced the effective reaction time in reconnaissance scenarios, making real-time capability virtually mandatory in many reconnaissance missions. A realistic assessment of functional advantages and limitations provides perspective to promote matching of real-time imager characteristics to applications. An overview in the context of operational and utilizational philosophies discusses the pertinent system factors such as mission, performance, sensors, information transmission, image processing, displays and human factors. Additional aspects such as the man-in-the-loop problem and onboard data selection and recording are considered in terms of effectiveness, complexity and cost.
A basic TV Viewfinder for use with conventional aerial reconnaissance systems is described. At the heart of the system is a CCD solid state sensor. Other requisite and optional camera functions together with a modular electronics approach are outlined. In the system design, as much attention has been given to the human limitations aspects as to the technical details of the camera sensing system and the display. Vehicle V/H rates play a major role in determining the extent of information that is intelligible to the viewer in this type of real time electro-optical system. Depression angle considerations likewise limit or extend the operational capabilities of the electro-optical/human system complex. Photometric properties of the sensor in conjunction with geographical locations determine the operational day. Additionally, an optical low light capability can be selected in the viewfinder camera where the basic zoom optics can be replaced with a fixed focal length large aperture lens. At the same time, the effective exposure time (per frame) can be increased over that used under normal daylight conditions. A comparison has been made between the daylight and low light operation of the viewfinder and a typical film camera.
A nomogram has been developed which permits long range television system trade-offs to be performed quickly and easily. All important scenario, optical, and sensor parameters have been included. The reduction in image contrast and faceplate illuminance due to atmospheric effects, and the relationship between the solar spectrum and the photometric units used by tube manufacturers, have been accounted for. Not all of these relationships have been included in existing nomograms.
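One of the relationships such a nomogram encodes, the loss of apparent target contrast with range through the atmosphere, can be sketched with Koschmieder's law (a simplified model assuming a horizon-sky background; not the nomogram's exact formulation, and the example numbers are arbitrary):

```python
import math

def apparent_contrast(c0, slant_range_km, visibility_km):
    """Apparent contrast at the sensor after atmospheric attenuation.

    Koschmieder: C(R) = C0 * exp(-3.912 * R / V), where V is the
    meteorological visibility, defined as the range at which contrast
    falls to 2 percent of its inherent value.
    """
    return c0 * math.exp(-3.912 * slant_range_km / visibility_km)

# Example: a target of inherent contrast 0.8 viewed at 10 km slant
# range in 20 km visibility.
print(round(apparent_contrast(0.8, 10.0, 20.0), 3))
```

The 3.912 constant is simply -ln(0.02); setting the slant range equal to the visibility recovers the 2 percent threshold by construction.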
This paper discusses two infrared sensors currently used by the Navy in the far infrared spectrum. Spectral region selection, description of the sensors, system operational concepts, sensor application, sensor selection and potential future sensor capabilities are described.
Synthetic aperture radars are now operational in both military and commercial applications. They provide detailed images of terrain, regardless of cloud cover and natural illumination, and in an electromagnetic spectrum not exploited by other high-resolution sensors. Because synthetic aperture radar is a microwave range-measuring sensor, the imagery is basically a two-dimensional record of the microwave backscattering strength of the terrain surface and objects placed on it. The two dimensions are distance along and from the flightpath of the aircraft carrying the radar. Elevation displacements parallel those of optics but are different in magnitude and direction, particularly from objects with arbitrarily oriented surfaces. Longer wavelengths than those used in optical and infrared sensors cause more specular reflection effects in radar imagery. Exploitation begins by understanding these unique image characteristics and applying them to recognizing or inferring the nature of the objects and surfaces which scattered or reflected the radar illuminating energy back to the sensor. Special techniques used in radar image exploitation include detection theory, digital and optical image formation, wide dynamic range displays through color and hologram viewing, shadow analysis, stereo viewing, dichotomous keys, change detection, synergistic fusion with other sensor data, and the application of earth sciences fundamentals.
The continuous need for aerial reconnaissance in its many forms has engendered support for studies to screen, process, and interpret the reconnaissance information. Some of these investigations deal with coherent optical signal processing approaches. In this paper, the authors describe the Grumman developed Optical Matched Filter Image Correlation (OMFIC) system. This automatic high speed system operates by processing photographic or other forms of imagery through multiple holographic memories of high density and effecting correlation between the input and stored imagery. The sensitivities of the matched filter to various target parameters were determined in this study, and they form, in general, the basis for establishing the memory. Three typical terrain models were processed for correlation and signal-to-clutter ratio. Estimates are given regarding processing speeds and their relationship to photointerpretation processing time.
Microwave radiometers are the only sensors capable of passive, adverse weather, day-night operation. With operating principles similar to infrared (IR) sensors, microwave radiometers can produce imagery superior to that of high resolution radars on an equal resolution basis. They also provide a low cost, small alternative to radar which can give pod systems, remotely piloted vehicles (RPV), and aircraft an adverse weather capability where the size, cost and complexity of a radar system producing the same quality of imagery could not be accommodated. The phenomenology associated with microwave radiometric (MICRAD) imagery provides a different measurement of target characteristics and thus, increased target identification capability when utilized in multisensor systems. However, MICRAD systems of acceptable performance have only recently been developed.
Webster defines technique as "the manner of performance with respect to mechanical features or formal requirements." Unconventional sensors have impacted both the mechanical data handling of imagery and the formal requirements for exploiting each nonconventional sensor for military purposes. New exploitation techniques incorporating digital processing, displays, and associated hardware development have demonstrated unconventional sensor information content comparable to what can be derived by the interpreter with the conventional light table and viewing optics. The optimal mix of optical and digital processing in the future will be governed by two major factors: 1) the best method of extracting information from a given imaging sensor, i.e. radar, infrared or electro-optical, which in turn is based upon ease of manipulation, cost and practicality, and 2) the requirement for information in near-real-time.
FLIR systems have evolved since 1964 to the point that they represent a proven extension of man's sensory perception. Understanding of the technologies relevant to this sensory extension has also developed to an advanced point. For each generic application of thermal imaging there are peculiar tradeoffs relative to detectors, image formation, signal processing, and display. Civil aviation, law enforcement and resource management, among others, suggest applications of modern FLIR equipment. Improved performance and increased reliability have made feasible new applications of thermal imaging. Cost, however, remains to be reduced before any of these new systems will find wide acceptance. Techniques do exist to significantly reduce FLIR cost without sacrificing performance. All of this will be discussed and FLIR imagery will be shown to validate many points.
Microwave radiometers are the only sensors capable of adverse weather, day-night operation. Operating on principles similar to those of infrared (IR) sensors, microwave radiometers can produce imagery superior to that of high resolution radars on an equal resolution basis due to the absence of clutter and scintillation. They also represent a low cost, small alternative to high resolution radars that can provide pod systems, remotely piloted vehicles, and aircraft with adverse weather capability where the size, cost and complexity of a radar system providing the same quality of imagery cannot be accommodated. In addition, the phenomenology associated with microwave radiometric (MICRAD) imaging lies between that of IR and radar. Thus, responding to different material and target characteristics, it provides new dimensions to the interpretation of multisensor systems and increased target identification capability.
Itek recently completed the first of a new 66-inch f/4 camera system for long range oblique photography, the KA-102A, a version of Itek's earlier 66-inch f/4 cameras. The operating range is from 15 to 60 miles or more. The KA-102A is designed to be carried in a pod on a fighter aircraft. The pod assembly is 166 inches long with a 22-inch diameter over most of its body; its weight is 1,550 pounds. The lens covers a 4-1/2- by 4-1/2-inch format (3.9 degrees side to side); the system can use any one of a number of five-inch format frame camera backs. The camera covers 22 degrees on either side of the aircraft in six steps, with depression angles of 3 degrees to 25 degrees. The camera is aimed remotely from the cockpit, and is capable of being programmed (by a simple turn of a switch on the control panel) to provide multiple frame width coverage. Therefore, when operating in the triple-width mode the camera will cover a swath of over 11 degrees by several hundred miles. The camera carries up to 1,000 feet of 2.5-mil base film (type 3414 or 3400) and has built-in temperature control, AEC, and focus control.
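The quoted 3.9-degree field of view follows directly from the 66-inch focal length and 4-1/2-inch format; a quick check with simple pinhole geometry (values taken from the abstract, the 60-mile footprint example is our own illustration):

```python
import math

def fov_degrees(format_width_in, focal_length_in):
    """Full angular field of view across a format: 2 * atan(half-format / f)."""
    return 2.0 * math.degrees(math.atan(format_width_in / 2.0 / focal_length_in))

fov = fov_degrees(4.5, 66.0)
print(round(fov, 1))  # side-to-side coverage of the 66-inch lens

# Approximate ground footprint width at the 60-mile end of the
# operating range (flat-earth, broadside geometry).
slant_ft = 60 * 5280
ground_ft = 2 * slant_ft * math.tan(math.radians(fov / 2))
print(round(ground_ft))
```

The narrow field is the price of the long focal length; the six-step scan across 22 degrees on each side is what restores useful cross-track coverage.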
This paper presents a brief functional description of an advanced development RF-4C reconnaissance pod system. This system employs a high resolution return beam vidicon in a "snapshot" TV mode to acquire and present, via video data link, high quality near real time imagery at a remote ground site. Flight test measurements, instrumentation, and system performance are presented in the form of analytical results and selected imagery. The effectiveness of system operating modes in compensating for weather related image degradation is discussed. Sensor test bed motion and temperature results of the environmental instrumentation are presented as a function of flight profile. The importance of additional pod sway braces and proper auto-pilot operation is demonstrated. In spite of an expected shift of pod resonance from the design, and larger airframe imparted motions than indicated from previous data, no significant reduction in sensor resolving power was caused by payload motion.
The long wavelength radar technology, as used in IMFRAD, was originally developed as a means of performing concealed target detection of tactical targets. Recent investigations show that the technology has applications over a wide spectrum of target cueing, strike control and wide area surveillance problems. The capability of the sensor to see through foliage enhances the ability to perform these functions. Requirements for a system to fulfill one such application will be discussed.
The Air Force Avionics Laboratory has developed a night photo system designated as the KS-126A. The KS-126A was designed for the weight, power and size constraints associated with RPV aircraft. The system consists of a pulsed illuminator synchronized to a single 70mm camera. The camera is unique in two respects: (1) it uses one tube; (2) it makes use of a rotating carousel or turntable having three prisms, each one providing a 40° x 40° view of the ground. The system was delivered in an RF-4C centerline pod so that flight testing could be effectively accomplished by a controllable high performance aircraft. Results of experiments to simulate the RPV flight environment in a ground dynamic analyzer and the RF-4C centerline pod will be discussed. Also, the flight test results and KS-126/BGM-34C vehicle interface problems will be addressed.
This paper describes the recent results of flight tests of the KS-128A Night Photo System. The KS-128A is an electronic flash equipped, low altitude camera system designed to replace the pyrotechnic cartridges currently used on the RF-4C aircraft. The system consists of two KS-87B cameras equipped with 6 inch f/1.5, near infrared (NIR) lenses and a centerline pod mounted (NIR) electronic flash subsystem. The performance of the system in its initial flight tests and evaluation was comparable to that of the daylight cameras on the RF-4C. Subsequent to the initial evaluation, one camera was modified by the addition of a small NIR light in front of the lens to produce "Concurrent Photo Amplification (CPA)." Recent flight tests indicate that CPA doubles the performance of the system at low energy levels. These results show that a CPA equipped night photo system with the performance of the KS-128A would need an illuminator only one-half the size of the present subsystem. Similarly, this reduction in illuminator requirements makes conventional flash photography from relatively light aircraft for civilian surveillance purposes practical. Examples of test photography will be presented.
An Artillery Launched TV Target Location System will provide real time observation of distant battlefield targets using a parachute deployed all solid state TV camera and RF Link. The camera uses a Fairchild developed Charge Coupled Device (CCD) imager. The system makes use of the major components of the M485-A2 illuminating round for the 155 mm gun, in which the illuminant is replaced with a ballistically matched package comprising the CCD TV camera, battery, RF transmitter, and antenna. The TV pictures are displayed in real time and simultaneously recorded on a video tape recorder. The use of a CCD TV camera makes it practical, for the first time, for this type of system to survive set-back acceleration when the shell is fired.
The KA-99 ( ) Panoramic Camera was developed in 1974 under contract to the U.S. Navy to provide a low/medium altitude aerial reconnaissance camera to meet the need for a versatile low-cost high-performance tactical photographic system. It provides full horizon-to-horizon panoramic photography over the broad V/H mission envelope required for current high performance aircraft. It was designed specifically for installation in a pod but is also compatible with conventional tactical aircraft installations. It provides high acuity imagery on five inch wide aerial film while operating at altitudes between 500 feet and 12,000 feet. A prototype camera has been successfully flight tested in a pod developed by NADC, Warminster, Pennsylvania.
To meet Army requirements for a more efficient method of obtaining terrain and environmental data, the U. S. Army Engineer Topographic Laboratories (ETL) designed and fabricated a unique multispectral aerial camera. This experimental camera employs a single lens, a beam dispersing system, and dichroic filters to divide the reflected light energy into four broad spectral bands: blue, green, red, and near infrared. The resulting four spectrally separated images can be reconstituted into a number of displays using a four-channel, optical, additive color viewer. Over the past three years, the camera has been flown over the ETL remote sensor test areas to compare this system with conventional types of aerial photography for acquisition of MGI. During the test, multispectral imagery was obtained simultaneously with color, color IR, color negative, and panchromatic emulsions. In addition, ground data were also acquired during the overflight to ascertain the validity of imagery derived data. The initial results of the comparison indicate that the multispectral imagery provides the image interpreter with a highly flexible imaging system and a number of advantages over conventional emulsions. The advantages include better determination of drainage channels, better shoreline delineation, and better recognition of different vegetation species.
The increasing use of pods to add supplemental reconnaissance capability to attack or other aircraft imposes new constraints on sensor design. Available space, never great even in dedicated reconnaissance aircraft, is extremely limited. Environmental ranges and vibration levels may be more hostile in pods. Ease of maintenance, an essential requirement in any system, can be complicated in pod installations by the lack of space and a less than convenient location. To meet these new requirements, CAI developed the KA-93, a self-contained, extremely compact 24-inch focal length, prism-scanning panoramic camera offering high resolution on 5-inch format film, with sector scan, forward motion compensation, autofocus, and roll stabilization. Scan prisms are fabricated using a newly developed bonding technique that eliminates the temperature sensitivity traditionally associated with prism panoramic cameras. This feature, coupled with a fully automatic focusing mechanism, ensures maximum recorded resolution under all conditions. This camera offers a reconnaissance capability previously unavailable. In addition to having been flown in remotely piloted vehicles (RPV's), pods, and dedicated manned aircraft, the camera has demonstrated high reliability and ease of maintenance under day-to-day field/flight conditions.
A computer program system has been developed for predicting performance of a CCD strip mode aerial reconnaissance camera. Both static and dynamic performance results are available from the computer output. The input data include optical characteristics, defocus effects, solid state sensor geometry, image motion due to the basic forward vehicle rate, forward and side motion compensation errors, and camera forward and side oblique viewing angles. Output data from the computer include: image coordinates, slant range, image velocities at the image plane in two axes (due to forward motion and the vehicle attitudes and motions), electronic data rates, and resolution performance (both at the image plane and equivalent ground resolution). An application of the program to sample reconnaissance problems is given together with typical results and analyses.
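The core kinematic relations such a program evaluates can be sketched for the simplest vertical-photography case (an illustrative reconstruction, not the program itself; the lens, speed, and altitude figures below are assumed):

```python
import math

def image_velocity_mm_per_s(focal_mm, v_over_h_per_s):
    """Focal-plane image velocity for vertical photography: v = f * (V/H)."""
    return focal_mm * v_over_h_per_s

def slant_range_m(altitude_m, depression_deg):
    """Slant range to a point seen at the given depression angle below horizontal."""
    return altitude_m / math.sin(math.radians(depression_deg))

# Assumed example: 6-inch (152.4 mm) lens, 250 m/s at 1500 m altitude.
v_h = 250.0 / 1500.0                          # V/H rate, rad/s
v_img = image_velocity_mm_per_s(152.4, v_h)   # mm/s at the focal plane
smear_um = v_img * 0.001 * 1000.0             # smear over a 1 ms exposure, micrometres
print(round(v_img, 2), round(smear_um, 2))
print(round(slant_range_m(1500.0, 30.0), 1))  # forward-oblique slant range
```

The uncompensated smear is what forward motion compensation must remove; the program described above extends the same geometry to oblique viewing angles and to attitude-rate contributions in both image axes.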