There is still a desperate need for reconnaissance support to the tactical air and ground commander, despite increased skepticism about the responsiveness of tactical reconnaissance. For tactical reconnaissance to survive, however, it must be responsive to today's fluid and dynamic battle situation. There are three roles for tactical reconnaissance: tactical surveillance, conventional reconnaissance, and integrated strike reconnaissance. All three of these roles are vital to the successful use of limited battlefield resources. The generation of hundreds of thousands of feet of celluloid will win neither battles nor friends for reconnaissance. The integration of all reconnaissance information and its timely dissemination to the primary user of that information is a core element of the entire tactical reconnaissance picture. Without the rapid communication of meaningful information to the tactical commander, tactical reconnaissance is meaningless. Tactical reconnaissance can generate a dynamic change in the application of, and the ability to fight with, the forces at our disposal. It can provide invaluable time for preparation, information for the allocation of reserves, real-time strike allocation and utilization, and real-time anti-air threat data. Tactical reconnaissance has the potential and the equipment to be a force multiplier.
The Air Force will soon initiate concept definition for a penetrating tactical reconnaissance system to fulfill the combat information needs of tactical commanders. Such a development has been considered over the past ten years but never begun because of deficiencies, both real and perceived, associated with traditional tactical reconnaissance systems. Technologies need to be exploited for this system that will provide near-real-time, all-weather information from a deployable and survivable platform while minimizing supportability problems.
An important role for reconnaissance is the location and identification of targets in real time. Current technology has been compartmented into sensors, automatic target recognizers, data links, ground exploitation and, finally, dissemination. In the days of bring-home film reconnaissance, this segmentation of functions was appropriate. With the current emphasis on real-time decision making from the outputs of high-resolution sensors, this thinking has to be re-analyzed. A total systems approach to data management must be employed, using the constraints imposed by technology as well as the atmosphere, survivable flight profiles, and the human workload. This paper analyzes the tasks from target acquisition through exploitation and discusses the applicable current advanced development technology. A philosophy of processing data into information as early as possible in the data handling chain is examined in the context of ground exploitation and dissemination needs. Examples of how the various real-time sensors (screeners and processors), jam-resistant data links, and near-real-time ground data handling systems fit into this scenario are discussed. Specific DoD programs are used to illustrate the credibility of this integrated approach.
The reconnaissance systems of the year 2000 and beyond may be merely an extension of current technology or may utilize bold new technology and concepts still in the embryonic stages. The five basic reconnaissance mission stages (collection, processing, interpretation, reporting, and dissemination) are reviewed in terms of the potential application of new and emerging technology such as high-density multispectral focal plane arrays, new radar techniques, VLSI/VHSIC computational resources, artificial intelligence, multisensor integration, pattern and target recognition, image compression, advanced display and targeting techniques, and even new fields not thought of as exact sciences today. The application of these technologies is viewed in the context of the reconnaissance missions: targeting, damage assessment, order of battle assessment, terrain evaluation, and planning. The traditional needs for varying levels of detail and timeliness of reconnaissance data are shown to be largely removed by the use of the most advanced and highest-development-risk systems. Lower-development-risk systems show excellent capabilities, with the potential for high capability at low cost. New fields may totally change or even eliminate reconnaissance as we know it today.
Future non-nuclear warfare scenarios emphasize rapidly developing combat situations in which 24-hour, adverse-weather operation will be mandatory. Infrared sensors will continue to play a major role. Airborne intelligence is divided into three major types: reconnaissance, surveillance, and strike/strike assessment. These tasks provide a continuing important role for infrared line scan (IRLS) systems, since a single sensor can satisfy the requirements of each intelligence type. Near-term (to 1985), mid-term (to 1990), and far-term (to 1995 and beyond) development trends are briefly surveyed to show how IRLS systems are evolving to meet the new requirements of lower altitude, higher V/H, oblique viewing, rapid film interpretation, bandwidth compression, data linking, target screening, and synergistic combinations with laser scanners and microwave scanners. Technology trends are noted, and it is shown how these may be exploited in new sensor systems.
This paper discusses RADC/USAF research and development activities in the areas of image processing and target identification relating to the identification and location of ground targets in a tactical scenario. The primary objective is to reduce the reconnaissance cycle from days and hours to minutes and seconds, commensurate with the near-real-time (NRT) intelligence requirements of the tactical forces. This NRT intelligence is essential to support strikes against increasingly mobile enemy weapon systems. In currently fielded systems, exploitation of film-based reconnaissance is extremely slow, greatly lagging collection rates. There are three essential elements to an NRT tactical intelligence system: NRT imagery collection, an NRT air-to-ground image data link, and NRT imagery exploitation. The thrust of this paper is the technology required to support the last of these, NRT imagery exploitation. Technology-intensive efforts are categorized under each of the exploitation elements (target detection, identification, and precision location). R&D in the area of target detection consists of exploratory and advanced development work units in automated target correlation, automatic change detection, and pipeline image processing for screening probable target areas. Target identification R&D to be presented includes automatic techniques for pattern recognition as well as semi-automated techniques for aiding an analyst by correlating various sensor and intelligence inputs to permit target identification. Near-real-time precision target location techniques include techniques for locating imaged targets in a predefined precision photographic data base as well as techniques for performing location simultaneously with target identification. The summary and conclusions discuss the commonality aspects of a digital image exploitation system relating to a "universal" image exploitation system and the potential for its use in NATO. Potential areas for developing cooperative R&D programs in support of this universal-system objective are identified.
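As a rough illustration of the automatic change detection element mentioned above, the toy sketch below flags pixels that differ between two registered images; the threshold and preprocessing are assumptions of the editor, not the RADC designs.

    import numpy as np

    def change_mask(old_image, new_image, threshold=25):
        """Flag pixels whose brightness changed by more than `threshold` counts.

        Illustrative only: assumes the two images are already registered.
        """
        diff = np.abs(new_image.astype(int) - old_image.astype(int))
        return diff > threshold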
This paper presents the results of a study effort to define and specify an architecture for the purpose of ingesting, storing, exploiting, and verifying reconnaissance imagery. The defined architecture consists of four functional modules: Sensor Input Module (SIM), Storage and Retrieval Module (S/RM), Real-Time Processing Module (RTPM), and Near Real-Time Exploitation Module (NRTEM). These functional modules and the associated hardware subsystems will perform the ingest, handling, and display of imagery from the required sensor data types.
Reconnaissance systems and sensors have gone well beyond the conventional concept of flying a film camera over the terrain. They now utilize some very unusual techniques of high-performance, electronically generated imagery, placing correspondingly large demands on the image generators that must produce a replica of the terrain sensed by these systems. This paper describes highlights of recent high-performance image generators for reconnaissance applications and how some of these requirements have been satisfied.
In its continuing development of electro-optical systems for reconnaissance and surveillance, Itek Optical Systems has produced a rugged, small, and inexpensive sensor for a variety of applications, including throw-away missions. The basic sensing element of the Model 2KL sensor is a derivative of the 2,048 x 96-element time delay and integrate (TDI) charge coupled device (CCD) described in SPIE Volume 175. The sensing device is designed into a small package that features high-resolution digital processing, flexible output data format, and compatibility with a wide range of interchangeable objective lenses.
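As a point of reference for how a time delay and integrate array builds signal, the conceptual sketch below sums the stage readouts that view the same ground line; it is an illustration only and not the Model 2KL electronics.

    import numpy as np

    def tdi_line(stage_samples):
        """Sum the stage readouts that correspond to one ground line.

        stage_samples: array of shape (96, 2048), one row per TDI stage.
        Signal-to-noise improves roughly as sqrt(96) over a single stage.
        """
        return stage_samples.sum(axis=0)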
The increased need for smaller and more versatile aerial reconnaissance equipment has led to the use of the KS-116 camera in the RF-4B aircraft and the KA-95 in the RF-5E. Both cameras use a 12-inch fl, f/4.0 lens and a 5-inch film transport. The main difference between the two cameras is their configuration. This paper describes the features of the two cameras, including selectable scan angle, forward motion compensation (FMC), roll stabilization, exposure control, unique packaging differences, and focus correction. The inherent packaging flexibility of the prism pan camera and the availability of key modules have led to multiple configurations, of which the KS-116 and KA-95 are two examples.
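For readers unfamiliar with forward motion compensation, the sketch below applies the standard image-velocity relation v = f(V/H); the numbers are illustrative assumptions, not the KS-116/KA-95 control law.

    def fmc_image_rate(focal_length_in, ground_speed_ft_s, altitude_ft):
        """Image velocity at the focal plane, in inches per second: v = f * (V / H)."""
        return focal_length_in * ground_speed_ft_s / altitude_ft

    # Example: a 12-inch lens at about 810 ft/s ground speed and 2,000 ft AGL
    # requires roughly 4.9 in/s of film or image motion to freeze the scene.
    print(fmc_image_rate(12.0, 810.0, 2000.0))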
The performance expected from Long Range Oblique Photography (LOROP) sensors will continue to increase in the years ahead. Diffraction-limited optics and advanced subsystems are required to provide maximum resolution at the high altitude and long standoff distances which characterize this mission. This paper describes the KS-146A LOROP camera system which has been developed for this specialized role. Optimized for use with high-definition EK 3412 and 3414 films, the camera uses a seven-element, 1676.4-mm (66-inch) fl, f/5.6 lens in conjunction with a two-axis, gyro-stabilized scan head and passive isolation system. These components complement one another and attenuate both high and low frequency vibration inputs which can produce image-degrading smear. A closed-loop autofocus system is incorporated to compensate for focus shift resulting from temperature and pressure changes, while a self-contained thermal system maintains lens presoak temperature to eliminate thermal gradients. Although primarily intended for use in a modified centerline fuel pod, modular construction and microprocessor control allow alternative configurations, aircraft interfaces and the incorporation of E-O capability. Details of the camera development, installation within the pod and static and dynamic performance are presented. Predicted airborne performance is also analyzed. The features of the KS-146A result in a flexible system designed for the environment in which LOROP cameras must operate.
Fourier transform techniques have been applied to measure image velocity and position. These patented techniques have been incorporated into airborne image motion compensation sensors, helicopter-borne image stabilizers, and TV trackers for tank and helicopter fire control. Microradian tracking accuracy at less than 5-percent contrast has been demonstrated with simpler and less costly hardware than required with correlation techniques.
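The patented techniques themselves are not described in the abstract; as a generic illustration of Fourier-transform displacement measurement, a minimal phase-correlation sketch follows (the tolerances and conventions are assumptions).

    import numpy as np

    def frame_shift(frame_a, frame_b):
        """Estimate the integer-pixel displacement between two frames by phase correlation."""
        cross = np.fft.fft2(frame_a) * np.conj(np.fft.fft2(frame_b))
        cross /= np.abs(cross) + 1e-12           # keep only the phase information
        corr = np.fft.ifft2(cross).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks beyond the half-way point correspond to negative shifts
        return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

Dividing the measured shift by the inter-frame time gives an image velocity estimate of the kind used for motion compensation and tracking.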
A simple, low-cost video bandwidth compressor has been developed. The design provides for interface to both interlaced and non-interlaced TV and FLIR sensors. Data rate reduction is obtained through the use of several methods, including the hybrid DCT/DPCM algorithm, frame rate reduction, resolution reduction, and image truncation. The overall data rate reduction is up to 1000:1, and the picture quality is in excess of 36 dB at 2 bits per pixel.
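The fielded compressor is not detailed in the abstract; the sketch below assumes one common reading of a hybrid DCT/DPCM scheme, in which the DCT coefficients of each video line are differenced against the previous line before quantization, so that static scene content costs very few bits.

    import numpy as np
    from scipy.fft import dct

    def encode_line(line, prev_coeffs, step=8.0):
        """DCT a video line, then DPCM-code its coefficients against the previous line.

        Illustrative only: block size, quantizer step and the 1000:1 figure in the
        abstract are not reproduced here.
        """
        coeffs = dct(line.astype(float), norm="ortho")
        residual = np.round((coeffs - prev_coeffs) / step)   # quantized difference to transmit
        reference = prev_coeffs + residual * step            # decoder's reconstruction
        return residual, reference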
This paper discusses the problems of exploiting imagery, suggests the Multi-Imagery Exploitation System (MIES) concept as a solution, lists the external technologies that are needed for MIES development, and casts MIES into an operational context via a description of the strike/attack mission cycle. The paper focuses on determining requirements for target detection, classification, identification, and location. The approach establishes a framework for the interplay of the study elements involved. Key to the definition of requirements is the selection of representative target groups from a scenario governed by the tempo of operations and the target distribution that could be expected in a NATO/Warsaw Pact conflict in Central Europe. The operational utility of MIES is then described in terms of battle management functions, air operations, and targeting. Implications are then drawn with respect to desirable directions for MIES development.
In the past ten years there have been very promising advances in the state of the art in aerial reconnaissance, both in airborne systems and in ground data handling. However, the physical-world constraints on resolution still make the conventional film camera the prime sensor in aerial reconnaissance. The ground resolution and working scale required for specific items of information are fixed. The electro-optical system, IR system, radar system, or photographic system must meet the image data requirements in order to be useful. Detector physical dimensions and phosphor minimum grain size limit the system resolution element size. Information theory describes the limits on the resolution elements required for various levels of detection, recognition, and identification. The achieved system performance includes all factors which affect resolution when the system is flown. Information theory accounts for the difference between in-flight resolution and laboratory resolution. Computer availability has facilitated polar-plot depictions of airborne system performance that permit lens aberration effects and window thermal gradients to be studied, as well as system resolution performance in terms of system parameters. The very large information content and the immediate interpretability inherent in film sensors make conventional cameras the prime clear-daylight sensor for now and for the projected next ten-year period.
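The abstract does not give its specific figures; as an illustration of resolution elements required for detection, recognition, and identification, the sketch below applies the commonly cited Johnson-criteria cycle counts, which are assumptions here rather than the paper's values.

    JOHNSON_CYCLES = {"detection": 1.0, "recognition": 4.0, "identification": 6.4}

    def required_ground_resolution_m(critical_dim_m, task):
        """Ground-resolved distance needed: target critical dimension / (2 * cycles)."""
        return critical_dim_m / (2.0 * JOHNSON_CYCLES[task])

    # Example: recognizing a 2.3 m wide vehicle calls for roughly 0.3 m resolution.
    print(round(required_ground_resolution_m(2.3, "recognition"), 2))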
In a previous paper, "Effects of Flow Field Aerodynamics on Imaging Sensor Resolution" (SPIE, 1977), it was shown that various flow field phenomena can degrade sensor performance by causing air density variations. These phenomena were identified (Figure 1) and the degradation quantified (Figure 2). Simple aerodynamic shapes and empirical data were used to establish the relative importance of each type of flow. It was concluded that the mainstream flow field and the separated boundary layer are the major influences on sensor performance. Separated flow regions can be eliminated through proper vehicle design (elimination of abrupt mold line changes). This leaves mainstream flow as the critical phenomenon, especially for large-aperture sensors (Figure 2). Computer analysis techniques are now available that can accurately model mainstream flow. The program called AFTEND computes the vehicle flow field in three dimensions from the freestream to the vehicle surface, including the turbulent boundary layer. The location and strength of local shocks are also identified. Details of this analytic technique will be available later this year in a paper authored by Ray Cosner of McDonnell Aircraft Company (MCAIR) (Reference 1). The following paragraphs apply this program and present results for a typical reconnaissance aircraft.
Optical and digital approaches to 2-D image processors each have their unique advantages and limitations. Hybrid systems have been proposed to utilize the best of both. An equally important manner in which the two approaches can complement each other is through a digital simulation of the optical system. This paper reviews some of the problems of matched filter correlator applications and illustrates how valid models for simulations can be generated. Comparisons of the digital simulation to the analog approach are made for detection problems using specific targets and terrains. The relative merits of each technique as well as ways in which the digital simulation may be used to assist in the optimal design of the analog system are discussed.
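As a minimal digital stand-in for the optical matched-filter correlator being simulated, the sketch below performs classical frequency-domain matched filtering; it is an assumption of the general approach, not the paper's simulation model.

    import numpy as np

    def matched_filter_correlate(scene, reference):
        """Correlate a scene with a zero-padded reference template in the frequency domain."""
        template = np.zeros_like(scene, dtype=float)
        template[:reference.shape[0], :reference.shape[1]] = reference
        spectrum = np.fft.fft2(scene) * np.conj(np.fft.fft2(template))  # matched filter
        return np.fft.ifft2(spectrum).real   # the correlation peak marks the candidate location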
Experiments were conducted using image dynamics which simulated a ground-stabilized narrow FOV sensor to examine the effects of IR "Hot Spots" on target acquisition performance. Subjects were required to detect and recognize vehicle targets situated in backgrounds that varied in complexity. Displayed target signatures were representative of those associated with FLIR or TV imagery. Several performance and stimulus imagery measurements were recorded and preliminary regression analyses were performed on the results. Luminance distributions within the vehicle target and the background were important cues for both detection and recognition.
The success of an airborne reconnaissance mission depends in particular on shortening the time required to perform each phase.
The purpose of this paper is to present a short definition of a MIL-STD-782 code block, to introduce the advantage of using a code block, to discuss the generation of a code block, and to identify two problems that have limited the widespread use of code block data annotation: the rate of information retrieval throughput, and the error rate associated with this information retrieval. Finally, an automatic code block reader, useful in alleviating these two problems, is identified. With the introduction of the automatic reader, the use of the code block, long held in its infancy, can grow to full maturity.
This paper examines depth resolution and convergence limitations for very small through very large convergence angles. Conclusions are derived from a combination of theory and parametric stereo photographic and viewing experiments. The relationships of the limited bandwidth of about 25 bits per second through the optic nerve and the short-term image data buildup, storage, and congruency-difference stereo processing by the brain are discussed. This explains, among other things, one-eyed stereo. It is theorized, and experimentally supported, that maximum stereo convergence is limited by retinal disparity tolerance, and this is limited over that area of the fovea where a single neuron (wire) connects each cone (optical sensor) to the brain. Formulas are derived and presented which apply this work to stereo photography and its viewing.
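The paper's derived formulas are not reproduced in the abstract; for orientation, the sketch below evaluates the standard small-angle parallax relation for stereo depth resolution, which is assumed here rather than taken from the paper.

    def depth_resolution_m(range_m, baseline_m, angular_resolution_rad):
        """Small-angle approximation: dz ~ z**2 * d(theta) / B."""
        return range_m ** 2 * angular_resolution_rad / baseline_m

    # Example: 1 km range, 100 m photo base, 20 microradian disparity resolution -> ~0.2 m
    print(depth_resolution_m(1000.0, 100.0, 20e-6))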
The comparator is a device facilitating both the delineation of features of an object via some indicating means and the determination of their relative spatial coordinates from a scale. The concept is fundamental to spatial measurement and is embodied in a broad spectrum of instruments ranging from the simple shop micrometer to sophisticated photogrammetric plotters. All compare object to scale and can be discussed in terms of mapping scale space into object space. The absolute comparator is the ideal: the mapping is one-to-one, undistorted, and uncorrupted, i.e., the mapping is error-free. In the real world, the mapping is physically embodied as the mechanical linkage between the viewing and measuring systems, so that residual bearing play and runout, plus the plastic flexibility of linkage elements, result in motion error, which manifests as mapping errors. Motion error is often the dominant error component in coordinate measurements and presents a major hurdle in state-of-the-art comparator technology. There are three approaches to the motion-error problem: reduction, compensation, and immunization. The first is a matter of mechanical refinement and finesse, while the second relies upon brute-force calibration and/or measurement over-determination. The third approach involves a concept so simple that it has been essentially overlooked despite its far-reaching implications. The theoretical considerations and the evolution of an experimental prototype comparator enjoying motion-error immunity are presented.
The increasingly stringent performance requirements for airborne sensor systems will eventually exceed the capability of conventional materials and fabrication techniques to provide satisfactory design solutions. This will occur, historically, in the areas of inertial properties, stiffness and dimensional stability. A possible solution for enhancing performance in these areas is the appropriate application of graphite/epoxy composite materials; however, certain principles of application must be observed in its use in order to achieve the optimum result. In this paper, the practical aspects of utilizing graphite/epoxy in an airborne environment are discussed and the incorporation of these principles in practical structures is illustrated with respect to optical mirror substructure, telescope and camera structure, instrument mounting platforms and ultra-lightweight radiometer reflectors.
Several methods are available for mounting cameras on strike aircraft for bomb strike damage assessment. A comparative analysis of each method may be made by relating ground coverage in mathematical form to various conditions of flight. Flight paths and bomb trajectories are also expressed mathematically. Comparison of methods is obtained by synthesis of these mathematical relations. The A-10 aircraft is used as a model for the examples in this paper. The general comparison method developed uses a simple paper-and-pencil analysis to obtain the mathematical relations, and transparent plastic overlays to perform the synthesis. A graphic solution results in which conditions can be easily altered to determine their effects on the final result. The method was developed to obtain fast estimates of results for a variety of given conditions, but it could also be used as the basis for a more sophisticated computer-based analytical package for predicting target coverage.
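The paper's graphical overlay method is not reproduced here; as a simple example of relating ground coverage to flight conditions, the sketch below computes the nadir footprint of a frame camera from altitude and field of view (the numbers are illustrative assumptions, not A-10 installation values).

    import math

    def nadir_footprint_ft(altitude_ft, fov_deg):
        """Ground distance covered across one axis of a vertically mounted frame."""
        return 2.0 * altitude_ft * math.tan(math.radians(fov_deg) / 2.0)

    # Example: a 41-degree field of view at 1,500 ft AGL covers about 1,120 ft.
    print(round(nadir_footprint_ft(1500.0, 41.0)))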
The vast quantities and high generation rates of tactical imagery require very efficient data compression in order to conserve precious bandwidth for transmission and to limit the required storage volume for archiving. This paper describes the results of efficiency and image quality comparisons for several transform image coding techniques. Specifically, the research effort was focused on developing the Singular Value Decomposition (SVD) as an approach to image compression. Detailed comparisons were made to the two-dimensional cosine transform, Hadamard transform, and Karhunen-Loève techniques.
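As a minimal illustration of SVD-based compression of an image matrix, the sketch below truncates the decomposition to a fixed rank; the rank and the storage accounting are assumptions, not the study's parameters or results.

    import numpy as np

    def svd_compress(image, rank):
        """Keep only the `rank` largest singular values of the image matrix."""
        U, s, Vt = np.linalg.svd(image.astype(float), full_matrices=False)
        # Stored values: rank * (rows + cols + 1) versus rows * cols for the raw image
        return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]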