The paper reviews future Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) concepts being explored for the Canadian Army. These concepts build upon the realities of battle-space digitisation and the opportunities of a network-centric approach. The paper presents experimentation underway to flesh out and validate these concepts. The vision of the future ISTAR capability is driven by the information requirements to support a commander's decision-making in attaining mission effectiveness. The system environment is characterized by multi-user interaction in which the exchange of real-time information and collaborative work is the norm. This common environment is well suited to support the multifunctional complexity required by the different actors within the system and the diversity of the missions that they serve.
As part of its technology development program for the Army's Future Combat System (FCS), the Defense Advanced Research Projects Agency (DARPA) has been working to develop the enabling communications technology needed to transform the Army's future land force into a network-centric force capable of operating at a level of synchronization, mobility, and force effectiveness that has been heretofore unachievable. Key to this network-centric operation will be a scalable suite of multiband wireless, mobile ad hoc networking devices which operate with directional antennas to provide significant improvements in capacity, anti-jam performance, and Low Probability of Detection (LPD). This paper reviews the underlying motivation for this technology development and summarizes the numerous technology investments being made by DARPA under their FCS Communications technology program.
The US Army's Future Combat Systems (FCS) and Objective Force will rely heavily on the use of unattended sensor networks to detect, locate and identify enemy targets in order to survive with less armor protection on the future battlefield. Successful implementation of these critical communication networks will require collecting the sensor data, processing and collating it with available intelligence, then transporting it in a format conducive to making quick and accurate command decisions based on the latest tactical situational awareness. The networked communications must support both statically deployed and mobile ground and air robotic sensors with secure, stealthy, and jam-resistant links for sensor fusion and command and control. It is envisioned for broadest application that sensor networks can be deployed in a two-tiered architecture. The architecture includes a lower sensor sub-layer consisting of mixes of acoustic, magnetic and seismic detectors and an upper sub-layer consisting of infrared or visual imagers. The upper sub-layer can be cued by the lower sub-layer and provides a gateway link to the higher-echelon tactical maneuver layer networks such as the Tactical Internet. The sensor deployments, networking constraints and reach-back distances to Command and Control (C2) nodes will be mission-scenario specific; however, the architecture will also apply to tactical unattended sensor, munition and robotic applications. Technologies from the Army Research Laboratory, the Defense Advanced Research Projects Agency (DARPA), and commercial sources will be leveraged for this effort.
The network-centric 'system-of-systems' concept popular in current defense programs has been viewed from a very functional perspective. However, the heart of such a system is going to be an embedded software infrastructure of unprecedented complexity, and the technology for developing and testing this software needs as much immediate attention as, if not more than, the concept of operations for the envisioned applications. Such an embedded software system will need to be highly scalable, modular, verifiable, and distributed, yet satisfy the myriad hard real-time performance constraints imposed by each of perhaps many different device types and service demands. It is suggested here that the only path to a robust design methodology for such systems is model-based design. Model-based embedded system design is the focus of the Model-Based Integration of Embedded Software (MoBIES) program, currently underway at the Defense Advanced Research Projects Agency (DARPA) and managed by the author. This paper motivates the model-based approach to large-scale embedded software design and explains how projects funded under MoBIES are contributing to the development of interoperable model-based design tool components. An application for such technology is provided in the context of digital flight control systems for aggressive aircraft maneuvers, which is the subject of another DARPA-sponsored program, Software-Enabled Control (SEC).
Spread spectrum communication techniques have been recognized as a viable method to gain an advantage in interference environments. Many military-oriented systems have been initiated, and some civil systems have been attempted. Spread spectrum makes it possible to hide the signal of interest below or in the noise floor, so that it is not detected. A spread spectrum system is one in which the transmitted signal is spread over a wide frequency band, much wider, in fact, than the minimum bandwidth required to transmit the information being sent. We at the Army Research Laboratory (ARL) propose using the same technique on the Internet with port hopping. The information would be transmitted in data packets over multiple ports, with the port varied on a per-packet or per-session basis. This port hopping gives the sender and the recipients the ability to take datagrams and spread them out over a multitude of ports, hiding the information among the Internet noise and allowing trusted communications between transmitter and receiver by means of the port coding sequence. There are 64K possible ports over which to span the datagrams. Spreading the transmission in this way limits the ability of a sniffer/listener to capture it. The listener will also find it difficult to mount a man-in-the-middle attack, since the data will be spread over multiple ports and only the receiver and transmitter know the specific port sequencing for the datagrams.
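A port coding sequence of the kind described above could, for example, be derived from a shared secret and a packet counter. The sketch below is illustrative only: the keyed-hash derivation, function name, and parameters are our assumptions, not ARL's published design.

```python
import hmac
import hashlib

def next_port(shared_key: bytes, counter: int) -> int:
    """Derive the port for packet number `counter` from a shared secret.

    Transmitter and receiver run the same derivation, so only endpoints
    holding the key can predict the hopping sequence across the ~64K ports.
    """
    digest = hmac.new(shared_key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
    return int.from_bytes(digest[:2], "big")  # value in 0..65535

key = b"pre-shared secret"
sequence = [next_port(key, n) for n in range(5)]
print(sequence)  # pseudo-random ports, identical on both endpoints
```

An eavesdropper without the key sees traffic scattered across ports with no apparent pattern, while both endpoints deterministically agree on the port for every packet number.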
Mobile Internet Protocol (IP) Local Area Network (LAN) is a technique, developed by the U.S. Army Research Laboratory, which allows a LAN to be IP mobile when attaching to a foreign IP-based network and using this network as a means to retain connectivity to its home network. In this paper, we describe a technique that uses Open Secure Shell (OpenSSH) software to ensure secure, encrypted transmission of a mobile LAN's network traffic. Whenever a mobile LAN, implemented with Mobile IP LAN, moves to a foreign network, its gateway (router) obtains an IP address from the new network. IP tunnels, using IP encapsulation, are then established from the gateway through the foreign network to a home agent on its home network. These tunnels provide a virtual two-way connection to the home network for the mobile LAN as if the LAN were connected directly to its home network. When IP mobile, however, a mobile LAN's tunneled network traffic must traverse one or more foreign networks that may not be trusted, and could be subject to eavesdropping, interception, modification, or redirection by malicious nodes in these foreign networks. To protect network traffic passing through the tunnels, OpenSSH is used as a means of encryption because it prevents surveillance, modification, and redirection of mobile LAN traffic passing across foreign networks. Since the software is freely available for most current operating systems and is commonly used to provide secure network communications, OpenSSH is the software of choice.
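The IP encapsulation step the tunnels rely on can be illustrated with a short sketch that hand-builds an outer IPv4 header (protocol number 4, IP-in-IP, per RFC 2003) around an inner packet. This is an illustration only, not the Mobile IP LAN implementation; the TTL choice and zeroed checksum are simplifying assumptions (a real stack fills the checksum in).

```python
import struct
import socket

def ip_in_ip_encapsulate(inner_packet: bytes, home_agent: str, care_of: str) -> bytes:
    """Prepend an outer IPv4 header (protocol 4 = IP-in-IP) to an inner packet,
    as when tunneling traffic from a foreign-network gateway to the home agent."""
    total_len = 20 + len(inner_packet)
    header = struct.pack(
        "!BBHHHBBH4s4s",
        (4 << 4) | 5,              # version 4, header length 5 words
        0,                         # DSCP/ECN
        total_len,                 # total length of outer packet
        0, 0,                      # identification, flags/fragment offset
        64,                        # TTL (illustrative choice)
        4,                         # protocol 4 = IP-in-IP (RFC 2003)
        0,                         # checksum placeholder
        socket.inet_aton(care_of),     # outer source: gateway's care-of address
        socket.inet_aton(home_agent),  # outer destination: home agent
    )
    return header + inner_packet
```

Routers in the foreign network forward on the outer header only; the home agent strips it and delivers the untouched inner packet onto the home network.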
Telepresence, especially in the form of interactive video, is becoming critical to modern warfare, delivering intelligence and enhancing situation awareness. This paper describes an interactive video communication system, Indirect Networked Target-Enhanced Robust Video-Interaction Telepresence (INTERVIT), that has been developed at Physical Optics Corporation (POC). The system consists of ground visual sensors, real-time video compression and target assessment and tracking, and secure wireless transmission. The paper describes the design, fabrication, and testing of an implemented experimental INTERVIT system.
Networks of large numbers of embedded systems, such as those in sensor networks, will require automatic and efficient means for configuring TCP/IP network data paths and for handling dynamic changes in network topology. Recent Force XXI Tactical Internet experiments have exposed numerous problems in attempting to use standard internetworking techniques to handle the special requirements of these kinds of networks. This paper describes the difficulties inherent in using standard Internet protocols in ad hoc networks, reviews the state of the practice of data routing in current military networking environments, and summarizes the results of recent research that show promise for handling current problems and inefficiencies.
The Common Operational Picture (COP) capability can be defined as the ability to display on a single screen integrated views of the Recognized Maritime, Air and Ground Pictures, enriched by other tactical data, such as theater plans, assets, intelligence and logistics information. The purpose of the COP capability is to provide military forces a comprehensive view of the battle space, thereby enhancing situational awareness and the decision-making process across the military command and control spectrum. The availability of a COP capability throughout the command structure is a high priority operational requirement in NATO. A COP capability for NATO is being procured and implemented in an incremental way within the NATO Automated Information System (Bi-SC AIS) Functional Services programme under the coordination of the NATO Consultation, Command and Control Agency (NC3A) Integrated Programme Team 5 (IPT5). The NATO Initial COP (iCOP) capability project, first step of this evolutionary procurement, will provide an initial COP capability to NATO in a highly pragmatic and low-risk fashion, by using existing operational communications infrastructure and NATO systems, i.e. the NATO-Wide Integrated Command and Control Software for Air Operations (ICC), the Maritime Command and Control Information System (MCCIS), and the Joint Operations and Intelligence Information System (JOIIS), which will provide respectively the Recognized Air, Maritime and Ground Pictures. This paper gives an overview of the NATO Initial COP capability project, including its evolutionary implementation approach, and describes the technical solution selected to satisfy the urgent operational requirement in a timely and cost effective manner.
The paper describes the findings and approach of Ex NEAR HORIZONS which, as part of a series of trials, aimed to explore the performance characteristics and potential operational benefits of a number of technology inserts for the UK Digitization Programme. Although the exercise contained five discrete options (hypotheses) for improvement in Command, Control, Communications, Computing and Information (C4I), this paper explores only two of these: a web-based approach and the provision of technology to support distributed and co-located collaborative team working. Although the commercial world is moving towards an information exchange model based on publish and subscribe, the trial found that, while the concept was well received, the implications for changes in organisation and process were substantial. When working collaboratively in a distributed environment, the findings indicate difficulties in gaining an initial shared understanding of the situation and in exercising command. The participants were drawn from a wide range of regular British Army officers, not only to provide broad views on current military benefits but also to move away from traditional trials, which tend to expose a single HQ, with prescriptive processes and organizations, to the technology. The innovative trial was considered to have been very successful, gathering a considerable body of valuable data and identifying clear paths for exploitation of information technologies to support the military decision-maker. The paper extrapolates the findings of the trial to provide comment on the potential difficulties facing the concept of Network Centric Warfare.
Information Technologies are transforming the modern battlefield. Advances in networking and communications technologies have made it possible to communicate with virtually anyone, anywhere, at any time. As the network foundation grows, so does the need for a set of technologies to enable interoperability amongst elements on this network. Service-based architectures are the next evolutionary step in system and software architectures and are particularly applicable to the challenges faced in Network Centric Warfare. This paper explores how service-based technologies apply to the various levels of architecture within a Network-Centric battlefield.
A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land-based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager, including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware-based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform, rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead, the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, and uncertainty measures for sensor output; intelligence reports; etc. Computational experiments show the networked defending platforms' ability to self-organize, which is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
Light Armoured Vehicles (LAVs) are being developed to meet the modern requirements of rapid deployment and operations other than war. To achieve these requirements, passive armour is minimized and survivability depends more on sensors, computers and countermeasures to detect and avoid threats. The performance, reliability, and ultimately the cost of these components will be determined by the trends in computing and communications. These trends and their potential impact on DAS (Defensive Aids Suite) development were investigated and are reported in this paper. Vehicle performance is affected by communication with other vehicles and other ISTAR (Intelligence, Surveillance, Target Acquisition and Reconnaissance) battlefield assets. This investigation includes the networking technology Jini, developed by Sun Microsystems, which can be used to interface the vehicle to the ISTAR network. VxWorks, by Wind River Systems, is a real-time operating system designed for military systems and compatible with Jini. Other technologies affecting computer hardware development include dynamic reconfiguration, hot swap, alternate pathing, CompactPCI, and Fibre Channel serial communication. To achieve the necessary performance at reasonable cost, and over the long service life of the vehicle, a DAS should have two essential features. A 'fitted for, but not fitted with' approach will provide the necessary rapid deployment without a need to equip the entire fleet. With an expected vehicle service life of 50 years, 5-year technology upgrades can be used to maintain vehicle performance over the entire service life. A federation of modules, instead of integrated fused sensors, will provide the capability for incremental upgrades and mission configurability. A plug-and-play capability can be used for both hardware and expendables.
Traditional display, control and situational awareness technologies may not allow the fighting vehicle commander to take full advantage of the rich data environment made available in the net-centric battlefield of the future. Indeed, the sheer complexity and volume of available data, if not properly managed, may actually reduce crew performance by overloading or confusing the commander with irrelevant information. New techniques must be explored to understand how to present battlefield information and provide the commander with continuous high-quality situational awareness without significant cognitive overhead. Control of the vehicle's many complex systems must also be addressed; the entire Soldier-Machine Interface must be optimized if we are to realize the potential performance improvements. Defence Research and Development Canada (DRDC) and General Dynamics Canada Ltd. have embarked on a joint program, called the Future Armoured Fighting Vehicle Systems Technology Demonstrator, to explore these issues. The project is based on man-in-the-loop experimentation using virtual reality technology on a six degree-of-freedom motion platform that simulates the motion, sights and sounds inside a future armoured vehicle. The vehicle commander is provided with a virtual reality vision system to view a simulated 360-degree multi-spectrum representation of the battlespace, thus providing enhanced situational awareness. Graphic overlays with decision aid information will be added to reduce cognitive loading. Experiments will be conducted to evaluate the effectiveness of virtual control systems. The simulations are carried out in a virtual battlefield created by linking our simulation system with other simulation centers to provide a net-centric battlespace where enemy forces can be engaged in fire fights. Survivability and lethality will be measured in successive test sequences using real armoured fighting vehicle crews to optimize overall system effectiveness.
Proc. SPIE 4741, Measured results of an efficient broadband HF antenna system for reliable all-terrain communication between unattended ground sensors, 0000 (6 August 2002); https://doi.org/10.1117/12.478708
The purpose of this paper is to verify a prior study of propagation losses in various environments for unattended ground sensors operating in the HF, VHF, and UHF bands. At each frequency band, tests were conducted and measurements taken to provide the actual range for these radios. Measured and predicted results are well correlated and help support the idea that for a network of unattended ground sensors, HF and lower VHF frequencies have better propagation characteristics in a wide range of environments than does the UHF band.
Target location is a problem where the application of multiple sensors that are geographically distributed can determine or improve the location estimate of a target. If these sensors are capable of cooperative behaviour then the information from each sensor can be autonomously fused to provide an estimate of the target position. The individual sensors may be quite unsophisticated, yet the observation system that is created through cooperation and adaptive networking of these sensors provides sufficient process gain to achieve target location accuracies similar to those of expensive centralized sensor systems. The accuracy of target location estimates depends heavily on the separation distance between the sensors. Large baseline geometry takes advantage of many seemingly unsophisticated bearing measurements that are organised into a coordinated observation system to locate a target. Team formation is one method to address coordination of distributed sensors, data fusion, sensor resource and energy management, and communication link control based on the concept of cooperating machines [1,2,3]. We apply an algorithm for agent team formation [4], inspired by the self-organising behaviour observed in colonies of ants, to the problem of integrating the sensors of a group of networked mini-Autonomous Air Vehicles (AAVs). The mini-AAVs are tasked to locate targets within a region of interest. The challenge we address is to make the location estimation system adaptive to a dynamic environment and robust to failure. Simulation results are presented which address issues in distributed data fusion, sensor resource and energy management, and communication link control, for a group of mini-AAVs.
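A minimal example of how simple, distributed bearing measurements can be fused into a target location estimate is a linear least-squares intersection of bearing lines. This is a sketch only; it does not reproduce the paper's ant-inspired team-formation algorithm, and the function name and angle conventions are our assumptions.

```python
import math

def locate_from_bearings(sensors, bearings_deg):
    """Least-squares intersection of bearing lines from distributed sensors.

    Each sensor at (x_i, y_i) reports a bearing theta_i (degrees, measured
    from the +x axis) to the target. The target satisfies
        sin(theta_i)*x - cos(theta_i)*y = sin(theta_i)*x_i - cos(theta_i)*y_i
    for every i; the 2x2 normal equations are solved directly.
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (xi, yi), theta in zip(sensors, bearings_deg):
        s, c = math.sin(math.radians(theta)), math.cos(math.radians(theta))
        r = s * xi - c * yi
        a11 += s * s; a12 += -s * c; a22 += c * c
        b1 += s * r;  b2 += -c * r
    det = a11 * a22 - a12 * a12   # singular if all bearings are parallel
    x = (a22 * b1 - a12 * b2) / det
    y = (a11 * b2 - a12 * b1) / det
    return x, y
```

With a wide baseline between sensors, even coarse bearings pin the intersection down accurately, which is the process gain the abstract refers to.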
The Smart Sensor Web (SSW) project was a two year effort sponsored by the Deputy Undersecretary of Defense for Science and Technology (DUSD(S&T)). The vision of the SSW is an intelligent, web-centric distribution and fusion of sensor information that provides greatly enhanced local situational awareness, on demand, to warfighters at lower echelons (battalion/squadron and below). The project examined critical technical issues associated with developing such a system in a joint operational context, including Army, Marine, Air Force and SOF elements. Key constraints in an SSW system include energy, communications bandwidth, latency, and information presentation. This analysis is focused on information generation as far forward as possible to minimize bandwidth requirements and maximize the use of continually improving processing and memory capability. It also focuses on the problem of information fusion and presentation, ensuring that only mission relevant and understandable information is presented to the warfighter. The key mechanism for addressing these concepts is the SSW test bed, a combination of virtual and live assets. Two operational vignettes were used during the second experiment using the test bed: (1) dismounted infantry conducting operations on urban terrain, and (2) the employment of wide-area search munitions such as the Air Force's Low Cost Autonomous Attack System (LOCAAS) in a cooperative attack environment. This paper will focus on the concept for the experiment, some of the key technical issues addressed, the interplay of the simulation methods used, and results from the final live experiment conducted in January 2002.
Time-difference of arrival (TDOA) estimates are an attractive means for geolocation of targets via low-cost, distributed, single-element acoustic sensors. Relative to distributed beamforming approaches, TDOA localization requires significantly less bandwidth between sensor nodes and exhibits greater tolerance to uncertainties in sensor node location and data synchronization. In this paper, we present algorithms for estimating TDOA over low-bandwidth links, and for combining these estimates to provide geolocation of targets. Both of these components are adapted specifically to operation in a low-power, low-bandwidth distributed sensor environment. TDOA estimation is performed using spectral peaks from the acoustic signals, which allows drastic reduction in the bandwidth required to collaboratively determine bearing to target. Previously published localization algorithms were modified to minimize the required communication bandwidth and to support scaling of the algorithm to many distributed nodes. The performance of the various localization algorithms is simulated and compared for several scenarios. The preferred algorithms are also applied to pre-recorded field data and the resulting geolocation estimates are compared to ground truth data.
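As a rough illustration of the geolocation stage, localization from TDOA measurements can be posed as finding the point whose predicted range differences best match the measured ones. The grid search below is a deliberately simple stand-in for the paper's bandwidth-optimized algorithms; the names, parameters, and search strategy are our assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, nominal

def tdoa_grid_localize(sensors, tdoas, area=((0.0, 100.0), (0.0, 100.0)), step=1.0):
    """Locate an acoustic source from time-differences of arrival.

    `tdoas[i]` is the arrival-time difference between sensor i+1 and the
    reference sensor 0. A coarse grid search returns the point whose
    predicted range differences best fit the measurements (least squares).
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    best, best_err = None, float("inf")
    (x0, x1), (y0, y1) = area
    x = x0
    while x <= x1:
        y = y0
        while y <= y1:
            d0 = dist((x, y), sensors[0])
            err = sum(
                (dist((x, y), s) - d0 - SPEED_OF_SOUND * t) ** 2
                for s, t in zip(sensors[1:], tdoas)
            )
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best
```

Each TDOA constrains the source to a hyperbola; the search effectively finds the intersection of those hyperbolae, which is why at least three sensors are needed for an unambiguous 2-D fix.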
A Cooperative Sensor Network is an array of sensors interconnected by a local or wide area tactical communications network. Sensor data are shared between the sensors and used as input to an estimator to measure a process such as target location or target identification. In this paper a stochastic sensor scheduling framework is applied to the position estimation of multiple emitter targets using a Cooperative Sensor Network in which the communications bandwidth between sensor nodes is constrained. The stochastic sensor scheduling problem is presented, and a practical suboptimal sensor scheduling algorithm for multi-target localisation using bearing-only sensors is demonstrated in simulation. Cooperative Sensor Networks are then applied to the Integrated Surveillance, Reconnaissance and Electronic Warfare (ISREW) problem; one way to achieve ISREW is to develop distributed EW and S&R systems using Cooperative Sensor Networks. We describe a new approach to tactical sensor networking and information processing based on stochastic sensor scheduling techniques, and present simulation results for a number of examples of cooperative EW and Surveillance systems.
We have been developing a decentralised architecture for data fusion for several years. In this architecture, sensing nodes, each with their own processing, are networked together. Previously, we have researched fully connected networks, tree-connected networks, and networks with loops, and have developed a range of theoretical and empirical results for dynamic networks. Here we report the results obtained from building and demonstrating a decentralised data fusion system in which the nodes are connected via an ad hoc network. Several vision-based tracking nodes are linked via a wireless LAN. We use UDP to establish local routing tables within the network whenever a node joins, and TCP/IP to provide point-to-point communications within the network. We show that the resulting data fusion system is modular, scalable and fault tolerant. In particular, we demonstrate robustness to nodes joining and leaving the network, either by choice or as a result of link drop-out. In addition to experimental results from the project, we present some thoughts on how the technology could be applied to large-scale, heterogeneous sensor networks.
In a typical security and monitoring system a large number of networked cameras are installed at fixed positions around a site under surveillance. There is generally no global view or map that shows the guard how the views of different cameras relate to one another. Individual cameras may be equipped with pan, tilt and zoom capabilities, and the guard may be able to follow an intruder with one camera, then pick him up with another. But such tracking can be difficult, and hand-off between cameras disorienting. The guard does not have the ability to continually shift his viewpoint. Moreover, current systems do not scale with the number of cameras, becoming more unwieldy as cameras are added. In this paper, we present the system and key algorithms for remote immersive monitoring of an urban site using a blanket of video cameras. The guard monitors the world using a live 3D model, which is constantly being updated from different directions using the multiple video streams. The world can be monitored remotely from any virtual viewpoint. The observer can see the entire scene from afar and get a bird's-eye view, or can fly/zoom in and see activity of interest up close. A 3D site model is constructed of the urban site and used as glue for combining the multiple video streams. Moreover, each of the video cameras has smart image processing associated with it, which allows it to detect moving and new objects in the scene and recover their 3D geometry and the pose of the camera with respect to the world model. Each video stream is overlaid on top of the world model using the recovered pose. Virtual views of the scene are generated by combining the various video streams, the background 3D model and the recovered 3D geometry of foreground objects. These moving objects are highlighted on the 3D model and used as a cue by the operator to direct his viewpoint.
The performance of three distributed sensor fusion network architectures is investigated: a fully-connected and a partially-connected measurement fusion system and a partially-connected track fusion system. The investigation employs an advanced military scenario generator, FLAMES, which was customised for exercising a range of distributed data fusion experiments. Specifically, it includes a representative model of the delays in a communication system (such as JTIDS or Link 16). Here the delays were used to modify communication bandwidth and to evaluate how this affected the performance of the fusion architectures/algorithms. Under certain specific scenario conditions, it was found that the decentralised measurement fusion system was severely affected by reduced bandwidth. This is because each node loads its communication buffer with every measurement, and consequently some measurements are never transmitted. The decentralised track fusion system exhibits improved performance because it lumps measurements into tracks and thereby makes much more effective use of the bandwidth. Moreover, it was found that the performance of the partially-connected decentralised track fusion system was very close to the optimal performance achieved by the fully-connected decentralised measurement fusion system.
Automatic aircraft recognition is very complex because of clutter, shadows, clouds, self-occlusion and degraded imaging conditions. This paper presents an aircraft recognition system, which assumes from the start that the image is possibly degraded, and implements a number of strategies to overcome edge fragmentation and distortion. The current vision system employs a bottom-up approach, where recognition begins by locating image primitives (e.g., lines and corners), which are then combined in an incremental fashion into larger sets of line groupings using knowledge about aircraft, as viewed from a generic viewpoint. Knowledge about aircraft is represented in the form of whole/part shape description and the connectedness property, and is embedded in production rules, which primarily aim at finding instances of the aircraft parts in the image and checking the connectedness property between the parts. Once a match is found, a confidence score is assigned, and as evidence in support of an aircraft interpretation accumulates, the score is increased proportionally. Finally, a selection of the resulting image interpretations with the highest scores is subjected to competition tests, and only non-ambiguous interpretations are allowed to survive. Experimental results demonstrating the effectiveness of the current recognition system are given.
Computer modeling programs such as the Battlescale Forecast Model are capable of generating three-dimensional (3-D) Meteorological (Met) data with sufficiently fine spatial resolution to warrant a study of data compression methods for efficient storage and/or transmission of these data. This paper illustrates the potential benefits of applying lossy/irreversible data compression techniques to such Met variables as air pressure. Because of the advanced state of development of digital image compression methods such as the JPEG 2000 algorithm, which is already an international standard, the approach considered and illustrated in this paper uses the two-dimensional (2-D), single-component JPEG 2000 algorithm on horizontal 2-D slices of data. Much better results are obtained by first pre-processing the 3-D data in the vertical direction by applying a one-dimensional, energy-compacting, reversible linear transformation. The best pre-processing involves the Karhunen-Loeve Transform, which is shown to increase compression ratios for the same signal-to-noise ratio (SNR) by a factor of 10 over the 2-D (no pre-processing) approach. Alternatively, for the same bit rate, the SNR is improved by up to 40 dB.
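The energy-compacting effect of the vertical pre-processing can be seen in a two-level sketch, where the Karhunen-Loeve Transform reduces to a closed-form plane rotation. This is an illustration under our own conventions; the paper applies the full transform across all vertical levels of the 3-D field.

```python
import math

def klt_2level(level_a, level_b):
    """Karhunen-Loeve transform of two vertically adjacent data levels.

    For two variables the KLT is a plane rotation by
        theta = 0.5 * atan2(2*cov, var_a - var_b),
    which decorrelates the levels and compacts energy into the first output,
    so a 2-D coder such as JPEG 2000 spends fewer bits on the second.
    """
    n = len(level_a)
    ma, mb = sum(level_a) / n, sum(level_b) / n
    var_a = sum((a - ma) ** 2 for a in level_a) / n
    var_b = sum((b - mb) ** 2 for b in level_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(level_a, level_b)) / n
    theta = 0.5 * math.atan2(2 * cov, var_a - var_b)
    c, s = math.cos(theta), math.sin(theta)
    principal = [c * (a - ma) + s * (b - mb) for a, b in zip(level_a, level_b)]
    residual = [-s * (a - ma) + c * (b - mb) for a, b in zip(level_a, level_b)]
    return principal, residual
```

When adjacent levels are strongly correlated, as pressure fields typically are, nearly all the variance lands in the principal component and the residual compresses to almost nothing.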
A method for detecting buried mines in ground penetrating radar (GPR) data using a Hough transform approach is described. GPR is one of three sensors used in the Mine Hunter/Killer (MH/K) system for detecting buried mines. A buried mine modeled as a point scatterer in object space gives rise to a hyperbolic response in GPR measurement space. Our approach uses the Hough transform to recover the object space representation (i.e., the location of mines in x, y, and depth) from the GPR data, in effect 'deconvolving' the response of the radar. This is done by having each point in measurement space vote for all points in object space where the mine could be located. Against a baseline energy detector, the Hough algorithm shows a half-order-of-magnitude reduction in false alarm rate at a fixed probability of detection for low-metal, metal, and non-metal mines.
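The voting step can be sketched as follows for a single scan line. This is an illustration only: the propagation model, parameter names, and quantization are our assumptions rather than the MH/K implementation.

```python
import math

def gpr_hough(points, velocity, x_candidates, depth_step=0.05):
    """Hough-transform voting for buried point scatterers in GPR data.

    Each above-threshold sample (x, t) in measurement space (scan position,
    two-way travel time) votes for every object-space cell (x0, d) it could
    have come from, via t = (2/v) * sqrt(d**2 + (x - x0)**2). Peaks in the
    accumulator mark likely mine locations.
    """
    accumulator = {}
    for x, t in points:
        r = velocity * t / 2.0          # one-way range to the scatterer
        for x0 in x_candidates:
            d_sq = r * r - (x - x0) ** 2
            if d_sq >= 0.0:             # only geometrically feasible depths
                d = round(math.sqrt(d_sq) / depth_step) * depth_step
                cell = (x0, round(d, 6))
                accumulator[cell] = accumulator.get(cell, 0) + 1
    return max(accumulator, key=accumulator.get), accumulator
```

All samples on one hyperbola vote for the same (x0, d) cell, so the true scatterer accumulates votes while clutter spreads its votes thinly, which is the mechanism behind the reported false-alarm reduction.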
Tensor magnetic gradiometry reports accurate direction and approximate range to unexploded ordnance (UXO) from a single-point measurement. Instead of using a sparse grid of points, a UAV can fly a series of adjacent tracks and acquire the data needed to locate all items. Recent modeling suggests that normal UAV flight speeds can achieve an area coverage rate of 15 acres/minute. This contrasts sharply with total-field magnetic techniques, which require a much denser grid of sensor tracks. Synthetic data from equivalent surface-based measurements and simulations of real-time processing substantiate the models.
The major focus of the paper is the use of remote sensing systems in providing planning support and advice for ground operations at the World Trade Center site and the related debris processing and disposal sites in the New York area. A summary of the World Trade Center recovery effort is presented. This was the largest and most complex recovery effort of its kind ever to occur in the United States. Remote sensing was only one part of the total recovery activity but provided important assistance throughout the operation. Samples of the geospatial technologies used in the recovery are reviewed, including 3-D visualization, thermal infrared imagery, LIDAR data systems, IKONOS one-meter panchromatic imagery, SPOT imagery, and digital aerial imagery. The general area of disaster response management is also addressed, and the findings of various studies in this area are related to the World Trade Center disaster. Observations and lessons learned from the World Trade Center disaster response are discussed, and recommendations for the use of remote sensing systems and products in future disaster situations are presented.
The Naval Research Laboratory's airborne WAR HORSE sensor incorporates a hyperspectral line-scan sensor, a high-resolution video line-scanner, and a CMIGITS INS/GPS unit. Targets are detected in real time from the hyperspectral data, and images of the detected targets are chipped from the high-resolution video data for presentation to an operator. The INS/GPS data are used to geo-spatially register (georegister) both the hyperspectral data and the video chips. In this paper we show detection results for processing the hyperspectral data both before and after geospatial registration when assumed target size is incorporated into the detection algorithms. Then we illustrate the utility of presenting target image chips which are geospatially registered and fused with the hyperspectral data.
We detected roads in aerial imagery based on multiresolution linear feature detection. Our method used the products of wavelet coefficients at several scales to identify and locate linear features. After detecting possible road pixels, we used a shortest-path algorithm to identify roads. The multiresolution approach effectively increased the size of the region we examined when looking for possible road pixels and reduced the effect of noise. We found that our approach leads to an effective method for detecting roads in aerial imagery.
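The multiresolution product described above can be illustrated with an a-trous-style construction: detail coefficients are formed as differences between box smoothings at successive scales, and their absolute values are multiplied across scales. This is a hedged sketch with my own function names and box filters in place of whatever wavelet family the authors used; the key property, that persistent linear features reinforce in the product while single-scale noise is attenuated, carries over.

```python
import numpy as np

def box_smooth(img, k):
    """Mean filter with a (2k+1)x(2k+1) window, edge-padded."""
    pad = np.pad(img, k, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(2 * k + 1):
        for dx in range(2 * k + 1):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def wavelet_product(img, radii=(1, 2, 4)):
    """Pointwise product of |detail coefficients| across scales,
    where detail_s = smooth_{s-1}(img) - smooth_s(img).

    A genuine linear feature responds at every scale, so its
    responses reinforce in the product; isolated noise does not.
    """
    img = img.astype(float)
    prev = img
    prod = np.ones(img.shape)
    for r in radii:
        cur = box_smooth(img, r)
        prod *= np.abs(prev - cur)
        prev = cur
    return prod
```

The resulting response map would then feed the shortest-path stage: candidate road pixels define a graph whose edge costs decrease with response strength, and a shortest-path search (e.g. Dijkstra's algorithm) links them into continuous roads.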
Novel statistical modeling and training techniques are proposed for improving classification accuracy of land cover data acquired by the Landsat Thematic Mapper (TM). The proposed modeling techniques consist of joint modeling of spectral feature distributions among neighboring pixels and partial modeling of spectral correlations across TM sensor bands with a set of semi-tied covariance matrices in Gaussian mixture densities (GMD). The GMD parameters and semi-tied transformation matrices are first estimated by an iterative maximum likelihood estimation algorithm, Expectation-Maximization, and the parameters are then tuned by a minimum classification error training algorithm to enhance the discriminative power of the statistical classifiers. Compared with a previously proposed single-pixel based Gaussian mixture density classifier, the proposed techniques significantly improved the overall classification accuracy on eight land cover classes from imagery data of the state of Missouri.
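The baseline of the approach above, a per-class Gaussian mixture fitted by EM and used as a maximum-likelihood classifier, can be sketched in numpy. This is a minimal illustration under simplifying assumptions I am adding (diagonal covariances, no semi-tied transforms, no discriminative tuning), with function names of my own choosing.

```python
import numpy as np

def fit_gmm(X, k=2, iters=50, seed=0):
    """Fit a k-component Gaussian mixture (diagonal covariances) to
    X of shape (n_samples, n_features) by Expectation-Maximization."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]            # init means from data
    var = np.full((k, d), X.var(axis=0) + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities from per-component log densities.
        logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                        + np.log(2 * np.pi * var)).sum(-1) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)        # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ X ** 2) / nk[:, None] - mu ** 2 + 1e-6
    return pi, mu, var

def gmm_loglik(X, params):
    """Log-likelihood of each sample under one fitted mixture."""
    pi, mu, var = params
    logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                    + np.log(2 * np.pi * var)).sum(-1) + np.log(pi))
    m = logp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(logp - m).sum(axis=1, keepdims=True)))[:, 0]

def classify(X, class_params):
    """Maximum-likelihood label: argmax over per-class GMMs."""
    scores = np.stack([gmm_loglik(X, p) for p in class_params], axis=1)
    return scores.argmax(axis=1)
```

In the paper's formulation the feature vector for each pixel would additionally stack neighboring pixels' spectra, and the covariances would share semi-tied transformation matrices across components.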
This paper presents predicted performance for area-based image matching where two images of a non-planar object model differ by a general perspective geometric transformation. The study shows there exists a window size that will maximize or minimize certain performance parameters for a given perspective distortion and object planarity variance. The analysis also indicates that for a given perspective distortion where pitch angle is the only parameter, many performance criteria have an optimum window size if the object model is allowed to vary. The performance measures examined are expected peak value, peak-to-sidelobe ratio (PSR), probability of acquisition (PCA), and image registration error covariance. Window adaptation based on precomputed metrics is applied to extend distortion tolerance. Statistically consistent image sets are geometrically transformed by a general perspective spatial mapping using statistically consistent, independent, non-planar object models with arbitrary generalized autocorrelation functions. The two images are then registered through an image matching technique, the defining functions are analyzed, and limits on the amount of perspective viewpoint change an imaging system can tolerate in an aerial tactical arena while still allowing proper image correspondence are given. Monte Carlo simulations verify the theoretical predictions, and the results are extended to a variety of common area-based image matching techniques.
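Two of the performance measures above, the correlation peak and the peak-to-sidelobe ratio (PSR), are easy to illustrate for a basic area-based matcher. The sketch below is my own minimal normalized cross-correlation implementation, not the paper's; it slides a template window over an image and reports the PSR of the resulting correlation surface.

```python
import numpy as np

def ncc_map(image, template):
    """Normalized cross-correlation surface of a template over an
    image (valid placements only). Peak value 1.0 for an exact match."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    H, W = image.shape
    out = np.empty((H - th + 1, W - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            w = image[y:y + th, x:x + tw]
            wz = (w - w.mean()) / (w.std() + 1e-12)
            out[y, x] = (wz * t).mean()        # correlation coefficient
    return out

def peak_to_sidelobe(corr, exclude=2):
    """PSR: peak height versus mean/std of the surface with a small
    region around the peak masked out. Returns (psr, peak_location)."""
    py, px = np.unravel_index(corr.argmax(), corr.shape)
    mask = np.ones(corr.shape, bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False
    side = corr[mask]
    return (corr[py, px] - side.mean()) / (side.std() + 1e-12), (py, px)
```

Perspective distortion between the two images lowers the peak and the PSR, which is what drives the window-size trade-off the paper analyzes: larger windows average more evidence but suffer more geometric decorrelation.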
A variety of source data, hardware/software packages, methodology, and techniques exist for the capture, rendering, and navigation of complex three-dimensional (3D) urban terrain. This paper investigates issues, options, and concerns inherent in the movement from 2D to 3D visualization and finally to a display environment in advanced virtual worlds exhibiting four-dimensional (4D) urban terrain, the fourth dimension being time. Methodology options are presented as a matrix for 3D feature development. Proposals and recommendations as to future avenues of research in the subject arena are presented as a roadmap to meeting the challenges of visualization for urban operational planning. Applications include humanitarian, emergency, recovery and relief operations, force protection, threat assessment, pre- and post-incident response, mission planning and rehearsal, law enforcement, and Homeland Defense.
Video imagery collection and analysis is used to understand conditions and dynamic changes within the home environment. This is part of a joint venture between George Washington University and America Online to design, develop, and evaluate new technology for the Home of the 21st Century. This initial phase is focused on the capture and distribution of image information within the home and over the internet to allow local and remote observation and control of in-home systems.
In this paper, a new method is proposed for supervised classification of ground cover types using polarimetric synthetic aperture radar (SAR) data. The concept of a similarity parameter between two scattering matrices is introduced and shown to preserve some intrinsic properties of the scattering mechanism. Four similarity parameters of each pixel in the image are used for classification. The scattering matrix span of each pixel is also used to establish the feature space. Principal component analysis is adopted for extracting the feature transform vector and for making the classification decision. The classification result of the new method is given with comparison to that of the maximum likelihood method, demonstrating the effectiveness of the proposed scheme.
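One common way to define such a similarity parameter, which may differ in detail from the paper's formulation, is the squared magnitude of the normalized inner product of the Pauli scattering vectors of the two matrices: it equals 1 for identical scattering mechanisms, 0 for orthogonal ones, and is invariant to overall amplitude and phase. A minimal numpy sketch under that assumption:

```python
import numpy as np

def pauli_vector(S):
    """Pauli scattering vector of a 2x2 scattering matrix
    (reciprocal case, S_hv = S_vh assumed)."""
    return np.array([S[0, 0] + S[1, 1],
                     S[0, 0] - S[1, 1],
                     2 * S[0, 1]]) / np.sqrt(2)

def similarity(S1, S2):
    """Similarity parameter between two scattering matrices:
    |<k1, k2>|^2 / (|k1|^2 |k2|^2) with Pauli vectors k1, k2.
    Equals 1 for identical mechanisms, 0 for orthogonal ones."""
    k1, k2 = pauli_vector(S1), pauli_vector(S2)
    num = abs(np.vdot(k1, k2)) ** 2
    den = np.vdot(k1, k1).real * np.vdot(k2, k2).real
    return num / den
```

For classification, each pixel's scattering matrix would be compared against a few canonical mechanisms (e.g. a plane surface and a dihedral), yielding the per-pixel similarity features.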
Recent advances in remote sensing have led the way for the development of hyperspectral sensors and the applications of hyperspectral data. Hyperspectral remote sensing is a relatively new technology, currently being investigated by researchers and scientists with regard to the detection and identification of minerals, terrestrial vegetation, man-made materials, and backgrounds. Airborne hyperspectral imaging data have been applied operationally to a number of land-use, natural-environment, geology, agriculture, and other studies. In this study, airborne hyperspectral imaging data were tested for vegetation and man-made object identification. Natural and artificial grassland, different types of crops, different types of forest and bush, and different types of metal slabs at a construction project were classified and identified with high accuracy. In this work, the Operational Modular Imaging Spectrometer (OMIS) provided the imaging spectrometer data. OMIS has 128 spectral bands, covering the visible, shortwave infrared, midwave infrared, and thermal infrared spectral regions. Results suggest that hyperspectral imaging data, especially in the shortwave infrared and thermal infrared wavelengths, have broad application prospects in object identification.
The aim of this investigation was to develop registration algorithms for aerial and satellite images taken in different seasons, from differing viewpoints, and by different kinds of sensors (visible, IR, SAR). Structural matching was chosen as the only approach that could cope with the image differences mentioned. In contrast to the target-dependent structural analysis applied in expert systems, we dealt with images of arbitrary content; thus the rigidity and opacity of landscape objects and the rules of shadowing were the only constraints applied. Simple contour elements corresponding to object borders were judged to be the most stable source of structural descriptions in this uncertain situation. However, a large number of similar simple elements produced a high-dimensional structural matching task, which decreased the reliability of matching and exponentially increased the computational cost. We solved this problem by building hierarchical structural descriptions: the structural elements were separated into rather small local groups and structurally matched using special tree-walking algorithms. Two grouping approaches were applied: uniting the elements belonging to a continuous contour line, or uniting those situated in a separate compact region. The matching results were reliable for both multi-season and multi-sensor images. The first approach demonstrates slightly better precision, while the second is slightly more robust and flexible, and can therefore also handle structural elements of another nature: compact regions produced by texture segmentation were also successfully matched.