This paper describes an autonomous navigation system capable of exploring an unknown environment, as implemented by the BAE SYSTEMS Advanced Technology Centre (ATC). An overview of the system's enabling technology, simultaneous localization and mapping (SLAM), is given before the utility functions used to perform the strategic decision making required for autonomous exploration are described in detail. Relevant studies, performed in simulation, of the major issues involved in multi-platform SLAM are also described. Initial results and conclusions are given at the end of the paper. All experiments are conducted on Pioneer all-terrain mobile robots using common off-the-shelf sensing devices.
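The abstract does not specify the form of these utility functions. As a purely illustrative sketch, a common choice in exploration planning scores candidate viewpoints by expected information gain minus travel cost and selects the highest-scoring goal; the weights, class, and helper names below are assumptions for illustration, not the ATC implementation.

```python
# Hedged sketch of an exploration utility function: score candidate
# viewpoints by expected information gain minus travel cost, then pick
# the best one. Weights and names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    info_gain: float   # expected reduction in map uncertainty (e.g. entropy)
    path_cost: float   # estimated travel cost from the current pose

def utility(c: Candidate, w_info: float = 1.0, w_cost: float = 0.2) -> float:
    return w_info * c.info_gain - w_cost * c.path_cost

def select_goal(candidates: list[Candidate]) -> Candidate:
    return max(candidates, key=utility)
```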
Network-centric architectures are defined by the complete absence of a traditional central data fusion site and, in general, of a central communication facility. Instead, data fusion is performed at each network node and these nodes communicate on a strictly point-to-point basis. The network topology, which may be dynamic, is assumed to be unknown. These governing constraints imply a fault-tolerant, scalable, and modular system. However, such systems are prone to producing inconsistent fused estimates as a consequence of the well-known rumor propagation problem. The algorithmic challenge is to combat this problem without sacrificing the aforementioned benefits. This has led to the formulation of a technique known as Covariance Intersection (CI). Most recently, CI has been integrated with the Kalman filter to produce the Split CI algorithm, a general solution to decentralised data fusion in arbitrary communication networks. These algorithms have not yet been evaluated outside of a limited simulation environment. The purpose of this paper is to present a study of their relative performance in a hardware-based decentralised sensor network system.
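As background, the standard Covariance Intersection update fuses two estimates with unknown cross-correlation as a convex combination in information space. The sketch below is a generic textbook implementation, not the code evaluated in the paper; the weight omega is found here by a simple trace-minimising grid search.

```python
# Generic Covariance Intersection fusion of two estimates (xa, Pa) and
# (xb, Pb) with unknown cross-correlation (textbook form):
#   P^-1 = w * Pa^-1 + (1 - w) * Pb^-1
#   x    = P @ (w * Pa^-1 @ xa + (1 - w) * Pb^-1 @ xb)
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    Ia, Ib = np.linalg.inv(Pa), np.linalg.inv(Pb)
    best_x, best_P = None, None
    for w in np.linspace(0.0, 1.0, n_grid):      # crude search over omega
        P = np.linalg.inv(w * Ia + (1.0 - w) * Ib)
        if best_P is None or np.trace(P) < np.trace(best_P):
            best_x = P @ (w * Ia @ xa + (1.0 - w) * Ib @ xb)
            best_P = P
    return best_x, best_P
```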
The paper will describe a number of indoor experiments that involve tracking a ground target by means of multiple, networked, wall-mounted cameras. High-precision ground truth target positions are available from a laser-tracking device. The experiments will evaluate Kalman, CI, and Split CI algorithm performance, measured in terms of consistency, convergence, and accuracy, with respect to a range of static and dynamic network topologies.
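The abstract does not state how consistency is scored. One standard choice, given ground truth such as that provided by the laser tracker, is the normalised estimation error squared (NEES) checked against chi-square bounds; the snippet below is a generic sketch of that test, offered as an assumption rather than the paper's actual metric.

```python
# Generic NEES consistency check: for a consistent filter the normalised
# estimation error squared should fall within chi-square bounds whose
# degrees of freedom equal the state dimension. Assumed metric, not the
# paper's stated procedure.
import numpy as np
from scipy.stats import chi2

def nees(x_true, x_est, P):
    e = x_true - x_est
    return float(e @ np.linalg.inv(P) @ e)

def is_consistent(x_true, x_est, P, alpha=0.05):
    dof = len(x_true)
    lo, hi = chi2.ppf(alpha / 2, dof), chi2.ppf(1 - alpha / 2, dof)
    return lo <= nees(x_true, x_est, P) <= hi
```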
KEYWORDS: Sensors, Data fusion, MATLAB, Data communications, Cameras, Telecommunications, Detection and tracking algorithms, Target detection, Sensor networks, Imaging systems
In this paper we describe a simple physical test-bed, developed to allow practical experimentation in the use of Decentralised Data Fusion (DDF) in sensor-to-shooter applications. Running DDF over an ad hoc network of distributed sensors produces target location information, which is used to guide a Leica laser-tracker system to designate currently tracked targets. We outline how the system is robust to network and/or node failure. Moreover, we discuss the system properties that make it completely “plug-and-play”: like the distributed sensor nodes, the “shooter” does not require knowledge of the overall network topology and can connect at any point.
We have been developing a decentralised architecture for data fusion for several years. In this architecture, sensing nodes, each with its own processing, are networked together. Previously, we have researched fully connected networks, tree-connected networks, and networks with loops, and have developed a range of theoretical and empirical results for dynamic networks. Here we report the results obtained from building and demonstrating a decentralised data fusion system in which the nodes are connected via an ad hoc network. Several vision-based tracking nodes are linked via a wireless LAN. We use UDP to establish local routing tables whenever a node joins the network, and TCP/IP to provide point-to-point communications between nodes. We show that the resulting data fusion system is modular, scalable, and fault tolerant. In particular, we demonstrate robustness to nodes joining and leaving the network, either by choice or as a result of link drop-out. In addition to experimental results from the project, we present some thoughts on how the technology could be applied to large-scale, heterogeneous sensor networks.
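To make the networking scheme concrete, the sketch below shows one way a node might announce itself over UDP broadcast so that peers can update their routing tables, and then push estimates to a neighbour over a point-to-point TCP link. The ports, message format, and function names are illustrative assumptions, not the project's actual protocol.

```python
# Minimal sketch of the join/communication pattern described above, assuming
# a single UDP discovery port and a per-node TCP port. All names and message
# formats are hypothetical.
import json
import socket

DISCOVERY_PORT = 9999     # assumed UDP port for join announcements
NODE_TCP_PORT = 10000     # assumed TCP port for point-to-point DDF traffic

def announce_join(node_id: str) -> None:
    """Broadcast a JOIN message so existing nodes can update routing tables."""
    msg = json.dumps({"type": "JOIN", "node": node_id}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("<broadcast>", DISCOVERY_PORT))

def send_estimate(peer_ip: str, estimate: dict) -> None:
    """Push a local fused estimate to one neighbour over a TCP connection."""
    with socket.create_connection((peer_ip, NODE_TCP_PORT), timeout=2.0) as s:
        s.sendall(json.dumps(estimate).encode())
```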
KEYWORDS: Robots, Computer architecture, Sensors, MATLAB, Laser range finders, Laser systems engineering, Feature extraction, Filtering (signal processing), Cameras, Control systems
Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented-state Kalman filter. Here we report the results of experiments conducted with multiple vehicles, each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set-up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single-vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.
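For readers unfamiliar with the augmented-state formulation, the sketch below shows the standard landmark-initialisation step in which the state vector and covariance are grown when a new feature is first observed. The 2-D landmark parameterisation and the Jacobian names are generic textbook choices, not necessarily those used in the implementation described here.

```python
# Illustrative state augmentation step for augmented-state Kalman-filter SLAM:
# when a new landmark is observed, its estimate is appended to the state and
# the covariance matrix is expanded with the corresponding blocks.
# Function and variable names are assumptions for illustration.
import numpy as np

def augment_state(x, P, z, R, g, Gx, Gz):
    """
    x, P : current state (vehicle pose + landmarks) and covariance
    z, R : new observation (e.g. range/bearing) and its noise covariance
    g    : function mapping (vehicle pose, z) -> landmark position (2-D)
    Gx   : Jacobian of g w.r.t. the vehicle pose (first 3 states), 2x3
    Gz   : Jacobian of g w.r.t. the observation, 2x2
    """
    m = g(x[:3], z)                        # initial landmark estimate
    x_aug = np.concatenate([x, m])

    n = len(x)
    Pxx = P[:3, :3]                        # vehicle-pose covariance block
    Pmm = Gx @ Pxx @ Gx.T + Gz @ R @ Gz.T  # new landmark covariance
    Pxm = P[:, :3] @ Gx.T                  # cross-covariance with existing state

    P_aug = np.zeros((n + 2, n + 2))
    P_aug[:n, :n] = P
    P_aug[:n, n:] = Pxm
    P_aug[n:, :n] = Pxm.T
    P_aug[n:, n:] = Pmm
    return x_aug, P_aug
```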
We have developed techniques for Simultaneous Localization and Map Building based on the augmented-state Kalman filter, and demonstrated this in real time using laboratory robots. Here we report the results of experiments conducted outdoors in an unstructured, unknown, representative environment, using a van equipped with a laser range finder for sensing the external environment, and GPS to provide an estimate of ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location. In this paper we describe the system architecture, the nature of the experimental set-up, and the results obtained. These are compared with the estimated ground truth. We show that SLAM is both feasible and useful in real environments. In particular, we explore its repeatability and accuracy, and discuss some practical implementation issues. Finally, we look at the way forward for a real implementation on ground and air vehicles operating in very demanding, harsh environments.