The Center for Self-Organizing and Intelligent Systems at Utah State University has been developing laser-based obstacle detection and avoidance for the past four years. JPL initially encouraged us down this path to provide an optional software upgrade for its Rocky Rover. The USU system was based on a unique combination of fuzzy logic and a decision-tree behavioral technique, and this system gave the Rocky Rover the option of venturing independently beyond the observation range of the base station. Shortcomings in the data collection prompted a redesign that included more sophisticated line-tracing and filtering methods to locate and track the actual laser lines. Signal-to-noise ratio problems, combined with platform changes, led to the choice of a commercial range finder. The design challenge then became one of system integration, including the meeting of real-time constraints on the current processor platform, and obstacle characterization and avoidance.
Recent advances in reconfigurable logic have enabled physical hardware to be reused for different application and sensor types. This hardware reuse provides a significant cost reduction, since a single hardware design can be adapted to many other applications with a simple change in software (FPGA configuration information). Present navigation systems consist of a central processor providing the algorithmic computations and application-specific integrated circuits (ASICs) converting the sensor data to a usable computational format. Using reconfigurable logic allows the signal processing for each sensor to be performed in parallel, and the design can be revised as needed without fabricating a new ASIC. Development of our system presently uses only accelerometer data in order to study reconfiguration aspects in detail without adding complications due to the presence of other types of sensors. For an accelerometer-only navigation system, the navigation is most easily implemented in two dimensions. Designs for an interferometric and an intensity-based MEMS (microelectromechanical systems) accelerometer are described and compared to a capacitance-based MEMS accelerometer implementation. Plans for final implementation on a monolithic ceramic substrate, which will incorporate both optical or capacitance-based sensors and reconfigurable hardware as a complete system, are also described. Keywords: Reconfigurable Processing, Sensor Signal Processing, Navigation, Accelerometer
In recent years the demand has surged for low-power and small form-factor wireless communications devices. Coupled with the migration of desktop computing to mobile computing, a new market is emerging for portable products that combine wireless communications and high-performance computing. The evolution of semiconductor device technology toward the deep submicron regime is enabling the development of CMOS RF communications circuits which are amenable to monolithic integration with existing mixed-signal processes. A 1.2-micron bulk CMOS process has been used to develop a monolithic architecture consisting of an 8-bit RISC microprocessor, a 256-byte SRAM memory, power management, and a 400-MHz RF transceiver. The logic portion of the IC operates at 50 MHz, and the RF transceiver achieves a data transmission rate of 16 kb/sec with a 10 kHz bandwidth. On-chip power regulation minimizes supply glitches due to logic switching, and power management is provided to minimize the standby power dissipation of idle components. This IC demonstrates the potential of CMOS to deliver a low-power architecture for computing and wireless communication.
Commercially available digital signal processors (DSPs) can be used to host state-of-the-art air acoustic adaptive beamforming algorithms in low-power, low-cost, real-time sensor systems. These systems are suitable for use both as unattended ground sensors and in vehicle-mounted microphone array applications. This paper describes a compact, state-of-the-art, real-time adaptive beamforming approach and sensor hardware for vehicle-mounted array operation. Recent field test results are presented for detection, tracking, and classification with the vehicle engine idling as well as with the vehicle 'on the move.' Tracking accuracy results are also presented. The system tested used an eight-microphone array on the SARGE with two additional microphones located near the engine and the exhaust for additional adaptive noise cancellation.
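The abstract does not reproduce the adaptive beamformer itself, but the basic operation it builds on can be sketched as a simple delay-and-sum beamformer: microphone signals are time-aligned toward a look direction and averaged, reinforcing the target and attenuating off-axis noise. The integer-sample delay model and the function signature below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def delay_and_sum(signals, delays, fs):
    """Align microphone signals by their steering delays and average them.

    signals: array of shape (n_mics, n_samples)
    delays:  per-microphone delay (seconds) toward the look direction
    fs:      sample rate (Hz); delays are rounded to whole samples here
    """
    n_mics, _ = signals.shape
    out = np.zeros(signals.shape[1])
    for m in range(n_mics):
        shift = int(round(delays[m] * fs))
        out += np.roll(signals[m], -shift)   # advance each channel into alignment
    return out / n_mics
```

An adaptive beamformer would replace the fixed averaging weights with weights updated from the data (e.g. to null the engine and exhaust noise picked up by the reference microphones).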
This paper is the result of Cybernet efforts, begun in May 1995 and currently ongoing, to develop a system for capturing high-resolution 3D terrain and cultural features from multiple position-referenced video sequences. The notion is to test and implement a perception-based image rendering technique for terrain capture of real sites applicable to robotics, virtual reality, and simulation applications. These techniques will enable the Army to better understand and evaluate the operational capabilities of vision-based task performance. The development effort has focused on the development and evaluation of image-based object construction and rendering algorithms, which are summarized below.
Proc. SPIE 3366, New technologies to support NASA's Mission to Planet Earth satellite remote sensing product validation: use of an unmanned autopiloted vehicle (UAV) as a platform to conduct remote sensing, 0000 (12 August 1998); https://doi.org/10.1117/12.317561
As part of the US Global Change program, NASA has initiated its Mission to Planet Earth (MTPE) Program, which requires continuous global satellite measurements over an extended 15-year period. Various US and international Earth-observing satellites will be launched during this period. To ensure continuity of the measurements, a significant instrument calibration and product validation effort is required and is planned as part of this program. However, the validation of satellite products requires extensive ground truthing, which is both costly and time consuming and in many cases limited to specific calibration/validation areas. Thus there is a need to extend this validation effort to include more participants and to provide new, more cost-effective technologies to support it. The use of unmanned autopiloted vehicles (UAVs) and new miniature high-performance instruments has been identified as providing this needed additional capability. This paper discusses the development of a UAV, associated avionics, and preliminary remote sensing instruments to support the extension of ground truthing and product validation of NASA's MTPE Program, specifically the Earth Observing System. The UAV described is based on thrust-vectoring capabilities and a single-axis pivoted wing, or 'freewing,' design. This unique UAV system is illustrated along with the proposed autonomous avionics and preliminary remote sensing payloads.
This paper reviews ongoing collaborative efforts between the Colorado School of Mines and Clark-Atlanta University in cooperative assistance for coordination and control of multiple vehicles. It reports on progress in developing an intelligent assistance agent (IAA) for aiding a human operator in diagnosing problems and generating recovery strategies in remote ground robots. The current work has focused on the identification and incorporation of categories of additional information from multiple robots and other agents. These categories are: mission-related sources, such as peer robots working nearby; facility-related sources, such as security cameras; and opportunistically available agents, such as overhead satellites or humans working in the area as part of another mission. The incorporation of additional sources of information requires enhancements to the previously developed teleVIA architecture. In particular, the teleVIA IAA must provide more strategic management support, more sophisticated viewpoint and data management and presentation, and simplified control of the additional agents for diagnosis and recovery activities. These enhancements are encapsulated in software agents within the IAA.
This paper presents a method for remotely controlling articulated equipment such as robot arms through a 'manipulation' of their virtual representation by hand-held actuators. The objective is to take advantage of the superior hand/eye coordination of humans together with the flexibility of scale and imagery of virtual visualization. The main difficulty lies in effectively representing control limits to the operator. This is resolved through animated representations in the virtual space. The proposed technique has applications for high-resolution equipment control in hostile environments and the control of very large or very small scale equipment.
The Center for Intelligent Systems has developed a small robotic vehicle named the Advanced Rover Chassis 3 (ARC 3) with six identical intelligent wheel units attached to a payload via a passive linkage suspension system. All wheels are steerable, so the ARC 3 can move in any direction while rotating at any rate allowed by the terrain and motors. Each intelligent wheel unit contains a drive motor, steering motor, batteries, and computer. All wheel units are identical, so manufacturing, programming, and spare replacement are greatly simplified. The intelligent wheel concept would allow the number and placement of wheels on the vehicle to be changed with no changes to the control system, except to list the position of all the wheels relative to the vehicle center. The task of controlling the ARC 3 is distributed between one master computer and the wheel computers. Tasks such as controlling the steering motors and calculating the speed of each wheel relative to the vehicle speed in a corner depend on the location of a wheel relative to the vehicle center and are processed by the wheel computers. Conflicts between the wheels are eliminated by computing the vehicle velocity control in the master computer. Various approaches to this distributed control problem, and various low-level control methods, have been explored.
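The per-wheel computation that the abstract assigns to the wheel computers can be sketched as follows: each wheel derives its own speed and steering angle from the master's commanded vehicle velocity and its known offset from the vehicle center. The twist representation and function signature are illustrative assumptions, not the ARC 3's actual interface.

```python
import math

def wheel_command(vx, vy, omega, wheel_x, wheel_y):
    """Compute (speed, steer_angle) for one intelligent wheel unit.

    vx, vy           : commanded vehicle velocity (m/s), body frame
    omega            : commanded yaw rate (rad/s)
    wheel_x, wheel_y : wheel position relative to the vehicle center (m)
    """
    # Velocity of the wheel contact point = vehicle velocity + omega x r
    wx = vx - omega * wheel_y
    wy = vy + omega * wheel_x
    speed = math.hypot(wx, wy)
    steer = math.atan2(wy, wx) if speed > 1e-9 else 0.0
    return speed, steer
```

Because each wheel needs only its own (wheel_x, wheel_y) offset, adding or repositioning wheels changes only that table, matching the abstract's claim that the control system is otherwise unchanged.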
For planetary surface operations, the European Space Agency initiated the development of teleoperated mini-rovers. Remote control functions related to autonomous reaction capabilities and sensor data processing on board the vehicle exhibit interesting transfer potential for industrial and educational teleoperation tasks. Requirements similar to those of the space application arise in particular when low-cost communication links are used for teleservicing. This paper reviews the operational concept for the Mars rover and its operations test environment. The technology transfer to terrestrial teleservicing applications is analyzed with regard to remotely controlled equipment and robots. This is illustrated with the examples of pipe inspection robots, industrial transport robots, and virtual laboratories for educational purposes.
A systematic approach to ground vehicle automation is presented, combining low-level controls, trajectory generation, and closed-loop path correction in an integrated system. Development of cooperative robotics for precision agriculture at Utah State University required the automation of a full-scale motorized vehicle. The Triton Predator 8-wheeled skid-steering all-terrain vehicle was selected for the project based on its ability to maneuver precisely and the simplicity of controlling the hydrostatic drivetrain. Low-level control was achieved by fitting an actuator on the engine throttle, actuators for the left and right drive controls, encoders on the left and right drive shafts to measure wheel speeds, and a signal pick-off on the alternator for measuring engine speed. Closed-loop control maintains a desired engine speed and tracks left and right wheel speed commands. A trajectory generator produces the wheel speed commands needed to steer the vehicle through a predetermined set of map coordinates. A planar trajectory through the points is computed by fitting a 2D cubic spline over each path segment while enforcing initial and final orientation constraints at segment endpoints. Acceleration and velocity profiles are computed for each trajectory segment, with the velocity over each segment dependent on turning radius. Left and right wheel speed setpoints are obtained by combining velocity and path curvature at each low-level timestep. The path correction algorithm uses GPS position and compass orientation information to adjust the wheel speed setpoints according to the 'crosstrack,' 'downtrack,' and heading errors. Nonlinear models of the engine and the skid-steering vehicle/ground interaction were developed for testing the integrated system in simulation. These tests led to several key design improvements which assisted final implementation on the vehicle.
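The trajectory-to-setpoint chain described above can be sketched in two pieces: a cubic spline segment with endpoint tangents (one way to enforce the orientation constraints) and the conversion of speed and curvature into left/right wheel setpoints. The Hermite form and the effective track width parameter are assumptions; the abstract does not give the exact formulation.

```python
import numpy as np

def hermite(p0, p1, m0, m1, s):
    """Point on a cubic Hermite segment at parameter s in [0, 1].

    p0, p1: segment endpoints; m0, m1: endpoint tangent vectors,
    which is how initial/final orientation constraints are imposed.
    """
    h00 = 2*s**3 - 3*s**2 + 1
    h10 = s**3 - 2*s**2 + s
    h01 = -2*s**3 + 3*s**2
    h11 = s**3 - s**2
    return h00*p0 + h10*m0 + h01*p1 + h11*m1

def wheel_speeds(v, kappa, track):
    """Left/right wheel speed setpoints for a skid-steer vehicle.

    v: forward speed along the path; kappa: path curvature at this
    timestep; track: effective track width (assumed parameter).
    """
    omega = v * kappa                       # yaw rate implied by curvature
    return v - omega*track/2, v + omega*track/2
```

At each low-level timestep the generator would evaluate the spline's curvature and the profiled speed, then hand the resulting pair of setpoints to the closed-loop wheel speed controllers.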
The Redrover project implements a control system that runs over the Internet. The goal of the project is to provide an inexpensive educational tool that introduces elementary and middle school aged students to robotic vehicles, the Internet, concepts of remote control, and planetary exploration. The control loop consists of a server and client(s), which communicate locally or through the Internet. The server receives move commands from the client and sends back images and sensor data. The server part of the program runs on a computer that is connected to a rover built using Lego blocks. The control interface between the computer and the rover turns the drive motors on and off and collects data from the sensors. The client user interface was first based on HTML pages; it was subsequently rewritten to increase the speed of the application and create a more consistent user interface. Advantages of using the Internet for remote control are cheap implementation and universal availability. The disadvantage is the generally low bandwidth of the Internet, resulting in low data transfer rates and uneven response times.
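The server side of the command/response loop described above can be sketched as a single handler: parse one move command, drive the motors, and build the reply carrying sensor data. The JSON wire format, the command vocabulary, and the sensor stub are all illustrative assumptions; the abstract does not specify the Redrover protocol.

```python
import json

def read_sensors():
    """Hypothetical stub; the real server reads the Lego control interface."""
    return {"light": 0, "touch": False}

def handle_command(line):
    """Parse one JSON move command and build the server's JSON reply.

    Example command (assumed format): {"move": "forward", "ms": 500}
    """
    cmd = json.loads(line)
    if cmd.get("move") not in ("forward", "back", "left", "right", "stop"):
        return json.dumps({"error": "unknown command"})
    # Here the real server would switch the drive motors on for cmd["ms"]
    # milliseconds, then capture a camera image to return with the reply.
    return json.dumps({"ok": True, "sensors": read_sensors()})
```

Wrapping this handler in a socket or HTTP loop gives the local-or-Internet operation the abstract describes; the uneven response times it mentions come from that transport layer, not the handler.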
Antipersonnel mines kill or mutilate tens of people every day. Military mine clearance methods make use of heavy vehicles and cannot achieve a satisfactory destruction percentage. Consequently, humanitarian deminers use classical manual methods. This work is tedious, dangerous, and costly. Detection is not always reliable and is very slow. Improvements can be made by developing new sensors and by automating the detection sequence. The Robotics workgroup of the Belgian Hudem '97 project focuses on the development of low-cost vehicles that could be equipped with mine detectors. In this paper, we first review the main characteristics of such platforms. We then show that the architecture of the vehicle is also determined by the chosen detection strategy. Afterwards, we describe in detail the original wheeled vehicle we are building. Its main characteristic is the very high mobility given by its three drive/steer wheels. Low-level control is performed by a microcontroller, while paths are generated by the remote computer. Preliminary indoor tests have demonstrated the potential of this vehicle.
Regulatory agencies are imposing limits and constraints to protect the operator and/or the environment. While generally necessary, these controls also tend to increase cost and decrease efficiency and productivity. Intelligent computer systems can be made to perform these hazardous tasks with greater efficiency and precision without danger to the operators. The Idaho National Engineering and Environmental Laboratory and the Center for Self-Organizing and Intelligent Systems at Utah State University have developed a series of autonomous all-terrain multi-agent systems capable of performing automated tasks within hazardous environments. This paper discusses the development and application of cooperative small-scale and large-scale robots for use in various activities associated with radiologically contaminated areas, prescription farming, and unexploded ordnance.
In the distributed computing domain a framework is defined as a software environment that simplifies application development and management. Frameworks are categorized into three distinct classes: infrastructure, integration, and enterprise. Using an enterprise-wide framework approach, weapon system development and maintenance can be simplified, leading to significant advantages such as extensibility and interoperability. The weapon system technical architecture working group has developed such a framework to infuse interoperability and extensibility among all weapon systems within the Army. Application of the concepts of generic open architecture within this framework leads to extensibility. This paper reports on the initial stage of an effort to develop a framework for weapon systems.
The suitability of the reference map is very important in scene-matching-based navigation of unmanned vehicles. Due to the sensitivity of scene imaging, only some special zones of an unmanned vehicle's planned area are suitable for scene-matching-based navigation. In this paper, the basic idea of scene-matching-based navigation is first described. Second, the correct matching probability at each position is defined and estimated from statistical parameters of a window image the size of the sensed image. To decrease the computational cost of the parameter calculation, fast algorithms are presented. A description of the navigation suitability of the reference map, called the suitability probability, is then proposed. Experimental results on satellite images demonstrate the feasibility of our approach.
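The abstract does not define its statistical parameters, but a standard ingredient of scene matching is the normalized cross-correlation between a sensed-image-sized window and a reference patch; positions whose windows correlate sharply with only one reference location are good matching zones. The sketch below shows only that NCC ingredient, as an assumption about the general approach rather than the paper's actual estimator.

```python
import numpy as np

def ncc(window, template):
    """Normalized cross-correlation of two equal-size image patches.

    Returns a value in [-1, 1]; values near 1 indicate a strong match
    that is invariant to brightness offset and contrast scaling.
    """
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0
```

A suitability measure in this spirit would evaluate, for each candidate position, how distinct the NCC peak at the true location is from the scores at all other locations in the reference map.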
Utilizing multiple cooperating autonomous vehicles to perform tasks enhances robustness and efficiency over the use of a single vehicle. Furthermore, because autonomous vehicles can be controlled precisely and their status known accurately in real time, new types of cooperative behaviors are possible. This paper presents a working system called MEPS that plans and executes missions for multiple autonomous vehicles in large structured environments. Two generic spatial tasks are supported: sweeping an area, and visiting a location while activating on-board equipment. Tasks can be entered initially by the user and dynamically during mission execution by both users and vehicles. Sensor data and task achievement data are shared among the vehicles, enabling them to cooperatively adapt to changing environmental, vehicle, and task conditions. The system has been successfully applied to control ATV and micro-robotic vehicles in precision agriculture and waste-site characterization environments.
This paper presents a method of map building using an AUV with obstacle avoidance ability. First, the method by which free space is distinguished from obstacle space using the sonar detection distance is discussed in detail, after the avoidance strategies of the AUV are presented. A grid is adopted to represent the detected area according to the detection results. Image-processing techniques are then applied to eliminate noise and extract features from the mapping results. Finally, the shape of the obstacle is identified, and detailed simulation results are given.
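A common form of the grid update described above marks cells along a sonar beam as free space up to the measured range, with the cell at the return marked as obstacle space. The single-beam ray-trace below is a sketch under that assumption; the paper's exact free-space/obstacle-space rule is not given in the abstract.

```python
import math

def update_grid(grid, x0, y0, heading, rng, cell=1.0):
    """Mark cells along one sonar beam as free, and the end cell as occupied.

    grid: list of rows; 0 = unknown, 1 = free space, 2 = obstacle.
    (x0, y0): sensor cell; heading in radians; rng: detected range.
    """
    rows, cols = len(grid), len(grid[0])
    steps = int(rng / cell)
    for k in range(steps):
        cx = int(round(x0 + k * cell * math.cos(heading)))
        cy = int(round(y0 + k * cell * math.sin(heading)))
        if 0 <= cx < rows and 0 <= cy < cols:
            grid[cx][cy] = 1          # free space up to the sonar return
    ex = int(round(x0 + rng * math.cos(heading)))
    ey = int(round(y0 + rng * math.sin(heading)))
    if 0 <= ex < rows and 0 <= ey < cols:
        grid[ex][ey] = 2              # obstacle cell at the detected range
    return grid
```

The noise elimination and feature extraction the abstract mentions would then operate on this grid as if it were a binary image (e.g. morphological filtering followed by contour extraction).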
The local path planning of mobile robots can be regarded as finding a mapping from perception space to action space. In this paper, a genetic algorithm is used to search for an optimal mapping so as to improve the obstacle avoidance ability of the robot. The rotational angle and translation distance of the robot are divided into seven and four grades, respectively. In addition, the length of the path that the robot covers before colliding with an obstacle is taken as the fitness. The robot can learn to carry out local path planning through selection, crossover, and mutation in the genetic algorithm. Simulation results are given at the end of this paper.
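The evolutionary loop described above can be sketched as a GA over a lookup table that assigns one (angle grade, distance grade) action to each perception state. The number of perception states, the population parameters, and the elitist selection scheme are assumptions; only the 7/4 action grades and the fitness idea come from the abstract.

```python
import random

N_STATES, N_ANGLES, N_DISTS = 16, 7, 4   # 7 angle grades, 4 distance grades

def random_policy():
    # One (angle, distance) action per perception state.
    return [(random.randrange(N_ANGLES), random.randrange(N_DISTS))
            for _ in range(N_STATES)]

def crossover(a, b):
    cut = random.randrange(1, N_STATES)
    return a[:cut] + b[cut:]

def mutate(p, rate=0.05):
    return [(random.randrange(N_ANGLES), random.randrange(N_DISTS))
            if random.random() < rate else g for g in p]

def evolve(fitness, pop_size=20, generations=30):
    """fitness(policy) -> score; in the paper this would be the path
    length the robot covers in simulation before hitting an obstacle."""
    pop = [random_policy() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[:pop_size // 2]               # selection
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)
```

In use, `fitness` would run each candidate policy in the obstacle simulator and return the distance traveled before collision, so longer safe paths dominate the population over generations.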
Understanding the surrounding scene and identifying man-made structures is an important task in autonomous vehicle navigation in outdoor environments. The ability to generalize the task of landmark recognition to a variety of landmarks belonging to a particular class is a difficult problem. This paper describes a system to perform landmark recognition. The visual cues used in this system include texture, edges, functional form, and image-based search methods. This paper represents follow-on work to that presented last year at SPIE.
Planning for future military capabilities assumes the availability of highly mobile, agile, rapidly deployable forces. Incorporation of robotic systems into future force structure to reduce casualties, increase tactical reach, counter battle fatigue, and reduce logistics burdens for rapid reaction forces offers one potential pathway for achieving this goal. The Demo III Unmanned Ground Vehicle Program is designed to advance and demonstrate the technology required to develop future unmanned ground combat vehicles through three major thrusts: (1) concerted technology development; (2) modeling, simulation and experimentation; and (3) technology integration and evaluation with users. Demo III focuses on demonstration of technology that will enable the development of small, highly agile, unmanned vehicles capable of off-road, semi-autonomous operation at speeds of up to 32 km/hr during daylight and 16 km/hr at night by summer 2001.
This paper outlines the goals and work accomplished thus far for both the man-machine interface and mission planning elements of the experimental unmanned vehicle (XUV) program. It is the goal of the XUV program to make available to the user an interface and tools that will allow for seamless transition between mission planning, rehearsal, and execution on multiple collaborating autonomous vehicles in a platoon group.
The goal of the Demo III Program for Experimental Unmanned Vehicles (XUVs) is to develop and integrate technologies to provide a vehicle platform with autonomous capabilities. The platform will allow for the integration of modular mission packages, enabling it to serve multiple needs across the battlefield. For demonstration purposes, the primary mission package will perform the functions of Reconnaissance, Surveillance, and Target Acquisition (RSTA). The RSTA mission package will provide the capability to conduct RSTA functions both while stationary and on the move. It will include a variety of sensor technologies along with signal and image processing capabilities to perform the RSTA mission. The paper describes goals for the Demo III RSTA mission package, discusses the types of sensors being considered for platform integration, and summarizes the RSTA-related aspects of the recently awarded integration contract. Processing and algorithm capabilities required for the XUV to perform RSTA in an autonomous fashion, such as aided target recognition, motion detection, and motion detection on the move, are also discussed.
The U.S. Army Tank-automotive and Armaments Command Research, Development and Engineering Center (TARDEC) has initiated a systems engineering program to develop mutually compatible and complementary components and subsystems for enhanced unmanned ground vehicle (UGV) mobility. UGVs have historically lacked the mobility of manned tracked vehicles, especially in obstacle crossing and off-road maneuver. To achieve comparable mobility, UGVs require supervised navigation with enhanced locomotion subsystems and complementary, mutually adapted machine perception, routing, and driving control. The TARDEC program is funding technology development, maturation, and packaging to produce Non-Developmental Items (NDI) that can be scaled and configured for different UGVs. TARDEC is also developing a Systems Integration Laboratory (SIL) to evaluate compatibility and system-level performance of component technologies. This TARDEC program complements and extends the UGV Technology Exchange and Exploitation (UGVTEE) Demo III program. It fills unfunded gaps in the Army Research Laboratory's concerted technology thrust projects in machine perception and supervised navigation. It fills technology gaps in the integration of "smart" mobility subsystems with supervised navigation. It specifically addresses semi-autonomous navigation deficiencies in obstacle crossing and other wheeled UGV mobility issues not being addressed in UGVTEE or other UGV programs. As part of the systems engineering activity to ensure that the NDI components will be scalable across a wide range of UGVs, TARDEC is developing modular concepts for UGVs across a range of weight classes and functions. Keywords: Supervised navigation, obstacle crossing, non-developmental items, systems integration laboratory, unmanned ground vehicles, locomotion
The coordination of teams of ground vehicles is a critical problem for many military operations. When moving in teams through enemy territory, one of the more important operations is the bounding overwatch maneuver, in which a team of vehicles moves one at a time, with the lead vehicle(s) moving forward under the protection of the trailing vehicle(s). The idea is that the trailing (stationary) vehicle(s) is in a better position to watch for danger and to engage the enemy than the moving vehicle(s). Currently, routes for such maneuvers are computed manually in the field. In this paper we present a novel approach for coordinating multi-vehicle teams of robotic or semi-robotic ground vehicles performing bounding overwatch operations. Our approach generates accurate and efficient routes and takes into account factors such as terrain features, threat exposure, and vehicle and mission parameters. Furthermore, routes generated by our approach can be replanned "on the fly" as new environmental or threat information becomes available. Keywords: Autonomous Vehicles, Bounding Overwatch, Ground Vehicle Coordination, Path Planning, Robotics, Route Planning, Team Planning Operations
A test course is being implemented at the U.S. Army's Aberdeen Proving Ground (APG), MD, to evaluate the performance of military unmanned ground vehicles (UGVs) in traversing off-road terrain. The course is a subset of existing APG automotive test facilities specifically selected to provide challenges to the perception and navigation subsystems as well as to vehicular mobility. Portions of the course currently defined are described in detail, while the overall vision is described in general. Issues addressed in the implementation of this course are discussed, including ground truth and safety. An ideal field test facility is described, and trade-offs in the APG implementation are explained. An actual test using the course to characterize an obstacle detection subsystem is also described. Keywords: Unmanned ground vehicle, vehicle testing, autonomous mobility
In this paper we describe a recently developed algorithm for computing least-cost paths under turn-angle constraints. If a graph representation of a two- or three-dimensional routing problem contains |V| vertices and |E| edges, then the new algorithm scales as O(|E| log |V|). This result is substantially better than O(|E||V|) algorithms for the more general problem of routing with turn penalties, which cannot be applied to large-scale graphs. We also describe an enhancement to the new algorithm that dramatically improves its performance in practice. We provide empirical results showing that the new algorithm can substantially reduce the computation time required for constrained vehicle routing. This performance is sufficient to allow for the dynamic re-routing of vehicles in uncertain or changing environments. Keywords: Dijkstra's algorithm, least cost paths, range searching, routing, turn constraints.
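One standard way to realize turn-constrained least-cost search, consistent with the complexity quoted above, is to run Dijkstra's algorithm over directed edges rather than vertices, so the turn angle onto each successor edge can be checked before expansion. This sketch is that textbook construction, not a reproduction of the paper's algorithm or its range-searching enhancement; the graph encoding is an assumption.

```python
import heapq, math

def least_cost_path(nodes, edges, start, goal, max_turn_deg=45.0):
    """Least-cost path length from start to goal under a turn-angle limit.

    nodes: {id: (x, y)}; edges: {u: [(v, cost), ...]} (directed).
    A search state is the directed edge just traversed, so each
    expansion can test the turn angle onto the next edge.
    """
    def bearing(u, v):
        (x0, y0), (x1, y1) = nodes[u], nodes[v]
        return math.atan2(y1 - y0, x1 - x0)

    def turn_ok(u, v, w):
        d = abs(bearing(v, w) - bearing(u, v))
        d = min(d, 2 * math.pi - d)            # wrap to [0, pi]
        return math.degrees(d) <= max_turn_deg

    # First hop from the start has no previous edge, so no turn check.
    pq = [(c, start, v) for v, c in edges.get(start, [])]
    heapq.heapify(pq)
    best = {}
    while pq:
        cost, u, v = heapq.heappop(pq)
        if v == goal:
            return cost
        if best.get((u, v), float("inf")) < cost:
            continue                           # stale queue entry
        best[(u, v)] = cost
        for w, c in edges.get(v, []):
            if turn_ok(u, v, w) and cost + c < best.get((v, w), float("inf")):
                heapq.heappush(pq, (cost + c, v, w))
    return None                                # goal unreachable under constraint
```

Because the state space is the edge set, the priority-queue analysis gives the O(|E| log |V|)-style bound the abstract cites, rather than the O(|E||V|) cost of general turn-penalty formulations.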
This paper discusses research conducted at the Naval Research Laboratory in the areas of automated routing, advanced 3D displays, and novel interface techniques for interacting with those displays. This research has culminated in the development of the strike optimized mission planning module (STOMPM). The STOMPM testbed incorporates new technologies and results in the aforementioned areas to address deficiencies in current systems and advance the state of the art in military planning systems.
In this paper we describe the GROTTO visualization projects being carried out at the Naval Research Laboratory. GROTTO is a CAVE-like system, that is, a surround-screen, surround-sound, immersive virtual reality device. We have explored GROTTO visualization in a variety of scientific areas including oceanography, meteorology, chemistry, biochemistry, computational fluid dynamics, and space sciences. Research has emphasized applications of GROTTO visualization for military, land- and sea-based command and control. Examples include the visualization of ocean current models for the simulation and study of mine drifting and, within our computational steering project, the effects of electromagnetic radiation on missile defense satellites. We discuss plans to apply this technology to decision support applications involving the deployment of autonomous vehicles into contaminated battlefield environments, firefighter control, and hostage rescue operations.