Trends in combat technology research point to an increasing role for uninhabited vehicles in modern warfare tactics. To support an increased span of control over these vehicles, human responsibilities need to be transformed from tedious, error-prone, and cognition-intensive operations into tasks that are more supervisory and manageable, even under intensely stressful conditions. The goal is to move beyond supporting only human command of low-level system functions toward intention-level human-system dialogue about the operator's tasks and situation.
A critical element of this process is developing the means to identify when human operators need automated assistance and what assistance they need. Toward this goal, we are developing an unmanned vehicle operator task recognition system that combines work in human behavior modeling and Bayesian plan recognition. Traditionally, human behavior models have been considered generative, meaning they describe all possible valid behaviors. Basing behavior recognition on models designed for behavior generation can offer advantages in improved model fidelity and reuse. It is not clear, however, how to reconcile the structural differences between behavior recognition and behavior modeling approaches.
Our current work demonstrates that by pairing a human behavior modeling approach derived from cognitive psychology, GOMS, with a Bayesian plan recognition engine, ASPRN, we can translate a behavior generation model into a recognition model. We will discuss the implications of using human performance models in this manner, as well as suggest how this kind of modeling may be used to support the real-time control of multiple uninhabited battlefield vehicles and other semi-autonomous systems.
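The core idea above, inverting a generative task model into a recognition model, can be illustrated with a toy sketch. Everything below is hypothetical and not taken from the GOMS/ASPRN implementation: each candidate goal lists the operator actions its GOMS-style methods would tend to generate, and Bayes' rule turns observed actions into a posterior over goals.

```python
# Illustrative sketch only; the goal names, action names, and probabilities
# are invented stand-ins for a compiled GOMS-style generative model.
GOAL_MODEL = {
    "reroute_uav":    {"open_map": 0.9, "select_waypoint": 0.8, "confirm": 0.6},
    "inspect_target": {"open_camera": 0.9, "zoom": 0.7, "confirm": 0.5},
}
PRIORS = {"reroute_uav": 0.5, "inspect_target": 0.5}
BASE_RATE = 0.05  # likelihood of an action unrelated to the current goal

def recognize(observed_actions):
    """Return a posterior over goals given observed operator actions."""
    posterior = dict(PRIORS)
    for action in observed_actions:
        # Bayesian update: scale each goal's belief by the action likelihood.
        for goal in posterior:
            posterior[goal] *= GOAL_MODEL[goal].get(action, BASE_RATE)
        total = sum(posterior.values())
        posterior = {g: p / total for g, p in posterior.items()}
    return posterior

beliefs = recognize(["open_map", "select_waypoint"])
```

After observing two map-related actions, the posterior concentrates on the rerouting goal; the same mechanism scales to deeper goal hierarchies by treating each method's operators as the action vocabulary.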
This paper describes a system for implementing adjustable autonomy levels in simulated unmanned vehicles using an approach based upon the fields of deontics and Joint Intention Theory (JIT). It discusses Soar Technology's Intelligent Control Framework research project (ICF), the authors' use of deontics in the creation of adjustable autonomy for ICF, and some possible future directions in which the research could be expanded. Use of deontics and JIT in ICF has allowed us to define system-wide formal limits on the behavior of the unmanned systems controlled by ICF, to increase the flexibility of our adjustable autonomy system, and to decrease the granularity of the autonomy adjustments. This set of formalisms allows the unmanned system maximal autonomy in the default case, while allowing the user and supervisory agents to constrain that autonomy when necessary. Unlike more strictly layered adjustable autonomy formalisms, our adjustable autonomy formalism can be used to restrict subsets of autonomous behaviors, rather than entire systems, in response to situational requirements.
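The default-permission scheme described above can be sketched in a few lines. This is an illustrative toy, not the ICF formalism: every behavior is permitted by default, and an operator or supervisory agent adds deontic-style prohibitions on individual behaviors rather than disabling the whole system.

```python
# Illustrative sketch only; class and behavior names are hypothetical.
class DeonticPolicy:
    """Permit every behavior by default; track explicit prohibitions."""

    def __init__(self):
        self.forbidden = set()

    def forbid(self, behavior):
        # An operator or supervisory agent constrains one behavior subset.
        self.forbidden.add(behavior)

    def permit(self, behavior):
        # Lifting the constraint restores the default (maximal) autonomy.
        self.forbidden.discard(behavior)

    def is_permitted(self, behavior):
        return behavior not in self.forbidden

policy = DeonticPolicy()
policy.forbid("weapons_release")   # restrict one behavior...
policy.is_permitted("navigate")    # ...while others stay fully autonomous
```

The point of the design is the asymmetry: prohibitions are the exception and must be stated, so fine-grained constraints never cascade into a system-wide loss of autonomy.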
Trends in combat technology research point to an increasing role for uninhabited vehicles and other robotic elements in modern warfare tactics. However, real-time control of multiple uninhabited battlefield robots and other semi-autonomous systems, across diverse fields of operation, poses a difficult problem for modern warfighters, one that has been identified but not adequately addressed.
Soar Technology is applying software agent technology to simplify demands on the human operator. Our goal is to build intelligent systems capable of finding the best balance of control between the human and autonomous system capabilities. We are developing an Intelligent Control Framework (ICF) from which to create agent-based systems that are able to dynamically delegate responsibilities across multiple robotic assets and the human operator. This paper describes proposed changes to our ICF architecture based on principles of human-machine teamwork derived from collaborative discourse theory. We outline the principles and the new architecture, and give examples of the benefits that can be realized from our approach.
Currently, multiple humans are needed to operate a single uninhabited aerial vehicle (UAV). In the near future, combat techniques will involve single operators controlling multiple uninhabited ground and air vehicles. This situation creates both technological hurdles and interaction design challenges that must be addressed to support future warfighters. In particular, the system will need to negotiate with the operator about proper task delegation, keeping the operator appropriately apprised of autonomous actions. This in turn implies that the system must know what the user is doing, what needs to be done in the present situation, and the comparative strengths of the human and the system in each task. Toward building such systems, we are working on an Intelligent Control Framework (ICF) that provides a layer of intelligence to support future warfighters in complex task environments. The present paper presents the Adjustable Autonomy Module (AAM) in ICF. The AAM encapsulates capabilities for user plan recognition, situation reasoning, and authority delegation control. The AAM has the knowledge necessary to support operator-system dialogue about autonomy changes, and it also provides the system with the ability to act on this knowledge. Combined with careful interaction design and planning and plan-execution capabilities, the AAM enables future design and development of effective human-robot teams.
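A minimal sketch can show how the three AAM ingredients named above might combine: a recognized user task, a situational picture, and comparative-strength knowledge feeding a delegation decision. The task names and scores below are invented for illustration and do not come from the AAM itself.

```python
# Illustrative sketch only; all names and numbers are hypothetical.
STRENGTHS = {
    # Per-task comparative strength scores for human vs. system.
    "route_planning": {"human": 0.4, "system": 0.9},
    "target_id":      {"human": 0.9, "system": 0.5},
}

def delegate(task, operator_busy_with):
    """Assign authority for a task, given the user's recognized current task."""
    if task == operator_busy_with:
        # Never seize authority over what the operator is actively doing;
        # negotiate via dialogue instead (not modeled in this sketch).
        return "human"
    scores = STRENGTHS[task]
    return "human" if scores["human"] >= scores["system"] else "system"

# While the operator identifies a target, routine route planning is delegated
# to the system, keeping the operator's span of control manageable.
assignment = delegate("route_planning", operator_busy_with="target_id")
```

In a fuller design the `operator_busy_with` input would come from a plan recognizer like the one described in the task recognition abstract above, closing the loop between recognition and delegation.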