NASA has recently completed a study for the preliminary definition of a teleoperated robotic device. The Flight Telerobotic Servicer (FTS) will be used to assist astronauts in many of the on-board tasks of assembly, maintenance, servicing and inspection of the Space Station. This paper makes an assessment of the role that Artificial Intelligence (AI) may have in furthering the automation capabilities of the FTS and, hence, extending the FTS capacity for growth and evolution. Relevant system engineering issues are identified, and an approach for insertion of AI technology is presented in terms of the NASA/NBS Standard Reference Model (NASREM) control architecture.
The Space Station Operations Task Force (SSOTF) was created by the NASA Office of Space Station to ensure that operations are given due consideration in Space Station planning efforts. Specifically, SSOTF was asked to produce a framework for operations which meets the major objectives of the Space Station Program. The Automation and Robotics Panel of the Consortium for Space/Terrestrial Automation and Robotics and the California Space Institute of the University of California was requested by NASA to form a team to review the applications of automation and robotics (A&R) to the Space Station Program from an operations and utilization perspective. This review was organized to establish that perspective. This report presents a summary of the major activities of the Panel. This paper discusses the organizing principles upon which the study was based. Also included are potential applications of A&R along with a strategy for evaluating their associated benefits and costs.
The ground-based demonstration of EVA Retriever, a voice-supervised, intelligent, free-flying robot, is designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The major objective of the EVA Retriever Project is to design, develop, and evaluate an integrated robotic hardware and on-board software system which autonomously: (1) performs system activation and check-out, (2) searches for and acquires the target, (3) plans and executes a rendezvous while continuously tracking the target, (4) avoids stationary and moving obstacles, (5) reaches for and grapples the target, (6) returns to transfer the object, and (7) returns to base.
Recent theoretical and experimental progress towards the achievement of an autonomous control system for spacecraft nuclear reactors is reported. A controller having a multitiered structure with the capability for both supervisory and predictive control as well as automated reasoning is proposed. Such a system would establish bounds for permitted operation (supervisory control), facilitate the inference of system response (predictive control), and provide a means for assessing its own performance (automated reasoning). Recent developments concerning specific features of this controller are reported including the use of energy constraints for supervisory control, the performance of automated reactor startups using the MIT-SNL Period-Generated Minimum Time Laws, and considerations relevant to the use of pattern recognition, expert systems, and causal analysis for the on-line assessment of controller performance. Also, an overview is given of the MIT program on reactor control.
The paper discusses a comprehensive, model-based approach for the design and implementation of intelligent controllers. The system has been implemented in the framework of the Multigraph Architecture. The Multigraph Architecture is a layered system, which includes a parallel, graph computation model, the corresponding execution environment, and software tools supporting the interactive, graphical building of knowledge-bases.
The United States Space Station will employ robotic systems in conjunction with crew-member Extravehicular Activity (EVA). The control methods and corresponding crew interfaces for these systems are currently in development. Both teleoperation and autonomous operation are being pursued to provide either low-level control or high-level supervision of robotic tasks. The Flight Telerobotic Servicer (FTS) will be launched during the Station assembly process and will be teleoperated to perform a variety of assembly, maintenance, and servicing tasks. The EVA Retriever is a free-flying autonomous robot designed for retrieval of a drifting crewmember or piece of equipment inadvertently detached from the Station. These two robotic systems exemplify the choices which must be made in designing the robot control method. Teleoperation and autonomy are the ends of a spectrum of possible control modes. In choosing a design point along this dimension, the complexity of the robotic task must be considered along with the technologies required to support either teleoperation or autonomous performance of the task. Requirements of the crew operators and the workloads to be imposed on them must be weighed during selection and design of the control method. Safety considerations will also constrain the design. Space Station operations will be enhanced by optimization of each robot's control method with respect to its mission.
A major consideration in the design of trajectory generation software for a Flight Telerobotic Servicer (FTS) is that the FTS will be called upon to perform tasks which require a diverse range of manipulator behaviors and capabilities. In a hierarchical control system where tasks are decomposed into simpler and simpler subtasks, the task decomposition module which performs trajectory planning and execution should therefore be able to accommodate a wide range of algorithms. In some cases, it will be desirable to plan a trajectory for an entire motion before manipulator motion commences, as when optimizing over the entire trajectory. Many FTS motions, however, will be highly sensory-interactive, such as moving to attain a desired position relative to a non-stationary object whose position is periodically updated by a vision system. In this case, the time-varying nature of the trajectory may be handled either by frequent replanning using updated sensor information, or by using an algorithm which creates a less specific state-dependent plan that determines the manipulator path as the trajectory is executed (rather than a priori). This paper discusses a number of trajectory generation techniques from these categories and how they may be implemented in a task decomposition module of a hierarchical control system. The structure, function, and interfaces of the proposed trajectory generation module are briefly described, followed by several examples of how different algorithms may be performed by the module. The proposed task decomposition module provides a logical structure for trajectory planning and execution, and supports a large number of published trajectory generation techniques.
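The abstract does not give the FTS algorithms themselves; as a hedged illustration of the "plan the entire motion before it commences" category, the following sketch implements a standard textbook technique, a cubic-polynomial point-to-point joint trajectory with zero velocity at both endpoints (the function name and parameters are illustrative, not the FTS module):

```python
def cubic_trajectory(q0, qf, tf):
    """Return position/velocity functions for a point-to-point move.

    Cubic polynomial q(t) = a0 + a1*t + a2*t^2 + a3*t^3 with the
    standard boundary conditions: q(0)=q0, q(tf)=qf, and zero
    velocity at both endpoints, so the whole motion is fixed
    before execution begins.
    """
    a0 = q0
    a1 = 0.0
    a2 = 3.0 * (qf - q0) / tf**2
    a3 = -2.0 * (qf - q0) / tf**3

    def position(t):
        return a0 + a1 * t + a2 * t**2 + a3 * t**3

    def velocity(t):
        return a1 + 2 * a2 * t + 3 * a3 * t**2

    return position, velocity

# Move a joint from 0 to 1 rad over 2 seconds.
pos, vel = cubic_trajectory(q0=0.0, qf=1.0, tf=2.0)
print(pos(0.0), pos(2.0))   # endpoints: 0.0 and 1.0
print(vel(0.0), vel(2.0))   # zero velocity at both ends
```

A sensory-interactive motion of the kind the abstract contrasts with this would instead recompute or reparameterize such a trajectory each time the target estimate is updated.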
The problem of practical, space based supervisory control of a manipulator presents special problems beyond those encountered by terrestrial industrial robot systems. This is evident by examining the goals and concerns surrounding the development of intelligent robotics software for the Service Manipulator System (SMS) being developed for the European Space Agency (ESA). While some problems important in terrestrial robotics applications are simplified by the highly engineered nature of the proposed service tasks, other problems introduce new complications. Reliability and testability requirements have a major impact. Uncertainties introduced in component geometries by the stress of launch and deployment are also important. These problems are currently being explored through software experiments and the development of an intelligent robotics testbed.
NASA has committed to the design and implementation of a Space Station Flight Telerobotic Servicer (FTS) to assist the astronauts in assembly, maintenance, servicing, and inspection tasks on the Space Station and the Space Shuttle. One of the requirements of the FTS is safety. Safety is not solely dependent on the visible hardware components such as manipulators and hydraulic systems. It is also dependent on the underlying software which controls every action of these hardware components. An acceptable level of safety can only be reached by analyzing and implementing safety issues through the conceptualization, design, construction, and operation phases of the FTS. This article discusses three issues that are critical to the FTS safety. These include software design philosophy, software operating models, and a safety subsystem.
NASA's Edwin P. Hubble Space Telescope (HST) is a complex, modern satellite system that presents several opportunities to apply knowledge based technology. An intelligent operator aid, the Telemetry Analysis Logic for Orbiting Spacecraft (TALOS), is being built using the Lockheed Satellite Telemetry Analysis in Real-time (L*STAR) expert system tool to support real-time, ground based health and safety monitoring. Lockheed is also building applications to demonstrate how design knowledge captured in an HST Design Engineering Knowledge-base (HSTDEK) can be reused in various engineering activities. The HST Operational Readiness Expert (HSTORE), currently under development, will use captured design knowledge to support the evaluation of the orbital readiness of the Space Telescope.
The man-robot symbiosis concept has the fundamental objective of bridging the gap between fully human-controlled and fully autonomous systems to achieve true man-robot cooperative control and intelligence. Such a system would allow improved speed, accuracy, and efficiency of task execution, while retaining the man in the loop for innovative reasoning and decision-making. The symbiont would have capabilities for supervised and unsupervised learning, allowing an increase of expertise in a wide task domain. This paper describes a robotic system architecture facilitating the symbiotic integration of teleoperative and automated modes of task execution. The architecture reflects a unique blend of many disciplines of artificial intelligence into a working system, including job or mission planning, dynamic task allocation, man-robot communication, automated monitoring, and machine learning. These disciplines are embodied in five major components of the symbiotic framework: the Job Planner, the Dynamic Task Allocator, the Presenter/Interpreter, the Automated Monitor, and the Learning System.
Space applications can be greatly enhanced by the use of robotics and automation in activities such as orbital inspection and maintenance. Partial autonomy of space systems will enhance safety, reliability, productivity, adaptability, and reduction of overall cost. At the core of a robotics system is the ability to acquire, integrate, and interpret multisensory data to generate appropriate actions to perform a given task. In this paper, we show the feasibility of realistic autonomous space manipulation tasks using multisensory information. This is shown through two experiments involving a Fluid Interchange System and a Module Interchange System. In both cases, autonomous location of the mating element, autonomous location of a guiding light target, mating, and demating of the system are performed. Using the vision, force/torque, proximity, and touch sensors, the Fluid Interchange System and the Module Interchange System experiments were accomplished autonomously and successfully.
The GE Advanced Technology Laboratories has developed an interactive graphic overlay system that employs closed circuit television (CCTV) cameras to facilitate the remote verification and maintenance of a geometric world model database. This database provides geometric models and locations of objects viewed by CCTV cameras and manipulated by telerobots. To update the database, an operator uses GE's interactive graphics system to superimpose a wireframe line drawing of an object with known dimensions on a live video scene containing that object. The technique used to align the wireframe and the video image is multipoint positioning. An important application of GE's interactive graphic overlay system is as an object designation function for the operator control station (OCS) of the Jet Propulsion Laboratory's Telerobot Demonstration System.
The Advanced Integrated Maintenance System (AIMS) is a remote maintenance system that uses a dual-arm teleoperator to control the Advanced Servomanipulator (ASM). Although the ASM was designed for nuclear fuel reprocessing, it can provide a testbed for space telerobotics (space servicing and assembly). The objective of this project is to design and implement a generalized capability for automatic path-planning and obstacle avoidance during motion of the manipulator transporter (overhead crane and main manipulator shaft) from one work location in the hotcell to another. The work is to be accomplished in two phases. In the first phase, the transporter will move in a known world. In the second phase, sensor data will be used to update and verify the world model, and the system will have the capability to avoid unexpected obstacles. This paper describes the design and initial development of a system for phase one. The phase one system will have three components: operator interface, navigation code, and hardware interface. The operator interface will be the most complex part of the system and will have three components: display, goal editor, and geometry editor. The display will allow the AIMS operator to view a three-dimensional representation of the location of the ASM in the hotcell. The goal editor will allow the operator to define a goal for the ASM. The geometry editor will display the geometry of the hotcell and allow the operator to add or remove objects from the geometry. The navigation code will explore the geometry and define a clear path from the current position to the goal. The hardware interface will monitor the position of the ASM and send signals to the existing transporter control system and to the display.
In 1985, the National Aeronautics and Space Administration (NASA) initiated a major program of robotics research and development for technology applications to space servicing, assembly, repair, and remote exploration. A focal point within this program has been the development of a ground-based telerobot testbed at the Jet Propulsion Laboratory in Pasadena, California. Designed to prove technology concepts for supervised automation of increasingly unstructured and complex tasks, the testbed has reached an initial stage of integration. We have performed and report on several significant testbed experiments which include: visual tracking and grapple of a satellite, dual arm spatial coordination and manipulator control, force-reflecting teleoperations, and simulated task planning for a satellite servicing scenario. We will also outline the current NASA plans for continuing testbed development and demonstration.
The NASA Center for Autonomous and Man-Controlled Robotics and Sensing Systems (CAMRSS) is developing a concept called teleperception which facilitates in-space activity and addresses the human factors issues of information overload. A basic tenet of the concept is that the lines of distinction between computer perception and human perception need not be absolute. Teleperception is the technology of man-machine interaction which permits the augmentation of machine perception techniques with the considerable intangibilities of human cognition and which exploits the facility of machine perception to handle vast amounts of data to distill and enhance information for selective presentation to human agents. In this paper, we present both the concept of teleperception and the projects undertaken at the CAMRSS laboratories which embody the concept. In the next section, teleperception is more rigorously defined as a conceptual framework. The ensuing sections detail the merits of the concept, describe the environment being developed at our laboratories which instantiates the concept, and overview the teleperception research being carried out at CAMRSS.
An edge detection algorithm for use with tactile sensors is presented in this paper. The algorithm is based on the physical properties of the tactile sensor and tactile data. In addition, the algorithm is computationally efficient, and is thus suitable for real-time data processing. Experimental results from the application of this algorithm to a Lord LTS-210 tactile array sensor are presented. Further, observations about the use of a tactile sensor in a system are discussed. The proposed algorithm is a part of a real-time controller, implemented on CMU DD Arm II, that uses a tactile sensor in the feedback loop.
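The abstract does not spell out the algorithm's details; as a hedged sketch of one plausible, computationally cheap approach in this spirit (not the paper's algorithm), edges can be marked wherever a forward-difference pressure gradient over the taxel array exceeds a threshold. The function name, array dimensions, and threshold below are illustrative assumptions:

```python
def tactile_edges(pressure, threshold):
    """Mark taxels where the local pressure gradient exceeds a threshold.

    `pressure` is a 2-D list of taxel readings; gradients are estimated
    with simple forward differences, so the per-frame cost is linear in
    the number of taxels and suitable for a real-time feedback loop.
    """
    rows, cols = len(pressure), len(pressure[0])
    edges = [[False] * cols for _ in range(rows)]
    for i in range(rows - 1):
        for j in range(cols - 1):
            gx = pressure[i][j + 1] - pressure[i][j]   # horizontal difference
            gy = pressure[i + 1][j] - pressure[i][j]   # vertical difference
            if abs(gx) + abs(gy) > threshold:
                edges[i][j] = True
    return edges

# Synthetic frame: a flat object covering the left half of a 4x6 array.
frame = [[10, 10, 10, 0, 0, 0] for _ in range(4)]
print(tactile_edges(frame, threshold=5))
```

A production version would also incorporate the sensor's physical properties (taxel spacing, noise floor, crosstalk), which is what the paper emphasizes.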
A fully autonomous system which has rich interaction with the world will not be realized in the near future. Systems with autonomous capability will require use of knowledge external to themselves. Even human beings frequently use references and ask for advice. An interface between a partially autonomous system and external sources of knowledge is a feature which enables application of technology not yet fully autonomous. This is the strategy taken in the development of the Telerobot Interactive Planning System.
The Adaptive Model Following Control (AMFC) method is used in the design of a manipulator controller to compensate for variations in payload and spatial configuration and for some of the effects of unmodeled dynamics. The paper addresses the real-time implementation of the adaptive controller on a PUMA 560 manipulator. The experimental results show that the manipulator closely follows the behavior of the reference model regardless of the load it is carrying.
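The abstract does not give the AMFC law itself; as a hedged, minimal illustration of model-following adaptation, the sketch below applies the classical MIT gradient rule to a first-order plant with an unknown gain, driving the plant output toward a reference model. The plant, gains, and time scales are illustrative assumptions, not the paper's PUMA 560 controller:

```python
def mrac_first_order(k=2.0, gamma=1.0, dt=0.001, t_end=20.0, r=1.0):
    """Model-reference adaptive control of y' = -y + k*theta*r.

    The true plant gain k is unknown to the controller; the MIT rule
    adapts the feedforward gain theta so the plant output y tracks the
    reference model ym' = -ym + r.  theta should converge toward 1/k.
    """
    y = ym = theta = 0.0
    for _ in range(int(t_end / dt)):
        e = y - ym                       # model-following error
        theta += -gamma * e * ym * dt    # MIT rule: dtheta/dt = -gamma*e*ym
        y += (-y + k * theta * r) * dt   # plant with unknown gain k
        ym += (-ym + r) * dt             # reference model
    return y, ym, theta

y, ym, theta = mrac_first_order()
print(round(y, 3), round(ym, 3), round(theta, 3))
```

The manipulator case is a multivariable, configuration-dependent version of the same idea: the adaptation absorbs payload changes the way theta absorbs the unknown gain here.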
The control of manipulators is seen here as a two level process. A method is described to convert information available at the programming level into trajectories suitable to be tracked by a servo control system. The goal of the servo is to absorb the unmodeled dynamics. Tracking accuracy will depend mainly on the acceleration demand of the nominal trajectory setpoint, in particular, the actuator output demand must remain bounded. Our scheme adaptively takes into consideration at the trajectory computation level the dynamics of the underlying system, dynamically available information acquired through sensors, various types of constraints, such as path accuracy, and manipulator optimization. This scheme is meant to be implemented on-line, to drive mechanical systems such as manipulators. It is developed in the context of a multi-manipulator programming and control environment for space applications developped as part of a collaborative effort between McGill University and the Jet Propulsion Laboratory.
A recently developed spatial operator algebra for modeling, control and trajectory design of manipulators is discussed. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The operators themselves are elements in the algebra of linear bounded operators. The effect of these operators when operating on elements in the domain is equivalent to a spatial recursion along the span of a manipulator. Inversion of operators can be efficiently obtained via techniques of spatially recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator and for the corresponding control and trajectory design algorithms. Expressions interpreted within the algebraic framework lead to enhanced conceptual and physical understanding of manipulator dynamics and kinematics. Furthermore, implementable recursive algorithms can be immediately derived from the abstract operator expressions by inspection. Thus, the transition from an abstract problem formulation and solution to the detailed mechanization of specific algorithms is greatly simplified. This paper discusses the analytical formulation of the operator algebra, as well as its implementation in the Ada programming language.
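The operator formalism itself is beyond an abstract-length sketch, but the "spatial recursion along the span of a manipulator" that the operators encode can be illustrated with a minimal outward velocity recursion for a planar serial chain. The planar simplification and function name are assumptions for illustration, not the paper's (Ada) implementation:

```python
import math

def planar_tip_velocity(thetas, dthetas, lengths):
    """Outward spatial recursion along a planar serial chain.

    Propagates angular velocity w and linear velocity (vx, vy) from the
    base to the tip, link by link; this base-to-tip propagation is the
    recursive pattern that spatial-operator expressions capture as an
    operator product.
    """
    w = 0.0         # cumulative angular velocity of the current link
    vx = vy = 0.0   # linear velocity of the current joint
    phi = 0.0       # cumulative link orientation
    for theta, dtheta, length in zip(thetas, dthetas, lengths):
        w += dtheta
        phi += theta
        # v_tip = v_joint + w x l  (planar cross product with link vector)
        vx += -w * length * math.sin(phi)
        vy += w * length * math.cos(phi)
    return w, vx, vy

# Single link of length 1 rotating at 1 rad/s: tip speed 1, tangential.
print(planar_tip_velocity([0.0], [1.0], [1.0]))
```

Forces and moments propagate by the corresponding inward (tip-to-base) recursion, and it is the factorization of such recursions that makes operator inversion efficiently computable.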
The OAST Systems Autonomy Demonstration Project (SADP) has already served as a catalyst for the incorporation of advanced automation in the design of NASA's Space Station. This is evident in the various Space Station Prime Contractor proposals for both the Initial Operating Configuration (IOC) and the growth, or evolutionary Station. Due to the nature of the 1988 and 1990 demonstrations, the application of advanced automation to the operation of an individual system has been the major emphasis in the Prime Contractors' proposals. This paper addresses the application of advanced automation to the Space Station's Operations Management System, the master controlling mechanism for Space Station operations. The OMS functional requirements and design are described and a review of desired OMS evolutionary capabilities is presented. The SADP's 1993 and 1996 demonstration goals are discussed with respect to these requirements and desired evolutionary capabilities. Major technical challenges facing OMS designers are identified in order to guide the definition of detailed objectives for the future SADP demonstrations. By defining future objectives which address the OMS challenges, the SADP will maintain its role as a catalyst for future Space Station advanced automation.
The Office of Aeronautics and Space Technology has selected the Space Station Electrical Power System as one of the systems that will participate in the Systems Autonomy Demonstration Project (SADP) 1990 Power/Thermal Demonstration. The purpose of this demonstration is the autonomous operation of two major Space Station systems through the application of cooperating knowledge-based systems technology. Lewis Research Center (LeRC) and Marshall Space Flight Center (MSFC) will first jointly develop an autonomous power system using existing Space Station testbed facilities at each center. The subsequent 1990 power/thermal demonstration will then involve the cooperative operation of the LeRC/MSFC power system with the Johnson Space Center (JSC) thermal control and DMS/OMS testbed facilities. The testbeds and expert systems at each of the NASA centers will be interconnected via communication links. The appropriate knowledge-based technology will be developed for each testbed and applied to problems requiring inter-system cooperation. Primary emphasis will be focused on failure detection and classification, system reconfiguration, planning and scheduling of electrical power resources, and integration of knowledge-based and conventional control system software into the design and operation of Space Station testbeds.
The use of an optimization technique known as a genetic algorithm for solving the mobile transporter path planning problem is investigated. The mobile transporter is a traveling robotic vehicle proposed for the Space Station which must be able to reach any point of the structure autonomously. Specific elements of the genetic algorithm are explored both theoretically and experimentally. Recent developments in genetic algorithm theory are shown to be particularly effective in the path planning problem domain, though some problem areas requiring further research are identified. Trajectory planning problems are common in space systems, and the genetic algorithm provides an attractive alternative to the classical techniques used to solve them.
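The abstract leaves the path encoding unspecified; a minimal sketch of a genetic algorithm on a toy 2-D path planning problem can still show the selection/crossover/mutation loop. The waypoint encoding, obstacle, fitness weights, and GA parameters below are all illustrative assumptions, not the paper's formulation:

```python
import math
import random

def ga_path(n_pts=6, pop_size=40, gens=120, seed=1):
    """Genetic-algorithm sketch for a toy 2-D path planning problem.

    A chromosome is the list of y-offsets of intermediate waypoints on
    the way from (0, 0) to (n_pts+1, 0); the cost penalizes path length
    plus passing through a disk obstacle sitting on the straight line.
    """
    random.seed(seed)
    cx, cy, r = (n_pts + 1) / 2.0, 0.0, 1.5       # obstacle disk

    def cost(ys):
        pts = ([(0.0, 0.0)]
               + [(i + 1.0, y) for i, y in enumerate(ys)]
               + [(n_pts + 1.0, 0.0)])
        length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
        penalty = sum(max(0.0, r - math.dist(p, (cx, cy))) for p in pts)
        return length + 10.0 * penalty

    pop = [[random.uniform(-3, 3) for _ in range(n_pts)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_pts)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_pts)            # gaussian mutation
            child[i] += random.gauss(0.0, 0.3)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost), cost

best, cost = ga_path()
print(round(cost(best), 2))  # well below the straight-through-obstacle cost
```

The transporter problem replaces this toy cost with path feasibility over the Space Station structure, but the evolutionary loop (evaluate, select, recombine, mutate) is the same.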