Performance standards for industrial mobile robots and mobile manipulators (robot arms mounted on mobile robots) have only recently begun development. Low-cost, standardized measurement techniques are needed to characterize system performance, compare different systems, and determine whether recalibration is required. This paper discusses work at the National Institute of Standards and Technology (NIST) and within the ASTM Committee F45 on Driverless Automatic Guided Industrial Vehicles, including standards for terminology (F45.91) and for navigation performance test methods (F45.02). The paper defines terms that are being considered. Additionally, it describes navigation test methods that are near ballot and docking test methods being designed for consideration within F45.02, including the use of low-cost artifacts that can provide alternatives to relatively expensive measurement systems.
This paper describes a concept for measuring the reproducible performance of mobile manipulators to be used for assembly or other similar tasks. An automatic guided vehicle with an onboard robot arm was programmed to repeatedly move to and stop at a novel, reconfigurable mobile manipulator artifact (RMMA), sense the RMMA, and detect targets on the RMMA. The manipulator moved a laser retroreflective sensor to detect small reflectors that can be reconfigured to measure various manipulator positions and orientations (poses). This paper describes calibration of a multi-camera motion capture system using a six-degree-of-freedom metrology bar, and the subsequent use of the camera system as a ground-truth measurement device for validating the mobile manipulator experiments and test method. Static performance measurement of a mobile manipulator using the RMMA has proved useful for relatively high-tolerance pose estimation and other metrics that support standard test method development for indexed and dynamic mobile manipulator applications.
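The pose measurements described above reduce to comparing where reflectors nominally are on the RMMA with where the manipulator's sensor actually finds them. As an illustrative sketch (not the paper's actual algorithm), the planar pose error of the manipulator relative to the artifact can be recovered from such point pairs with a closed-form 2D rigid (Procrustes) fit; the function name and data layout are assumptions:

```python
import math

def fit_rigid_2d(nominal, detected):
    """Estimate the planar pose error (dx, dy, dtheta) that best maps
    nominal RMMA reflector positions onto the positions actually detected
    by the manipulator-mounted retroreflective sensor (least squares)."""
    n = len(nominal)
    cxn = sum(p[0] for p in nominal) / n
    cyn = sum(p[1] for p in nominal) / n
    cxd = sum(p[0] for p in detected) / n
    cyd = sum(p[1] for p in detected) / n
    # Accumulate dot- and cross-products of centroid-centered coordinates.
    s_cos = s_sin = 0.0
    for (xn, yn), (xd, yd) in zip(nominal, detected):
        ax, ay = xn - cxn, yn - cyn
        bx, by = xd - cxd, yd - cyd
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation consistent with rotating the nominal points by theta.
    dx = cxd - (cxn * math.cos(theta) - cyn * math.sin(theta))
    dy = cyd - (cxn * math.sin(theta) + cyn * math.cos(theta))
    return dx, dy, theta
```

With four reflectors at the corners of a unit square, a pure translation of the detected points is recovered exactly, with zero rotation.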
Future smart manufacturing systems will include more complex coordination of mobile manipulators (i.e., robot arms mounted on mobile bases). The National Institute of Standards and Technology (NIST) conducts research on the safety and performance of multiple collaborating robots using a mobile platform, an automatic guided vehicle (AGV) with an onboard manipulator. Safety standards for robots and for industrial vehicles each mandate failsafe control, but there is little overlap between the standards that can be relied on when the two systems are combined and their independent controllers make collaborative decisions for safe movement. This paper briefly discusses previously identified gaps between AGV and manipulator standards and details decision sharing for when manipulators and AGVs are combined into a collaborative mobile manipulator system. Tests of the NIST mobile manipulator under various control methods were performed and are described, along with test results and plans for further, more complex tests of implicit and explicit coordination control of the mobile manipulator.
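One simple way to picture the decision sharing discussed above is a system-level arbiter that merges the safety states reported by the AGV and manipulator controllers. The sketch below is purely illustrative of explicit coordination, not NIST's actual control scheme; the state names and `coordinate` function are assumptions:

```python
from enum import IntEnum

class SafetyState(IntEnum):
    """Ordered from least to most restrictive."""
    RUN = 0   # normal operation
    SLOW = 1  # reduced speed near a potential hazard
    STOP = 2  # protective stop

def coordinate(agv_state, arm_state):
    """Explicit coordination: the combined mobile manipulator adopts the
    most restrictive state reported by either independent controller, so
    neither subsystem moves faster than the other's hazard assessment
    allows."""
    return max(agv_state, arm_state)
```

For example, if the manipulator's controller commands a protective stop while the AGV still reports normal operation, the combined system stops.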
The National Institute of Standards and Technology (NIST) has been researching human-robot-vehicle collaborative environments for automated guided vehicles (AGVs) and manned forklifts. Safety of AGVs and of manned vehicles with automated functions (e.g., forklifts that slow/stop automatically in hazardous situations) is the focus of the American National Standards Institute/Industrial Truck Safety Development Foundation (ANSI/ITSDF) B56.5 safety standard. Recently, the NIST Mobile Autonomous Vehicle Obstacle Detection/Avoidance (MAVODA) Project began researching test methods for detecting humans or other obstacles entering the vehicle's path. Such intrusions pose potential safety hazards in manufacturing facilities, where both line-of-sight and non-line-of-sight conditions are prevalent. The test methods described in this paper address both of these conditions. These methods will provide the B56.5 committee with the measurement science basis for sensing systems - both non-contact and contact - that may be used in manufacturing facilities.
The National Institute of Standards and Technology (NIST) has been studying pallet visualization for the automated guided vehicle (AGV) industry. Through a cooperative research and development agreement with Transbotics, an AGV manufacturer, NIST has developed advanced sensor processing and world modeling algorithms to verify pallet location and orientation with respect to the AGV. Sensor processing utilizes two onboard, single-scan-line laser-range units. The "Safety" sensor is a safety unit located at the base of a forktruck AGV, and the "Panner" sensor is a laser ranger rotated 90 degrees and mounted on a rotating motor at the top front of the AGV. The Safety sensor, typically used to detect obstacles such as humans, was also used to detect pallets and their surroundings, such as the walls of a truck being loaded with pallets. The Panner was used to acquire many scan-lines of range data, which were processed into a 3D point cloud from which the pallet was segmented using a priori, approximate pallet-load or remaining-truck volumes. A world model was then constructed and output to the vehicle for pallet/truck volume verification. This paper explains this joint government/industry project and the results of using LADAR imaging methods.
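The Panner's role above is essentially geometric: each single-line scan lies in a plane that the pan motor rotates about a fixed axis, and accumulating the rotated scan points yields the 3D point cloud. A minimal sketch of that accumulation step, assuming a vertical pan axis and ignoring sensor mounting offsets (the function name and scan layout are assumptions, not the project's code):

```python
import math

def scans_to_point_cloud(scans):
    """Accumulate panning single-line laser scans into a 3D point cloud.

    `scans` is a list of (pan_angle_rad, beams) pairs, where `beams` is a
    list of (beam_angle_rad, range_m) readings within one scan-line. Each
    scan-line lies in a vertical plane; the pan motor rotates that plane
    about the vertical (z) axis.
    """
    cloud = []
    for pan, beams in scans:
        for beam, r in beams:
            # Point within the scan plane: x' forward, z' up.
            x_p = r * math.cos(beam)
            z_p = r * math.sin(beam)
            # Rotate the scan plane about z by the pan angle.
            cloud.append((x_p * math.cos(pan), x_p * math.sin(pan), z_p))
    return cloud
```

A single beam straight ahead at zero pan maps to a point directly in front of the sensor; panning 90 degrees swings the same beam to the side.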
As unmanned ground vehicles take on more and more intelligent tasks, determination of potential obstacles and accurate estimation of their position become critical for successful navigation and path planning. The performance analysis of obstacle mapping and unmanned vehicle positioning in outdoor environments is the subject of this paper. Recently, the National Institute of Standards and Technology's (NIST) Intelligent Systems Division has been a part of the Defense Advanced Research Projects Agency (DARPA) LAGR (Learning Applied to Ground Robots) Program. NIST's objective for the LAGR Project is to insert learning algorithms into the modules that make up the NIST 4D/RCS (Four Dimensional/Real-Time Control System) standard reference model architecture, which has been successfully applied to many intelligent systems. We detail world modeling techniques used in the 4D/RCS architecture and then analyze the high-precision maps generated by the vehicle world modeling algorithms as compared to ground truth obtained from an independent differential GPS system operable throughout most of the NIST campus. This work has implications not only for outdoor vehicles but also for indoor automated guided vehicles, where future systems will have more and more onboard intelligence requiring non-contact sensors to provide accurate vehicle and object positioning.
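Comparing world-model maps against differential GPS ground truth, as above, ultimately comes down to a position-error statistic over matched points. As an illustrative sketch (the metric actually reported in the paper may differ), a root-mean-square error over matched 2D positions could be computed as follows:

```python
import math

def rms_position_error(estimated, ground_truth):
    """RMS of 2D Euclidean distances between world-model positions and
    matched differential-GPS ground-truth positions. Both arguments are
    equal-length lists of (x, y) tuples, matched by index."""
    assert len(estimated) == len(ground_truth) and estimated
    sq_sum = 0.0
    for (xe, ye), (xg, yg) in zip(estimated, ground_truth):
        sq_sum += (xe - xg) ** 2 + (ye - yg) ** 2
    return math.sqrt(sq_sum / len(estimated))
```

For two matched points with errors of 0 m and 5 m, the RMS error is sqrt(12.5), about 3.54 m; the squaring makes the metric penalize large outliers more than a mean absolute error would.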
The performance evaluation of an obstacle detection and segmentation algorithm for Automated Guided Vehicle (AGV) navigation in factory-like environments using a new 3D real-time range camera is the subject of this paper. Our approach expands on the US ASME B56.5 safety standard, which now allows non-contact safety sensors, by performing tests on objects sized according to both the US and British safety standards. These successful tests placed the recommended, as well as smaller, material-covered objects in the vehicle path for static measurement. The ranges to and sizes of the segmented (mapped) obstacles were then verified against simultaneous, absolute measurements obtained using a relatively accurate 2D scanning laser rangefinder. These 3D range cameras are expected to be relatively inexpensive and to be used indoors, and possibly outdoors, in a broad range of mobile robot applications building on the experimental results explained in this paper.
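A common first step for obstacle segmentation in a range image of the kind described above is to flag pixels whose measured range is significantly shorter than the range expected for an empty, flat floor; connected flagged pixels are then grouped into obstacle candidates. The sketch below shows only the flagging step, under assumed names and a hypothetical tolerance, and is not the paper's algorithm:

```python
def flag_obstacle_pixels(measured, expected, tolerance=0.1):
    """Flag range-image pixels as obstacle candidates.

    `measured` and `expected` are equal-shaped 2D lists of ranges in
    metres; `expected` holds the range the camera would see for an empty,
    flat floor. A pixel is flagged when its measured range falls short of
    the expected range by more than `tolerance` metres (something is
    standing between the camera and the floor). Grouping flagged pixels
    into segments is omitted here.
    """
    return [[m < e - tolerance for m, e in zip(row_m, row_e)]
            for row_m, row_e in zip(measured, expected)]
```

Thresholding against the expected floor range, rather than a fixed distance, keeps the test uniform across the image even though floor pixels near the bottom of the frame are much closer than those near the horizon.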