This paper presents a report on the WPI autonomous mobile robot (WAMR), currently under development by the Intelligent Machines Project at WPI. Its purpose is to serve as a testbed for real-time artificial intelligence. WAMR is expected to find its way from one place in a building to another, avoiding people and obstacles en route. It is given no a priori knowledge of the building, but must learn about its environment through goal-directed exploration. Design concepts and descriptions of the major items completed thus far are presented. WAMR is a self-contained, wheeled robot that uses evidence-based techniques to reason about actions. The robot builds and continually updates a world model of its environment using a combination of ultrasonic and visual data. This world model is interpreted, and movement plans are generated, by a planner that uses real-time incremental evidence techniques. These movement plans are then carried out by a hierarchical evidence-based adaptive controller. Two interesting features of the robot are the line-imaging ultrasonic sensor and the video subsystem. The former uses frequency variation to form a line image of obstacles between one and twenty feet in front of the robot. The latter attempts to mimic the human eye using neural network pattern recognition techniques. Several items have been completed thus far; the paper describes some of these, including the multiprocessor navigator and non-skid motion control system, the ultrasonic line imager, the concepts of the vision system, and the computer hardware and software environment.
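To make the notion of incrementally fusing ultrasonic evidence into a world model concrete, the following is a minimal sketch of a generic log-odds evidence-grid update along a single sonar beam. It is an assumption-laden illustration, not the paper's actual method: the sensor-model probabilities (0.7/0.3), the one-cell-per-foot geometry, and the names `update_grid` and `probability` are all invented for this example.

```python
import math

# Assumed inverse sensor model (illustrative values, not from the paper):
L_OCC = math.log(0.7 / 0.3)   # evidence added where an echo returns
L_FREE = math.log(0.3 / 0.7)  # evidence added where the beam passed through

def update_grid(log_odds, reading_cells, max_cells=20):
    """Fuse one ultrasonic range reading (in cell units, here 1 cell = 1 ft,
    matching the sensor's 1-20 ft span) into a 1-D log-odds grid along
    the beam axis."""
    for i in range(min(reading_cells, max_cells)):
        log_odds[i] += L_FREE             # beam traversed cell: free evidence
    if reading_cells < max_cells:
        log_odds[reading_cells] += L_OCC  # echo returned here: occupied evidence
    return log_odds

def probability(l):
    """Convert accumulated log odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Start with an unknown world (log odds 0 == probability 0.5) and fuse
# three consistent readings of an obstacle at 12 ft.
grid = [0.0] * 20
for _ in range(3):
    update_grid(grid, 12)
```

After the three readings, `probability(grid[12])` rises well above 0.5 while cells the beam passed through fall below it, which is the basic behavior an evidence-based world model needs: belief strengthens with consistent data and can later be revised by contradictory data.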