This paper describes the guidance function of an autonomous vehicle based on a neural network controller that uses video images with adaptive view angles as its sensory input. The guidance function provides the low-level control required to keep the autonomous vehicle on a prescribed trajectory. Neural networks possess properties such as the ability to perform sensor fusion, the ability to learn, and fault-tolerant architectures, qualities that are desirable for autonomous vehicle applications. To demonstrate the feasibility of using neural networks in this type of application, an Intelledex 405 robot fitted with a video camera and vision system was used to model an autonomous vehicle with a limited range of motion. In addition to fixed-angle video images, a set of images with view angles varied adaptively as a function of speed was used as input to the neural network controller. The neural network was shown to control the autonomous vehicle model along a path composed of segments unlike the exemplars with which it was trained. The system was designed to assess only the guidance function; the other functions employed in autonomous vehicle control systems (mission planning, navigation, and obstacle avoidance) were assumed to be implemented separately and to supply a desired path to the guidance system. The desired trajectory is presented to the robot as a two-dimensional path, with a centerline, that is to be followed. A video camera and associated vision system provide video image data as control feedback to the guidance system. The neural network controller uses Gaussian curves for its output vector to facilitate interpolation and generalization over the output space.
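The Gaussian output encoding mentioned above can be illustrated with a minimal sketch. The paper does not give implementation details, so the unit count, command range, and Gaussian width below are illustrative assumptions: the target for a scalar steering command is a Gaussian activation profile over the output units rather than a one-hot vector, and the command is recovered by centroid decoding, which lets the network interpolate between trained exemplars.

```python
import numpy as np

def gaussian_target(command, n_units=30, cmd_min=-1.0, cmd_max=1.0, sigma=0.1):
    """Encode a scalar steering command as a Gaussian activation profile
    over n_units output units. Nearby commands produce overlapping targets,
    which smooths the output space and aids interpolation/generalization.
    All parameter values here are illustrative assumptions."""
    centers = np.linspace(cmd_min, cmd_max, n_units)
    return np.exp(-((centers - command) ** 2) / (2.0 * sigma ** 2))

def decode_command(activations, cmd_min=-1.0, cmd_max=1.0):
    """Recover the steering command as the activation-weighted mean
    (centroid) of the output-unit centers."""
    centers = np.linspace(cmd_min, cmd_max, len(activations))
    return float(np.sum(centers * activations) / np.sum(activations))
```

Because the Gaussian target is smooth, small errors in the network's output activations shift the decoded centroid only slightly, whereas a one-hot target would force a discrete jump between adjacent steering commands.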