Many important applications of autonomous underwater vehicles (AUVs) require operations in close proximity to man-made objects or natural bottom topography. In these situations, the vehicle must adapt its trajectory on-line in response to current threats and mission objectives. To provide this capability, we are developing a sonar-based navigation technique that emulates the manner in which a person navigates through an unknown room in the dark: by reaching out for and establishing contact with walls, tables, and chairs, managing transitions from one object to the next as one moves across the room. Our intuition here is that, in many ways, sonar is more like touch than vision. It may be possible to build a vehicle that can effectively use its sonar to `grab' an object of interest, say a cylindrical post for docking, and then `reel itself in' by feeding back sonar range measurements from the object to its dynamic controller. We envision an AUV that can establish `virtual tethers' with arbitrary objects in the water column or on the seabed. Fast, local processing can maintain `contact' with the objects or surfaces of interest. Control laws can be established to utilize streams of measurements from these features to achieve local, feature-relative navigation. While our research is driven by the severe challenges of the subsea environment, we anticipate that the approach will also be useful in land robot applications.
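The `virtual tether' idea above — feeding sonar range measurements back to the vehicle's dynamic controller to hold or close a standoff distance — can be illustrated with a minimal control-loop sketch. Everything here is an illustrative assumption, not the controller developed in this work: we model the vehicle as a 1-D point mass closing on a fixed post, use a simple PD law on the range error, and pick gains, mass, and saturation limits arbitrarily.

```python
def simulate_docking(initial_range=10.0, desired_range=2.0,
                     kp=4.0, kd=6.0, mass=30.0, dt=0.1, steps=600):
    """Hypothetical `virtual tether': close on an object using only
    sonar range feedback, under assumed 1-D point-mass dynamics.

    r  -- current sonar range to the object (m)
    v  -- closing velocity toward the object (m/s); in practice this
          could be estimated by differencing successive sonar fixes
    Returns the final range after the simulated run.
    """
    r, v = initial_range, 0.0
    for _ in range(steps):
        error = r - desired_range               # positive: still too far away
        thrust = kp * error - kd * v            # PD law on range error (N)
        thrust = max(-50.0, min(50.0, thrust))  # actuator saturation
        v += (thrust / mass) * dt               # point-mass dynamics, Euler step
        r -= v * dt                             # closing motion reduces range
    return r
```

With these assumed gains the vehicle settles at the commanded standoff: `simulate_docking()` converges to roughly the 2 m desired range. A real implementation would replace the simulated range update with live sonar fixes and would need to handle measurement dropout when `contact' with the feature is lost.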