Optical interferometry has been shown to be a viable method for making high-precision astrometric measurements, with the capability for unprecedented accuracy in both narrow- and wide-angle regimes.[1] As astrometric resolution increases, however, the contribution of certain systematic internal errors of the instrument itself to the measured optical delay can become significant compared to the contributions from angular position variations on the scale of 5-10 milliarcseconds. In particular, the baseline of the interferometer can no longer be regarded as a fixed quantity at scales below about 1 micron. On the Mark III interferometer at Mt. Wilson,[2] the major moving parts, aside from the optical delay lines, are the siderostats, which mark the endpoints of the baseline. Non-ideal motions of the siderostat mirrors can change the instrument's baseline at the 1-micron level. To compensate for this motion, a prototype laser metrology system in the form of an optical tripod has been installed to measure changes in the instrument's baseline and supply corrections to the optical delay and delay-offset data. This system has revealed a number of issues that are crucial to the development of future systems. In particular, future space-based systems[3] will require laser metrology systems with accuracies of the order of 0.01-0.1 nanometers. The implications of the Mt. Wilson results for such systems are discussed, and current thinking on future designs is presented.
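The sensitivity of the measurement to baseline errors can be sketched numerically. For an external optical path delay d = B · ŝ (baseline vector B, unit vector ŝ toward the star), a baseline perturbation δB contributes δB · ŝ directly to the measured delay, which converts to an apparent angular error of roughly δd / |B|. The numbers below are illustrative assumptions, not actual Mark III parameters:

```python
import math

def delay(baseline, s_hat):
    # External optical path delay: d = B . s_hat (both in metres)
    return sum(b * s for b, s in zip(baseline, s_hat))

# Illustrative geometry only (hypothetical, not Mark III values):
# a 10 m east-west baseline and a star 30 degrees above the baseline plane.
B = (10.0, 0.0, 0.0)
theta = math.radians(30.0)
s_hat = (math.cos(theta), 0.0, math.sin(theta))

d0 = delay(B, s_hat)

# A 1-micron siderostat-induced baseline error along the baseline axis.
dB = (1e-6, 0.0, 0.0)
d1 = delay(tuple(b + e for b, e in zip(B, dB)), s_hat)

delay_error = d1 - d0                          # metres of extra delay
angle_error = delay_error / math.hypot(*B)     # radians of apparent motion
angle_error_mas = math.degrees(angle_error) * 3600.0 * 1000.0

print(f"delay error: {delay_error * 1e6:.3f} micron")
print(f"apparent angular error: {angle_error_mas:.1f} mas")
```

With these assumed numbers, a 1-micron baseline change produces roughly a 0.87-micron delay error and an apparent angular shift of order 18 mas, i.e. larger than the 5-10 milliarcsecond signals of interest — which is why the baseline must be monitored rather than treated as fixed.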