Conventional two-channel stereoscopic 3D displays fall short of duplicating real-world viewing in a number of ways. For example, although an observer's convergence changes when looking at objects at varying distances in a 3D display, the accommodation (ocular focus) required remains constant, because the optical distance of the display's images is fixed. This produces an undesirable mismatch between accommodation and convergence. Further, if the observer's head moves up/down, left/right, fore/aft, or tilts, the two retinal images do not transform as they normally would in direct viewing of the scene. The degree of "remote presence" or "telepresence" achieved by these systems depends on how well the display duplicates the retinal images and visual-motor feedback that would be experienced if the remote scene were viewed directly. This paper describes two real-time head-coupled remote telepresence display concepts: a helmet-mounted display and a "virtual window" display. These can, in principle, not only provide vertical, horizontal, and longitudinal motion parallax linked to changing observer position, but can also recreate the normal visual-motor relationship between accommodation and convergence in the observer's eyes. Existing hardware systems are discussed as an available means for implementing demonstration prototypes of these concepts.
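As an illustration of the head-coupled "virtual window" idea, the sketch below computes an asymmetric (off-axis) viewing frustum for a fixed screen from a tracked head position, so that the rendered perspective shifts with the observer and yields motion parallax. This is a common modern realization of the technique, not the paper's own implementation; the function name and coordinate conventions are assumptions for illustration.

```python
def virtual_window_frustum(head, screen_w, screen_h, near):
    """Asymmetric ("off-axis") frustum for a fixed screen acting as a
    virtual window onto the remote scene, given the tracked head position.

    head: (x, y, z) eye position in screen-centered coordinates, with the
          screen lying in the z = 0 plane and the viewer at z > 0.
    Returns (left, right, bottom, top) at the near clipping plane, in the
    form expected by an OpenGL-style glFrustum call.

    NOTE: a hypothetical helper for illustration, not from the paper.
    """
    hx, hy, hz = head
    # Project the screen edges, measured relative to the eye, onto the
    # near plane: similar triangles give the scale factor near / hz.
    scale = near / hz
    left = (-screen_w / 2 - hx) * scale
    right = (screen_w / 2 - hx) * scale
    bottom = (-screen_h / 2 - hy) * scale
    top = (screen_h / 2 - hy) * scale
    return left, right, bottom, top
```

With the head centered, the frustum is symmetric; as the head moves (e.g. to the right), the frustum skews in the opposite sense, which is exactly the horizontal motion parallax described above. The vertical and longitudinal cases follow from the `hy` and `hz` terms.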