A teleoperated microsurgical robot has been developed, together with a virtual environment, for microsurgery on the eye. Visual and mechanical information is relayed via bidirectional pathways between the slave and master of the microsurgical robot. The system permits surgeons to operate in one of three modes: on real tissue, on physically simulated tissue in a mannequin, or on a computer-based physical model contained within the ophthalmic virtual environment. In all three modalities, forces generated during tissue manipulation (e.g. resecting, probing) are fed back to the surgeon via a force-reflecting interface to give the haptic sensations (i.e. 'feel') appropriate to the actions being performed. The microsurgical robot has been designed so that the master and slave systems can occupy physically separate environments, which permits remote surgery to be performed. The system aims to create an immersive environment for the operator by including not only visual and haptic feedback but also auditory, cutaneous, and, ultimately, olfactory sensations.
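The bidirectional master–slave pathway described above can be sketched as a single cycle of a bilateral teleoperation loop: the surgeon's motion at the master is scaled down and forwarded to the slave instrument, and the tissue-interaction force at the slave is reflected back to the haptic interface. This is an illustrative sketch only, not the authors' implementation; the motion-scaling gain, the linear-spring tissue model, and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TeleopState:
    master_pos: float       # handle position at the surgeon's master (mm)
    slave_pos: float        # instrument tip position at the slave (mm)
    reflected_force: float  # force displayed back to the surgeon (N)

# Hypothetical parameters for illustration only.
POSITION_SCALE = 0.1    # motion scaling: 10 mm at the master -> 1 mm at the tip
TISSUE_STIFFNESS = 0.5  # linear tissue stiffness (N/mm) in the simulated model
TISSUE_SURFACE = 0.0    # tissue surface location in slave coordinates (mm)

def teleop_step(state: TeleopState, master_pos: float) -> TeleopState:
    """One control cycle: forward the scaled motion, then reflect the force."""
    slave_pos = master_pos * POSITION_SCALE             # scaled position command
    penetration = max(0.0, slave_pos - TISSUE_SURFACE)  # depth into the tissue
    force = TISSUE_STIFFNESS * penetration              # spring model of contact
    return TeleopState(master_pos, slave_pos, force)

state = TeleopState(0.0, 0.0, 0.0)
state = teleop_step(state, 25.0)  # surgeon advances the master by 25 mm
# The slave tip moves 2.5 mm past the tissue surface, so a 1.25 N force
# is reflected to the master, giving the 'feel' of probing the tissue.
```

The same loop structure covers all three modes: only the source of the reflected force changes (real tissue sensing, a mannequin, or the simulated tissue model shown here).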