The research program examines the fine balance required to maintain the interplay between
technology and the immersant, including identifying the qualities that contribute to creating and sustaining a sense of
"presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous
work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative
Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional
treadmill. As immersants learn to meditate while walking, robust real-time biofeedback technology continuously
measures their breathing, skin conductance and heart rate. These physiological states in turn modulate the
soundscape and the stereoscopic visuals, viewed through shutter glasses. We plan to assess the potential benefits and limitations of this
physically active form of meditation against data from sitting meditation. A mixed-methods approach to testing
user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.
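The biofeedback coupling described above can be illustrated with a minimal sketch. The sensor ranges, weights, and media parameters below (`fog_density`, `audio_gain`) are hypothetical assumptions for demonstration, not the project's actual implementation:

```python
# Hypothetical biofeedback mapping: physiological readings -> media parameters.
# All ranges and weights are illustrative assumptions, not the authors' design.

def clamp(x, lo=0.0, hi=1.0):
    """Restrict x to the closed interval [lo, hi]."""
    return max(lo, min(hi, x))

def map_physiology_to_media(breath_rate_bpm, skin_conductance_us, heart_rate_bpm):
    """Map raw readings to normalized audio/visual parameters.

    Slower breathing and lower arousal yield a calmer scene
    (less fog, softer audio) -- the feedback loop the abstract describes.
    """
    # Normalize each signal against an assumed resting-to-active range.
    breath  = clamp((breath_rate_bpm - 6.0) / (18.0 - 6.0))       # 6-18 breaths/min
    arousal = clamp((skin_conductance_us - 2.0) / (12.0 - 2.0))   # 2-12 microsiemens
    heart   = clamp((heart_rate_bpm - 55.0) / (100.0 - 55.0))     # 55-100 bpm

    # Weighted blend into a single "calm" score in [0, 1].
    calm = 1.0 - (0.4 * breath + 0.3 * arousal + 0.3 * heart)
    return {
        "fog_density": clamp(1.0 - calm),            # scene clears as the immersant calms
        "audio_gain":  clamp(0.3 + 0.7 * (1.0 - calm)),
    }

# Example: a relatively calm immersant produces a mostly clear scene.
state = map_physiology_to_media(8.0, 3.0, 60.0)
```

In a real pipeline this mapping would run once per sensor frame, with the outputs driving the render and audio engines; the sketch only shows the signal-to-parameter step.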
Neuroscience has benefited from an explosion of new experimental techniques, many of which have only become feasible in the wake of improvements in computing speed and data storage. At the same time, these computation-intensive techniques have widened the gulf between the data and the knowledge extracted from those data: the neurosciences face an accelerating accumulation of
experimental data but a paucity of effective knowledge-management techniques. The purpose of the project described in the present paper is to create a visualization of the knowledge
base of the neurosciences. At run-time, this 'BrainFrame' project accesses several web-based ontologies and generates a
semantically zoomable representation of any of the many levels of the human nervous system.
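The idea of semantic zooming over a hierarchy can be sketched as follows. The mini-ontology and the depth-based detail rule are assumptions for illustration; BrainFrame's actual ontologies are web-based and far larger:

```python
# Illustrative semantic zoom over a toy nervous-system hierarchy.
# The ontology below is a hand-made stand-in, not a real web ontology.

ONTOLOGY = {
    "nervous system": {
        "central nervous system": {
            "brain": {"cerebrum": {}, "cerebellum": {}, "brainstem": {}},
            "spinal cord": {},
        },
        "peripheral nervous system": {
            "somatic": {},
            "autonomic": {},
        },
    }
}

def visible_nodes(tree, zoom, depth=0):
    """Return labels visible at a given zoom level.

    Semantic zoom: deeper structures are revealed only as the user
    zooms in, so the same data yields coarse or detailed views.
    """
    labels = []
    for name, children in tree.items():
        labels.append(name)
        if depth + 1 <= zoom:  # descend only while within the zoom budget
            labels.extend(visible_nodes(children, zoom, depth + 1))
    return labels

# A shallow zoom shows only the top-level divisions.
overview = visible_nodes(ONTOLOGY, zoom=1)
```

A production system would query the remote ontologies at run-time and render the visible subset graphically; the depth cutoff here stands in for that level-of-detail logic.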