Users of immersive virtual reality environments have reported a wide variety of side effects and aftereffects, including confusion between characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and the real can be turned around and used to create immersion with minimal technological support in virtual world
group training simulations. This paper will describe observations from my time working as an artist/researcher with the
UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop
trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining
moments of slippage between realities, both into and out of the virtual environment, when the boundaries between real and virtual become confused, we can better understand methods for creating immersion. I will use the mixing of
realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies
to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will
examine moments of training to learn the software interface, moments within the drill and interviews after the drill.
The goal for Becoming Dragon was to develop a working, immersive Mixed Reality system by using a motion capture
system and head mounted display to control a character in Second Life - a Massively Multiplayer Online 3D
environment - in order to examine a number of questions regarding identity, gender and the transformative potential of
technology. This performance was accomplished through a collaboration between Micha Cardenas, the performer and
technical director, and Christopher Head, Kael Greco, Benjamin Lotan, Anna Storelli and Elle Mehrmand.
The plan for this project was to model the performer's physical environment to enable them to live in the virtual
environment for extended amounts of time, using an approach of Mixed Reality, where the physical world is mapped
into the virtual. I remain critical of the concept of Mixed Reality, as it presents an idea of realities as totalities and as
objective essences independent of interpretation through the symbolic order. Part of my goal with this project is to
explore identity as a process of social feedback, in the sense that Donna Haraway describes "becoming with"<sup>iii</sup>, as well as
to explore the concept of Reality Spectrum that Augmentology.com discusses, thinking about states such as AFK (Away
From Keyboard) that are in-between virtual and corporeal presence.<sup>iv</sup> Both of these ideas are ways of overcoming the
dualisms of mind/body, real/virtual and self/other that have been a problematic part of thinking about technology for so
long. Towards thinking beyond these binaries, Anna Munster offers a concept of enfolding the body and technology<sup>v</sup>,
building on Gilles Deleuze's notion of the baroque fold. She says "the superfold... opens up for us a twisted topology of
code folding back upon itself without determinate start or end points: we now live in a time and space in which body and
information are thoroughly imbricated."<sup>vi</sup> She elaborates on this notion of body and code as becoming with each other
saying "the incorporeal vectors of digital information draw out the capacities of our bodies to become other than matter
conceived as a mere vessel for consciousness or a substrate for signal... we may also conceive of these experiences as a
new territory made possible by the fact that our bodies are immanently open to these kinds of technically symbiotic
transformations"<sup>vii</sup>. A number of the technologies used in this performance were used in an attempt to blur the line
between the actual and the digital, such as motion capture, live video streaming into Second Life and 3D fabrication of
physical copies of Second Life avatars.
The performance was developed using the following components:
- An Emagin Z800 immersive head mounted display (HMD) allowed the performer to move around in the
physical environment within Calit2 and still remain "in game". Head tracking and stereoscopic imagery help to
provide a realistic feeling of immersion. We built on the University of Michigan 3D (UM3D) lab's stereoscopic
patch for the Second Life client, updating it to work with the latest version of Second Life.
- A motion tracking system. A Vicon MX40+ motion capture system was installed in the Visiting Artist Lab at
CRCA, which served as the physical performance space, allowing real-time motion tracking data to be sent to a
PC running Windows. Using this data, the plan was to map physical motion in the real world back into
game space so that, for example, the performer could easily reach their food source or the restroom. We
developed a C++ bridge that parses the Vicon real-time data stream and communicates it to the Second
Life server, producing changes in avatar and object positions based on real physical movement.
The goal was to get complete body gestures into Second Life in near real time.
- A Puredata patch called Lila, developed by Shahrokh Yadegadi of UCSD, which was used to modulate the
performer's voice, providing a voice chat system in Second Life that sounded less gendered and
less human.