This work outlines a system in which a stereo camera may effectively track a user's face and hands in three dimensions.
Given this information, a method for controlling objects in three dimensions is also described. The system begins by
finding faces. If more than one face is found in the image, the algorithm uses depth information to isolate the face that is
closest to the camera. The algorithm then builds a skin-tone model by examining the pixels of the detected face. For
much of the processing, the camera's RGB output is converted to HSV, and only the hue and saturation components are
used. This skin-tone model, together with depth, is then used to isolate the user's hands and track them in three
dimensions. To serve as an effective interface, the system interprets the positions of the two hands relative to the
user's face. To move an object up in three dimensions, for example, the user simply positions both hands above his or
her face. Similar commands let the user translate the object along all three axes, as well as apply yaw and roll when
desired.
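The pipeline above (a skin model estimated from face pixels, hue/saturation classification, and hand-above-face commands) can be sketched roughly as follows. This is an illustrative assumption of how such a system might work, not the paper's actual implementation; the tolerance value and function names are invented for the sketch:

```python
import colorsys

def skin_model(face_pixels):
    """Estimate a mean hue/saturation skin model from face pixels (RGB, 0-255)."""
    hs = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[:2] for r, g, b in face_pixels]
    mean_h = sum(h for h, _ in hs) / len(hs)
    mean_s = sum(s for _, s in hs) / len(hs)
    return mean_h, mean_s

def is_skin(pixel, model, tol=0.1):
    """Classify a pixel as skin if its hue and saturation lie near the face model.

    The tolerance of 0.1 is an arbitrary placeholder, not a value from the paper.
    """
    h, s, _ = colorsys.rgb_to_hsv(pixel[0] / 255, pixel[1] / 255, pixel[2] / 255)
    return abs(h - model[0]) <= tol and abs(s - model[1]) <= tol

def command(face_y, left_hand_y, right_hand_y):
    """Map hand positions relative to the face to a translation command.

    Image coordinates grow downward, so a smaller y means higher in the frame.
    """
    if left_hand_y < face_y and right_hand_y < face_y:
        return "move_up"
    if left_hand_y > face_y and right_hand_y > face_y:
        return "move_down"
    return "none"
```

In a full system the same relative-position test would extend to left/right and, using the stereo depth, forward/backward, with asymmetric hand positions mapped to yaw and roll.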
This work describes the process of developing a 3D Virtual Reality (VR) DJ simulation game intended to be displayed
on a stereoscopic display. Using a DLP projector and shutter glasses, the user plays a game in which he or she is a DJ
in a nightclub. As the nightclub's music plays, the DJ "scratches" along with it.
Much in the flavor of Guitar Hero or Dance Dance Revolution, a virtual turntable presents information about how the
user should perform. The user needs only a small set of hand gestures, corresponding to turntable scratch movements,
to play the game. As the music plays, a series of moving arrows approaching the DJ's turntable instructs the user
when and how to perform each scratch.
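The arrow-cue mechanic resembles a standard rhythm-game timing check: each arrow carries a target time and a scratch direction, and a gesture scores if it matches the direction within a timing window. A minimal sketch, with cue times, direction labels, and the window size all invented for illustration (none are specified in the original):

```python
def score_scratches(cues, gestures, window=0.15):
    """Count hits: a gesture scores against an unused cue if it has the same
    direction and falls within the timing window (seconds) of the cue time.

    cues and gestures are lists of (time, direction) pairs; the 0.15 s window
    is an arbitrary placeholder.
    """
    hits = 0
    used = set()
    for g_time, g_dir in gestures:
        for i, (c_time, c_dir) in enumerate(cues):
            if i not in used and c_dir == g_dir and abs(g_time - c_time) <= window:
                used.add(i)
                hits += 1
                break
    return hits
```

A real game loop would evaluate this incrementally as gestures arrive rather than in batch, but the matching logic is the same.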