We are witnessing an explosion of new human-computer interaction (HCI) devices for both laboratory research and home use. Given these new affordances in user interfaces (UI), how can gestures improve interaction in large-scale immersive display environments? Through an investigation of full-body, head, and hand tracking, this paper discusses various modalities of gesture recognition and compares their usability to other forms of interactivity. We explore a specific implementation of hand-gesture tracking within a large tiled-display environment for use with common collaborative media interaction activities.