Recognition of human activities using depth images of Kinect for biofied building
27 March 2015
Abstract
Living spaces today must serve a wide range of functions because of an aging society, the promotion of energy conservation, and the diversification of lifestyles. To meet this requirement, we propose the "Biofied Building," a system that learns from living beings. As a key function of this system, small sensor agent robots accumulate various kinds of information about the living space in a database used to control it. Among this information, human activities in particular can serve as triggers for lighting or air-conditioning control, making customized spaces possible. Human activities are divided into two groups: activities consisting of a single behavior and activities consisting of multiple behaviors. For example, "standing up" or "sitting down" consists of a single behavior and is accompanied by a large motion. On the other hand, "eating" consists of several behaviors, such as holding the chopsticks, picking up the food, and putting it in the mouth; these are continuous motions. Considering the characteristics of these two types of activities, we use two methods, the R transform and variance, respectively. In this paper, we focus on these two types of human activities and propose two corresponding recognition methods for constructing the living-space database of the "Biofied Building." Finally, we compare the results of both methods.
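The variance-based cue for continuous, multi-behavior activities can be illustrated with a minimal sketch. This is not code from the paper: the function names, the spike-ratio rule, and the synthetic depth frames are all assumptions introduced here to show one plausible way a single large motion (e.g. "standing up") could be separated from sustained small motions (e.g. "eating") using frame-difference variance over a Kinect depth sequence.

```python
import numpy as np

def frame_variances(depth_frames):
    """Variance of each consecutive frame difference (hypothetical helper).

    A localized, one-shot motion produces one or two large spikes;
    continuous small motions produce a sustained moderate level.
    """
    diffs = [b.astype(float) - a.astype(float)
             for a, b in zip(depth_frames, depth_frames[1:])]
    return [float(np.var(d)) for d in diffs]

def classify(variances, spike_ratio=5.0):
    """Crude illustrative rule: if the peak variance dwarfs the median,
    call it a single-behavior (large-motion) activity; otherwise treat
    the sequence as a continuous multi-behavior activity."""
    peak = max(variances)
    med = float(np.median(variances))
    return "single-behavior" if peak > spike_ratio * max(med, 1e-9) else "continuous"
```

A real pipeline would of course operate on silhouettes extracted from the depth images (as the R transform does) rather than raw per-pixel differences; the sketch only shows the contrast between a spiky and a sustained variance profile.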
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Ami Ogawa, Akira Mita, "Recognition of human activities using depth images of Kinect for biofied building", Proc. SPIE 9435, Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems 2015, 94351U (27 March 2015); https://doi.org/10.1117/12.2084079
Proceedings paper, 8 pages.