Automatic estimation of human activities is a widely studied topic. However, the process becomes difficult when activities must be estimated from a video stream, because human activities are dynamic and complex. Furthermore, the large amount of information that images provide must be taken into account, since it makes modelling and estimating activities a hard task. In this paper we propose a method for activity estimation based on object behavior. Objects are
located in a delimited observation area and their handling is recorded with a video camera. Activity estimation can be
done automatically by analyzing the video sequences. The proposed method is called "signature recognition" because it
considers a space-time signature of the behavior of the objects used in particular activities (e.g., patient care in a healthcare environment for elderly people with restricted mobility). A pulse is produced when an object appears in or disappears from the observation area, i.e., when its presence changes from zero to one or vice versa. These changes are detected by identifying the objects with a bank of nonlinear correlation filters. Each object is processed
independently and produces its own pulses; hence we are able to recognize several objects with different patterns at the
same time. The method is applied to estimate three healthcare-related activities of elderly people with restricted mobility.
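To make the pulse/signature idea concrete, the following minimal Python sketch converts per-object presence sequences (assumed to come from thresholded correlation-filter outputs) into appearance/disappearance pulses and matches the resulting space-time signature against reference activity signatures. The function names, object names, and the simple L1 matching criterion are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the pulse/signature idea described in the abstract.
# It assumes each object's frame-by-frame presence (1 = visible in the
# observation area, 0 = absent) has already been obtained, e.g. by
# thresholding the output of a nonlinear correlation filter.

def presence_to_pulses(presence):
    """Convert a binary presence sequence into pulses: +1 at frames where
    the object appears, -1 where it disappears, 0 elsewhere."""
    presence = np.asarray(presence, dtype=int)
    return np.diff(presence, prepend=presence[0])

def match_signature(object_pulses, reference_signatures):
    """Pick the reference activity whose space-time signature (one pulse
    train per object) is closest to the observed pulses, using a simple
    L1 distance as a stand-in for the paper's matching criterion."""
    def distance(ref):
        return sum(np.abs(object_pulses[name] - ref[name]).sum() for name in ref)
    return min(reference_signatures, key=lambda act: distance(reference_signatures[act]))

# Example: two (hypothetical) objects tracked over 8 frames.
observed = {
    "syringe": presence_to_pulses([0, 0, 1, 1, 1, 0, 0, 0]),
    "bandage": presence_to_pulses([0, 0, 0, 1, 1, 1, 1, 0]),
}
references = {
    "injection": {"syringe": presence_to_pulses([0, 1, 1, 1, 0, 0, 0, 0]),
                  "bandage": presence_to_pulses([0, 0, 0, 0, 1, 1, 0, 0])},
    "wound_care": {"syringe": presence_to_pulses([0, 0, 0, 0, 0, 0, 0, 0]),
                   "bandage": presence_to_pulses([0, 1, 1, 1, 1, 1, 1, 0])},
}
print(match_signature(observed, references))  # -> "injection"
```

Because each object contributes its own pulse train, several objects with different presence patterns can be recognized simultaneously, which is what allows the composite space-time signature to discriminate between activities.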