Practical life log video indexing based on content and context
16 January 2006
Abstract
Today, multimedia information plays an important role in daily life, and people can use imaging devices to capture their visual experiences. In this paper, we present our personal Life Log system, which records personal experiences in the form of wearable video and environmental data, together with an efficient retrieval system for recalling the desired media. We summarize practical video indexing techniques based on Life Log content and context: talking scenes are detected using audio/visual cues, and semantic key frames are extracted from GPS data. Voice annotation is also demonstrated as a practical indexing method. Moreover, we use body media sensors to record lifestyle data continuously and use these data to index semantic key frames. In the experiments, we demonstrate various video indexing results together with their semantic content and show Life Log visualizations that allow personal life to be reviewed effectively.
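To illustrate the context-based indexing idea summarized above, the following sketch selects a key frame whenever the wearer's GPS position has moved beyond a distance threshold. This is a minimal illustration under stated assumptions, not the authors' implementation: the haversine helper, the 50 m threshold, and the (timestamp, latitude, longitude) track format are all assumptions introduced here for clarity.

# Illustrative sketch (assumed interface, not the paper's actual method):
# choose key-frame timestamps from a wearable video by sampling a frame
# each time the GPS track shows the wearer has moved beyond a threshold.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gps_key_frame_times(gps_track, min_move_m=50.0):
    """gps_track: time-ordered list of (timestamp_sec, lat, lon) samples.
    Returns timestamps at which to extract a key frame from the video."""
    key_times = []
    last_lat = last_lon = None
    for t, lat, lon in gps_track:
        moved = (last_lat is None or
                 haversine_m(last_lat, last_lon, lat, lon) >= min_move_m)
        if moved:
            key_times.append(t)           # location changed enough: new key frame
            last_lat, last_lon = lat, lon
    return key_times

A similar thresholding scheme could be applied to other context streams (e.g., body media sensor readings) by replacing the distance test with a change detector on the sensor signal.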
© 2006 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Datchakorn Tancharoen, Toshihiko Yamasaki, Kiyoharu Aizawa, "Practical life log video indexing based on content and context", Proc. SPIE 6073, Multimedia Content Analysis, Management, and Retrieval 2006, 60730E (16 January 2006); doi: 10.1117/12.650342; https://doi.org/10.1117/12.650342
Proceedings paper, 8 pages

