Emotion recognition from speech: tools and challenges
21 May 2015
Abstract
Human emotion recognition from speech is studied widely because of its importance in many applications, e.g. human-computer interaction. There is wide diversity and a lack of agreement about the basic emotions or emotion-related states on the one hand, and about where the emotion-related information lies in the speech signal on the other. These diversities motivate our investigation into extracting meta-features using the PCA approach, or using a non-adaptive random projection (RP), both of which significantly reduce the large-dimensional speech feature vectors that may contain a wide range of emotion-related information. Subsets of meta-features are fused to increase the performance of the recognition model, which adopts the score-based LDC classifier. We shall demonstrate that our scheme outperforms state-of-the-art results when tested on non-prompted databases as well as on acted databases (i.e. when subjects act specific emotions while uttering a sentence). However, the huge gap between the accuracy rates achieved on the different types of speech datasets raises questions about the way emotions modulate speech. In particular, we shall argue that emotion recognition from speech should not be dealt with as a classification problem. We shall demonstrate the presence of a spectrum of different emotions in the same speech portion, especially in the non-prompted datasets, which tend to be more “natural” than the acted datasets, where the subjects attempt to suppress all but one emotion.
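The contrast between the two dimensionality-reduction routes mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, component count, and random data are placeholder assumptions, since the abstract does not specify the feature set or projection sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder stand-in for high-dimensional speech feature vectors
# (e.g. many prosodic/spectral descriptors per utterance); the real
# feature set and sizes are not given in the abstract.
n_samples, n_features, n_components = 100, 384, 20
X = rng.normal(size=(n_samples, n_features))

# --- PCA meta-features: data-adaptive projection onto top components ---
X_centered = X - X.mean(axis=0)
# SVD of the centered data; rows of Vt are the principal directions.
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
X_pca = X_centered @ Vt[:n_components].T

# --- Non-adaptive random projection: a data-independent Gaussian map ---
# Scaling by 1/sqrt(k) approximately preserves pairwise distances
# (Johnson-Lindenstrauss lemma); no training data is needed to build R.
R = rng.normal(size=(n_features, n_components)) / np.sqrt(n_components)
X_rp = X @ R

print(X_pca.shape, X_rp.shape)
```

The key design difference is that PCA must be fitted to the data (and refitted if the data distribution shifts), whereas the random projection matrix is fixed in advance, which is what makes it "non-adaptive".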
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Abdulbasit Al-Talabani, Harin Sellahewa, Sabah A. Jassim, "Emotion recognition from speech: tools and challenges", Proc. SPIE 9497, Mobile Multimedia/Image Processing, Security, and Applications 2015, 94970N (21 May 2015); https://doi.org/10.1117/12.2191623
Proceedings, 8 pages