Classification of emotion using sub-audible frequencies in vocal data

Date

2008-05-15T15:35:34Z

Publisher

Kansas State University

Abstract

Current research on vocal emotion detection has taken a variety of approaches and has identified certain acoustic attributes that characterize different emotional states. While classification accuracy has improved over the past few years, computer classification remains far less accurate than human classification. This thesis proposes a previously unexamined attribute that can serve as a measure for detecting emotion in human vocal samples. The new infrasonic attribute is shown to be significant when examining agitated emotions, and it can therefore be used to improve vocal emotion detection.
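The abstract does not specify how the infrasonic attribute is computed. As a purely illustrative sketch (the function name, the 20 Hz cutoff, and the synthetic signals below are assumptions, not the thesis's method), one plausible feature is the fraction of a sample's spectral energy lying below the threshold of human hearing:

```python
import numpy as np

def infrasonic_energy(signal, sample_rate, cutoff_hz=20.0):
    """Hypothetical feature: fraction of spectral energy below ~20 Hz.

    This is an illustrative proxy for a sub-audible attribute, not the
    measure defined in the thesis itself.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return 0.0
    return float(spectrum[freqs < cutoff_hz].sum() / total)

# Synthetic example: a plain 220 Hz "voiced" tone versus the same tone
# with an added 5 Hz infrasonic component standing in for agitation.
sr = 8000
t = np.arange(0, 2.0, 1.0 / sr)
voiced = np.sin(2 * np.pi * 220 * t)
agitated = voiced + 0.5 * np.sin(2 * np.pi * 5 * t)

print(infrasonic_energy(voiced, sr))    # near zero
print(infrasonic_energy(agitated, sr))  # noticeably larger
```

In this toy setup the infrasonic fraction separates the two signals cleanly; in real vocal data the attribute would have to be compared against a baseline across emotional states, as the abstract describes for agitated emotions.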

Keywords

Emotion, Computer, Detecting, Stress

Graduation Month

May

Degree

Master of Science

Department

Department of Computing and Information Sciences

Major Professor

David A. Gustafson

Date

2008

Type

Thesis