
March 2020
Research

Interpreting emotion in Odia dialects for efficient human-machine interaction

Researchers in the Signal Processing research group at SiliconTech are developing Android apps for automatic emotion analysis from voice data. They have built an Odia-language database for recognizing emotional content in speech, a step towards efficient human-machine interaction.

Understanding a speaker's feelings during communication helps in comprehending the conversation and responding appropriately. This part of human–computer interaction remains unsolved: beyond a limited number of applications, there is no general solution to the problem.

With the rapid development of Artificial Intelligence in recent years, speech emotion recognition has become a challenging research area. Human language communication has two modes, written and spoken. Though the linguistic information is the same in both, the spoken mode also carries paralinguistic information.

Paralinguistic information refers to the properties of speech sounds that carry other speaker-related information such as age, sex, emotion, and attitude. The most important and complex of these is emotion.
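To make the idea of paralinguistic cues concrete, here is a minimal sketch, assuming a Python environment with the librosa and NumPy libraries, of how the acoustic correlates of emotion (pitch, energy, and spectral shape) can be extracted from a single utterance. The article does not describe the group's actual pipeline; the feature set, parameter values, and the file name odia_utterance.wav are illustrative assumptions.

```python
# A minimal, illustrative sketch (not SiliconTech's pipeline): extract
# acoustic correlates of paralinguistic information from one utterance.
import numpy as np
import librosa

def paralinguistic_features(path):
    # Load the recording as mono speech at 16 kHz.
    y, sr = librosa.load(path, sr=16000)
    # Pitch contour (Hz), bounded to a typical adult speaking range.
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)
    # Frame-wise energy.
    rms = librosa.feature.rms(y=y)[0]
    # Spectral shape summarised by 13 MFCCs per frame.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    # Collapse each contour to its mean and standard deviation; emotion
    # classifiers are often trained on such utterance-level statistics.
    return np.concatenate([
        [f0.mean(), f0.std()],
        [rms.mean(), rms.std()],
        mfcc.mean(axis=1), mfcc.std(axis=1),
    ])

# "odia_utterance.wav" is a hypothetical file name.
features = paralinguistic_features("odia_utterance.wav")
print(features.shape)  # (30,) -> a fixed-length vector for a classifier
```

Utterance-level statistics like these are one plausible route from a labelled speech database, such as the group's Odia collection, to a trainable emotion recognizer.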

Recognizing emotional content in speech is an indispensable requirement for efficient human-machine interaction.