Emotional Effects of Music Using Machine Learning Analytics

Date
2017
Authors
Panwar, Sharaj
Abstract

Music Information Retrieval (MIR) and Music Emotion Recognition (MER) research have greatly influenced the musical world. MIR and MER embed data mining and machine learning techniques with several types of music features and annotations. Music, as organized sound, resonates with our nerve tissues and creates an emotional response; musical perception is the auditory perception of musical sound as a meaningful phenomenon. To study the emotional effects of music, a machine learning music perception model is proposed that can detect the music information of a given audio file in terms of genre and emotion. Genre classification is performed with a hybrid convolutional recurrent neural network, and emotion recognition is performed by mapping musical acoustic features to corresponding arousal and valence emotion indexes with a linear regression model. A Radio Induced Emotion Dataset (RIED) was created by continuously observing radio song broadcasts in five major cities (New York, Los Angeles, Houston, Miami) of five different regions of the United States from 10/21/2017 to 11/21/2017. A part of the dataset, containing songs aired on 10/23/2017 in the respective cities, is tested on the proposed perception model to observe the music emotion propensity of different regions of the United States.
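The emotion-recognition stage described above maps acoustic features to two-dimensional arousal/valence indexes with a linear regression model. The sketch below is a minimal illustration of that idea only, not the thesis's actual pipeline: the feature matrix, target mapping, and noise level are all synthetic stand-ins (real inputs would be acoustic features such as tempo, spectral centroid, and RMS energy extracted from the audio), and the fit uses ordinary least squares via NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-song acoustic feature vectors.
n_songs, n_features = 200, 8
X = rng.normal(size=(n_songs, n_features))

# Hypothetical ground-truth linear mapping, used only to fabricate
# (arousal, valence) training targets for this illustration.
W_true = rng.normal(size=(n_features, 2))        # columns: arousal, valence
Y = X @ W_true + 0.01 * rng.normal(size=(n_songs, 2))

# Fit the regression: append a bias column, solve ordinary least squares.
Xb = np.hstack([X, np.ones((n_songs, 1))])
W_fit, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

def predict_emotion(features):
    """Return the (arousal, valence) estimate for one feature vector."""
    fb = np.append(features, 1.0)
    return fb @ W_fit
```

Given trained weights, each incoming song's feature vector is projected to a point in the arousal/valence plane, which is how a dataset of regional radio broadcasts can be summarized as an emotion propensity per region.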

Description
This item is available only to currently enrolled UTSA students, faculty or staff.
Department
Electrical and Computer Engineering