
Ernesto Esmeral-Romero, Gustavo Barraza Mercado, Johan Mardini

Abstract

Introduction: Music is a natural medium for emotional expression and regulation in everyday life. Recent studies highlight its potential for non-intrusive emotion detection.


Objective: To develop a system that detects users' emotional states from their behavior while listening to music, with the goal of achieving accurate and automated emotion classification.


Method: Music listening data were processed using normalization, feature extraction, and feature selection. Both supervised and unsupervised machine learning algorithms were applied and evaluated.


Results: The proposed system achieved an average classification accuracy of 82.4%, with a precision of 80.9% and a recall of 81.6% across all evaluation scenarios. Feature selection methods, such as Chi-Square and Relief, reduced computation time by approximately 25% while improving model generalization.
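The abstract does not include the authors' implementation, but the Chi-Square feature selection it reports can be sketched with scikit-learn; the feature matrix and four-class emotion labels below are synthetic stand-ins for the music-listening data:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

rng = np.random.default_rng(0)
# Toy stand-in for normalized listening features (non-negative, as chi2 requires)
X = rng.random((200, 10))
y = rng.integers(0, 4, size=200)  # four hypothetical emotion classes

# Keep the 5 features with the highest chi-square scores against the labels
selector = SelectKBest(chi2, k=5)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)  # (200, 5)
```

Shrinking the feature matrix this way, before training, is what yields the kind of reduced computation time the authors report.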


Conclusions: Music-listening behavior is an effective source for emotion detection without invasive measurements. The proposed system can support future intelligent, emotion-aware applications.



How to Cite
Esmeral-Romero, E., Barraza Mercado, G., & Mardini, J. (2025). Automatic Detection of Users’ Emotional States Using Music Listening Data. CESTA, 6(2). https://doi.org/10.17981/cesta.06.02.2025.04
Section
Articles