Graph Neural Network for Music Style Classification
DOI: https://doi.org/10.71465/fair142

Keywords: Graph Neural Networks, Music Style Classification, Symbolic Music, Deep Learning, Structural Representation, Music Information Retrieval

Abstract
Music style classification plays a fundamental role in music recommendation, retrieval, and organization systems. Traditional classification models rely primarily on audio features or symbolic representations, such as mel-frequency cepstral coefficients or Musical Instrument Digital Interface (MIDI) sequences. However, these models often ignore the rich structural and relational information inherent in musical compositions. This study proposes a novel graph neural network (GNN)-based framework for music style classification that represents each piece of music as a graph, capturing the relationships among notes, chords, and temporal transitions. By modeling these components as interconnected nodes, the GNN learns stylistic features that extend beyond local patterns, including harmonic progressions, motif repetitions, and inter-note dependencies.
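The abstract does not specify how the note-level graph is constructed, so the following is only an illustrative sketch: assuming each symbolic note is a hypothetical `(onset, pitch)` pair, it links notes that share an onset (a crude chord relation) and notes at consecutive onsets (a temporal transition).

```python
from collections import defaultdict

def build_note_graph(notes):
    """Build an edge set over symbolic notes (illustrative sketch only).

    `notes` is an assumed list of (onset_time, pitch) tuples; the paper's
    actual node features and edge types are not given in the abstract.
    Nodes are note indices. Edges connect notes sounding at the same
    onset (chord relation) and notes at consecutive onsets (temporal
    transition).
    """
    edges = set()
    by_onset = defaultdict(list)
    for i, (onset, _pitch) in enumerate(notes):
        by_onset[onset].append(i)
    # Chord edges: notes sharing an onset time.
    for group in by_onset.values():
        for a in group:
            for b in group:
                if a < b:
                    edges.add((a, b))
    # Temporal edges: every note at one onset to every note at the next.
    onsets = sorted(by_onset)
    for t0, t1 in zip(onsets, onsets[1:]):
        for a in by_onset[t0]:
            for b in by_onset[t1]:
                edges.add((a, b))
    return edges

# Example: a C major triad followed by a single note.
notes = [(0.0, 60), (0.0, 64), (0.0, 67), (1.0, 72)]
edges = build_note_graph(notes)
```

For the four notes above this yields three chord edges among the triad and three temporal edges into the final note, giving a small graph a GNN could consume.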
To enhance model performance, the framework incorporates a dual-graph architecture combining intra-piece and inter-piece structures, enabling the GNN to generalize across compositions while retaining individual stylistic identities. Experimental results on publicly available symbolic music datasets demonstrate that the proposed model outperforms traditional convolutional neural network (CNN)- and recurrent neural network (RNN)-based models in classification accuracy and robustness across multiple musical genres. These findings highlight the potential of graph-based deep learning for extracting structural patterns critical to music understanding and classification.
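The abstract does not describe the GNN layers themselves; a minimal sketch of one mean-aggregation message-passing layer followed by mean-pooling readout for graph-level style classification might look like the following. All shapes, weights, and the aggregation scheme here are assumptions, not the authors' architecture.

```python
import numpy as np

def gnn_layer(X, A, W):
    """One mean-aggregation message-passing layer (illustrative only).

    X: node features, shape (n, d_in)
    A: adjacency matrix with self-loops, shape (n, n)
    W: learnable weights, shape (d_in, d_out)
    """
    deg = A.sum(axis=1, keepdims=True)
    H = (A @ X) / deg              # average each node's neighborhood
    return np.maximum(H @ W, 0.0)  # ReLU nonlinearity

def classify_graph(X, A, W, W_out):
    """Graph-level prediction: message passing, mean pooling, linear head."""
    H = gnn_layer(X, A, W)
    g = H.mean(axis=0)             # readout: pool node embeddings
    logits = g @ W_out
    return int(np.argmax(logits))  # predicted style class index

# Tiny demo with random (untrained) weights and an assumed 3-class setup.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))        # 4 notes, 8 input features each
A = np.ones((4, 4))                # fully connected graph incl. self-loops
W = rng.normal(size=(8, 16))
W_out = rng.normal(size=(16, 3))   # 3 hypothetical style classes
pred = classify_graph(X, A, W, W_out)
```

In practice such a model would be built with a GNN library and trained end-to-end; this sketch only shows the neighborhood-aggregation and pooling steps the abstract alludes to.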
License
Copyright (c) 2025 Katarzyna Nowak, Tomasz Zieliński (Author)

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.