Graph Neural Network for Music Style Classification

Authors

  • Katarzyna Nowak, Warsaw School of Computer Science, Lewartowskiego 17, 00-169 Warsaw, Poland
  • Tomasz Zieliński, Warsaw School of Computer Science, Lewartowskiego 17, 00-169 Warsaw, Poland

DOI:

https://doi.org/10.71465/fair142

Keywords:

Graph Neural Networks, Music Style Classification, Symbolic Music, Deep Learning, Structural Representation, Music Information Retrieval

Abstract

Music style classification plays a fundamental role in music recommendation, retrieval, and organization systems. Traditional classification models rely primarily on audio features, such as mel-frequency cepstral coefficients (MFCCs), or on symbolic representations, such as Musical Instrument Digital Interface (MIDI) sequences. However, these models often ignore the rich structural and relational information inherent in musical compositions. This study proposes a novel graph neural network (GNN)-based framework for music style classification that represents each piece of music as a graph, capturing the relationships among notes, chords, and temporal transitions. By modeling these components as interconnected nodes, the GNN learns stylistic features, such as harmonic progressions, motif repetitions, and inter-note dependencies, that extend beyond local patterns.
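
The abstract does not specify the exact graph construction, but the idea of turning a symbolic piece into a graph of notes, chords, and temporal transitions can be illustrated with a minimal sketch. The sketch below is an assumption-laden reading, not the authors' method: it uses pretty_midi and PyTorch Geometric, takes normalized pitch and duration as node features, and draws edges between temporally adjacent notes (an assumed 0.5 s gap threshold) and between co-sounding notes as a rough proxy for chords.

```python
# Hypothetical graph construction for one MIDI piece (illustrative only;
# node features, edge rules, and the 0.5 s threshold are assumptions).
import pretty_midi
import torch
from torch_geometric.data import Data

def midi_to_graph(path: str) -> Data:
    midi = pretty_midi.PrettyMIDI(path)
    notes = sorted(
        (n for inst in midi.instruments if not inst.is_drum for n in inst.notes),
        key=lambda n: n.start,
    )
    # Node features: normalized pitch and duration in seconds.
    x = torch.tensor(
        [[n.pitch / 127.0, n.end - n.start] for n in notes], dtype=torch.float
    )
    edges = []
    for i, a in enumerate(notes):
        for j, b in enumerate(notes):
            if i == j:
                continue
            # Temporal-transition edge: b starts shortly after a ends.
            if 0.0 <= b.start - a.end < 0.5:
                edges.append((i, j))
            # Co-sounding (chord-like) edge: the two notes overlap in time.
            elif a.start < b.end and b.start < a.end:
                edges.append((i, j))
    edge_index = torch.tensor(edges, dtype=torch.long).t().contiguous()
    return Data(x=x, edge_index=edge_index)
```

Each piece then becomes one `Data` object, ready to be batched with a PyTorch Geometric `DataLoader` for graph-level classification.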

To enhance model performance, the framework incorporates a dual-graph architecture that combines intra-piece and inter-piece structures, enabling the GNN to generalize across compositions while retaining individual stylistic identities. Experimental results on publicly available symbolic music datasets demonstrate that the proposed model outperforms traditional convolutional neural network (CNN)- and recurrent neural network (RNN)-based models in classification accuracy and robustness across multiple musical genres. These findings highlight the potential of graph-based deep learning for extracting the structural patterns critical to music understanding and classification.
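
The page gives no architectural details beyond the dual-graph description, so the following is a hedged sketch of one plausible realization: an intra-piece GNN encodes each note graph into a piece embedding, and an inter-piece graph (here, an assumed k-nearest-neighbor similarity graph over those embeddings) lets a second GNN share information across compositions before classification. The choice of GCN layers, the k-NN construction, and all layer sizes are assumptions for illustration.

```python
# Hypothetical dual-graph classifier (a sketch, not the paper's model).
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool, knn_graph

class DualGraphClassifier(torch.nn.Module):
    def __init__(self, in_dim=2, hid=64, num_styles=10, k=5):
        super().__init__()
        self.intra1 = GCNConv(in_dim, hid)   # intra-piece message passing
        self.intra2 = GCNConv(hid, hid)
        self.inter = GCNConv(hid, hid)       # inter-piece message passing
        self.head = torch.nn.Linear(hid, num_styles)
        self.k = k

    def forward(self, x, edge_index, batch):
        # Encode each piece's note graph.
        h = F.relu(self.intra1(x, edge_index))
        h = F.relu(self.intra2(h, edge_index))
        z = global_mean_pool(h, batch)       # one embedding per piece
        # Assumed inter-piece structure: k-NN graph over piece embeddings.
        inter_edges = knn_graph(z, k=min(self.k, z.size(0) - 1))
        z = F.relu(self.inter(z, inter_edges))
        return self.head(z)                  # style logits per piece
```

In a training loop, `x`, `edge_index`, and `batch` would come from batching the per-piece graphs built in the earlier sketch, with a standard cross-entropy loss over the style logits.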

Published

2025-04-05