



Journal Article

TenseMusic: An automatic prediction model for musical tension


Barchet, Alice Vivien
Department of Cognitive Neuropsychology, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Rimmele, Johanna Maria
Department of Cognitive Neuropsychology, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Fulltext (public)

Publisher version, 2MB


Barchet, A. V., Rimmele, J. M., & Pelofi, C. (2024). TenseMusic: An automatic prediction model for musical tension. PLOS ONE, 19(1): e0296385. doi:10.1371/journal.pone.0296385.

Cite as: https://hdl.handle.net/21.11116/0000-000E-569D-1
The perception of tension and release dynamics constitutes one of the essential aspects of music listening. However, modeling musical tension to predict listeners' perception has been a challenge for researchers. Seminal work demonstrated that tension is reported consistently by listeners and can be accurately predicted from a discrete set of musical features, combining them into a weighted sum of slopes reflecting their combined dynamics over time. However, previous modeling approaches lack an automatic pipeline for feature extraction that would make them widely accessible to researchers in the field. Here, we present TenseMusic: an open-source automatic predictive tension model that operates with musical audio as its only input. Using state-of-the-art music information retrieval (MIR) methods, it automatically extracts a set of six features (i.e., loudness, pitch height, tonal tension, roughness, tempo, and onset frequency) to use as predictors of musical tension. The algorithm was optimized using Lasso regression to best predict behavioral tension ratings collected on 38 Western classical musical pieces. Its performance was then tested by assessing the correlation between the predicted tension and unseen continuous behavioral tension ratings, yielding large mean correlations between ratings and predictions approximating r = .60 across all pieces. We hope that providing the research community with this well-validated open-source tool for predicting musical tension will motivate further work in music cognition and help elucidate the neural and cognitive correlates of tension dynamics across various musical genres and cultures.
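The modeling idea described in the abstract — predicting continuous tension ratings from a weighted sum of feature slopes, with weights learned by Lasso regression — can be sketched as follows. This is a minimal illustration, not the TenseMusic implementation: the feature matrix and ratings below are synthetic stand-ins, and the actual model extracts the six features (loudness, pitch height, tonal tension, roughness, tempo, onset frequency) from audio with MIR tools.

```python
import numpy as np
from sklearn.linear_model import Lasso
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical stand-in for the six features sampled over time
# (loudness, pitch height, tonal tension, roughness, tempo, onset frequency).
n_samples, n_features = 500, 6
features = rng.normal(size=(n_samples, n_features))

# Per-feature slopes over time (finite differences), following the
# weighted-sum-of-slopes formulation described in the abstract.
slopes = np.gradient(features, axis=0)

# Synthetic "behavioral tension ratings": a noisy weighted sum of slopes.
true_weights = np.array([0.8, 0.5, 0.6, 0.3, 0.2, 0.4])
ratings = slopes @ true_weights + rng.normal(scale=0.1, size=n_samples)

# Lasso regression learns sparse feature weights from the ratings.
model = Lasso(alpha=0.01)
model.fit(slopes, ratings)
predicted = model.predict(slopes)

# Evaluate as in the paper: correlate predicted and observed tension.
r, _ = pearsonr(predicted, ratings)
print(round(r, 2))
```

In the published model this fit is performed on ratings for 38 classical pieces and evaluated on held-out pieces; the in-sample correlation shown here is therefore optimistic compared with the reported out-of-sample r ≈ .60.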