Computer-based music feature analysis mirrors human perception and can be used to measure individual music preference

Kai R. Fricke, David M. Greenberg, Peter J. Rentfrow, Philipp Yorck Herzberg

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

This paper explores the measurement of individual music feature preference using human- and computer-rated music excerpts. In the first of two studies, we correlated human ratings of song excerpts with computer-extracted music features and found good agreement, as well as similar criterion validity with preferences for musical styles (the MUSIC model, mean r = 0.88). In a second online study (N = 2118), using principal component analysis (PCA) and Procrustes analysis, we found that measured music preference showed the same three-component structure established in previous research (Arousal, Valence, Depth), regardless of whether the music pieces were rated by humans or by the ESSENTIA music analysis software. Our results suggest that computer-extracted music features can be used to assess individual music preference.
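The Procrustes comparison of component structures described above can be sketched as follows. This is an illustrative reconstruction with simulated loading matrices, not the authors' actual data or analysis pipeline: Procrustes analysis aligns one PCA loading matrix to another by an optimal rotation, and the residual disparity quantifies how similar the two component structures are.

```python
# Illustrative sketch: comparing two PCA component-loading matrices with
# Procrustes analysis, in the spirit of the human- vs. computer-rated
# comparison. The loading matrices here are simulated, not the study's data.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)

# Simulated component loadings: 14 preference items x 3 components
# (standing in for an Arousal/Valence/Depth structure).
human_loadings = rng.normal(size=(14, 3))

# A second solution with the same structure up to an orthogonal rotation
# (PCA solutions are only identified up to rotation).
rotation, _ = np.linalg.qr(rng.normal(size=(3, 3)))
computer_loadings = human_loadings @ rotation

# Procrustes superimposition standardizes both matrices and finds the
# best-fitting rotation; the disparity (sum of squared differences after
# alignment) measures how well the structures match.
_, _, disparity_same = procrustes(human_loadings, computer_loadings)

# An unrelated random matrix should align much worse.
_, _, disparity_rand = procrustes(human_loadings, rng.normal(size=(14, 3)))

print(f"disparity (rotated copy): {disparity_same:.2e}")
print(f"disparity (random):       {disparity_rand:.2e}")
```

A near-zero disparity for the rotated copy indicates the two solutions share the same component structure, while the random matrix yields a substantially larger disparity.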

Original language: English
Pages (from-to): 94-102
Number of pages: 9
Journal: Journal of Research in Personality
Volume: 75
DOIs
State: Published - Aug 2018
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2018 Elsevier Inc.

Keywords

  • Music features
  • Music information retrieval
  • Music perception
  • Music preference
  • Music taste
