Abstract
A system is presented for detecting common gestures, musical intentions, and emotions of pianists in real time using kinesthetic data captured by wireless motion sensors. The algorithm can detect six performer-intended emotions, such as cheerful, mournful, and vigorous, based solely on low-sample-rate motion-sensor data. It can be trained in real time or can operate on previously collected training sets. Based on the classification, the system offers feedback by mapping the emotions to a color set and presenting them as a flowing emotional spectrum in the background of a piano roll. It also displays a small circular object floating in the emotion space of Hevner's adjective circle, allowing a performer to receive real-time feedback on the emotional content conveyed in the performance. The system was trained and tested on a group of pianists using a standard training/testing paradigm; it detected and displayed structures and emotions, and it yielded insightful results and conclusions.
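As a rough illustration of the feedback mapping described in the abstract, the sketch below assigns each detected emotion label a display color and blends the palette by classifier confidence. Only "cheerful", "mournful", and "vigorous" are named in the source; the remaining labels, all RGB values, and the blending scheme are assumptions for illustration, not the paper's actual mapping.

```python
# Hypothetical sketch of the emotion-to-color feedback mapping.
# Labels beyond the three named in the abstract, and all color
# values, are assumptions.

EMOTION_COLORS = {
    "cheerful": (255, 215, 0),   # warm yellow (assumed)
    "mournful": (70, 90, 160),   # muted blue (assumed)
    "vigorous": (200, 30, 30),   # strong red (assumed)
    # ... three further labels would complete the six-emotion set
}

def blend_colors(weights):
    """Blend the palette by per-emotion classifier confidence,
    yielding one color for the piano-roll background frame."""
    r = g = b = 0.0
    total = sum(weights.values()) or 1.0
    for emotion, w in weights.items():
        cr, cg, cb = EMOTION_COLORS[emotion]
        r += w * cr
        g += w * cg
        b += w * cb
    return (round(r / total), round(g / total), round(b / total))

# Example: a frame classified as mostly cheerful with some vigor.
print(blend_colors({"cheerful": 0.7, "vigorous": 0.3}))
```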
| Original language | English |
|---|---|
| Pages (from-to) | 21-28 |
| Number of pages | 8 |
| Journal | Proceedings of the International Conference on New Interfaces for Musical Expression |
| State | Published - 2013 |
| Externally published | Yes |
| Event | 13th International Conference on New Interfaces for Musical Expression, NIME 2013, Daejeon, Republic of Korea, 27 May 2013 → 30 May 2013 |
Bibliographical note
Publisher Copyright: © 2013, Steering Committee of the International Conference on New Interfaces for Musical Expression. All rights reserved.
Keywords
- Computer Music
- Expressive Piano Performance
- IMUs
- Machine Learning
- Motion Sensors
- Music and Emotion