The EU-Emotion Voice Database

Amandine Lassalle, Delia Pigat, Helen O’Reilly, Steve Berggren, Shimrit Fridenson-Hayo, Shahar Tal, Sigrid Elfström, Anna Råde, Ofer Golan, Sven Bölte, Simon Baron-Cohen, Daniel Lundqvist

Research output: Contribution to journal › Article › peer-review


Abstract

In this study, we report the validation results of the EU-Emotion Voice Database, an emotional voice database available for scientific use, containing a total of 2,159 validated emotional voice stimuli. The EU-Emotion voice stimuli consist of audio-recordings of 54 actors, each uttering sentences with the intention of conveying 20 different emotional states (plus neutral). The database is organized in three separate emotional voice stimulus sets in three different languages (British English, Swedish, and Hebrew). These three sets were independently validated by large pools of participants in the UK, Sweden, and Israel. Participants’ validation of the stimuli included emotion categorization accuracy and ratings of emotional valence, intensity, and arousal. Here we report the validation results for the emotional voice stimuli from each site and provide validation data to download as a supplement, so as to make these data available to the scientific community. The EU-Emotion Voice Database is part of the EU-Emotion Stimulus Set, which in addition contains stimuli of emotions expressed in the visual modality (by facial expression, body language, and social scene) and is freely available to use for academic research purposes.

Original language: English
Pages (from-to): 493-506
Number of pages: 14
Journal: Behavior Research Methods
Volume: 51
Issue number: 2
DOIs
State: Published - 15 Apr 2019

Bibliographical note

Publisher Copyright:
© 2018, The Author(s).

Funding

Author note: The research leading to these results received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under Grant Agreement No. 289021 (www.asc-inclusion.eu). S.B. was supported by the Swedish Research Council (Grant No. 523-2009-7054), and S.B.-C. was supported by the Autism Research Trust, the MRC, the Wellcome Trust, and the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care East of England, at the Cambridgeshire and Peterborough NHS Foundation Trust. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR, or the Department of Health.

Funders and funder numbers:

• Autism Research Trust
• Seventh Framework Programme (FP7/2007-2013): 289021
• Wellcome Trust
• Medical Research Council: G0600977
• National Institute for Health Research
• Vetenskapsrådet: 523-2009-7054

Keywords

• Emotion perception
• Multisite validation
• Voice stimuli set
