Nonlinguistic vocalizations from online amateur videos for emotion research: A validated corpus

Authors

Summary, in English

This study introduces a corpus of 260 naturalistic human nonlinguistic vocalizations representing nine emotions: amusement, anger, disgust, effort, fear, joy, pain, pleasure, and sadness. The recognition accuracy in a rating task varied greatly per emotion, from <40% for joy and pain, to >70% for amusement, pleasure, fear, and sadness. In contrast, the raters’ linguistic–cultural group had no effect on recognition accuracy: The predominantly English-language corpus was classified with similar accuracies by participants from Brazil, Russia, Sweden, and the UK/USA. Supervised random forest models classified the sounds as accurately as the human raters. The best acoustic predictors of emotion were pitch, harmonicity, and the spacing and regularity of syllables. This corpus of ecologically valid emotional vocalizations can be filtered to include only sounds with high recognition rates, in order to study reactions to emotional stimuli of known perceptual types (reception side), or can be used in its entirety to study the association between affective states and vocal expressions (production side).
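As a point of reference for the classification approach mentioned in the abstract, the sketch below shows one way a supervised random forest could be trained on acoustic features such as pitch, harmonicity, and syllable spacing/regularity. This is not the authors' code: the feature values and emotion labels are synthetic placeholders, and the feature names are assumptions drawn from the abstract.

# Minimal sketch (assumed setup, not the published analysis): random forest
# classification of 260 vocalizations from four hypothetical acoustic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_sounds = 260

# Placeholder acoustic features: pitch (Hz), harmonicity, mean syllable
# spacing (s), syllable regularity (arbitrary units). Real values would come
# from acoustic analysis of the corpus recordings.
X = np.column_stack([
    rng.uniform(100, 600, n_sounds),   # pitch
    rng.uniform(0, 1, n_sounds),       # harmonicity
    rng.uniform(0.05, 0.8, n_sounds),  # syllable spacing
    rng.uniform(0, 1, n_sounds),       # syllable regularity
])
emotions = ["amusement", "anger", "disgust", "effort", "fear",
            "joy", "pain", "pleasure", "sadness"]
y = rng.choice(emotions, n_sounds)     # placeholder emotion labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print("Mean cross-validated accuracy:",
      cross_val_score(clf, X, y, cv=5).mean())

# Feature importances indicate which acoustic predictors drive classification.
clf.fit(X, y)
for name, imp in zip(["pitch", "harmonicity", "spacing", "regularity"],
                     clf.feature_importances_):
    print(f"{name}: {imp:.2f}")

With synthetic random labels the accuracy stays near chance; on the actual corpus the abstract reports that such models matched the accuracy of human raters.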

Department(s)

Publication year

2017-04-29

Language

English

Pages

758-771

Publication/Journal/Series

Behavior Research Methods

Volume

49

Issue

2

Document type

Journal article

Publisher

Springer

Subject

  • Psychology (excluding Applied Psychology)

Keywords

  • Emotion
  • Nonlinguistic vocalizations
  • Naturalistic vocalizations
  • Acoustic analysis

Status

Published

Research group

  • LUCS Cognitive Zoology Group

ISBN/ISSN/Other

  • ISSN: 1554-3528