Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset

Desmond C. Ong, Zhengxuan Wu, Tan Zhi-Xuan, Marianne Reddan, Isabella Kahhale, Alison Mattek, Jamil Zaki

Research output: Contribution to journal › Article › peer-review


Abstract

Human emotions unfold over time, and affective computing research must prioritize capturing this crucial component of real-world affect. Modeling dynamic emotional stimuli requires solving the twin challenges of time-series modeling and of collecting high-quality time-series datasets. We begin by assessing the state of the art in time-series emotion recognition, and we review contemporary time-series approaches in affective computing, including discriminative and generative models. We then introduce the first version of the Stanford Emotional Narratives Dataset (SENDv1): a set of rich, multimodal videos of self-paced, unscripted emotional narratives, annotated for emotional valence over time. The complex narratives and naturalistic expressions in this dataset provide a challenging test for contemporary time-series emotion recognition models. We demonstrate several baseline and state-of-the-art modeling approaches on the SEND, including a Long Short-Term Memory model and a multimodal Variational Recurrent Neural Network, which perform comparably to the human benchmark. We end by discussing the implications for future research in time-series affective computing.
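To make the recurrent modeling setup concrete, the sketch below shows a minimal LSTM cell applied to a per-timestep feature sequence, emitting one valence rating per step squashed to [-1, 1]. This is an illustrative reconstruction only, not the paper's implementation: the dimensions, weights, and function names (`lstm_step`, `predict_valence`) are hypothetical, and the weights here are random and untrained.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates stacked as [input, forget, output, candidate]."""
    H = h.shape[0]
    z = W @ x + U @ h + b                 # pre-activations, shape (4H,)
    i = 1 / (1 + np.exp(-z[0:H]))         # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:4*H])               # candidate cell state
    c_new = f * c + i * g                 # update cell memory
    h_new = o * np.tanh(c_new)            # expose gated hidden state
    return h_new, c_new

def predict_valence(features, W, U, b, w_out, b_out):
    """Run the LSTM over a (T, D) feature sequence; one valence per step."""
    H = b.shape[0] // 4
    h, c = np.zeros(H), np.zeros(H)
    valence = []
    for x in features:
        h, c = lstm_step(x, h, c, W, U, b)
        valence.append(np.tanh(w_out @ h + b_out))  # squash to [-1, 1]
    return np.array(valence)

# Toy demo with random (untrained) weights and random features.
rng = np.random.default_rng(0)
D, H, T = 8, 16, 20                       # feature dim, hidden dim, timesteps
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
w_out = rng.normal(0, 0.1, H)
feats = rng.normal(size=(T, D))
v = predict_valence(feats, W, U, b, w_out, 0.0)
print(v.shape)  # one valence prediction per timestep
```

In practice one would train such a model with a continuous-valence loss (e.g. mean squared error or a correlation-based objective) against the time-aligned annotations the dataset provides.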

Original language: English (US)
Article number: 8913483
Pages (from-to): 579-594
Number of pages: 16
Journal: IEEE Transactions on Affective Computing
Volume: 12
Issue number: 3
DOIs
State: Published - Jul 1 2021
Externally published: Yes

Keywords

  • affect sensing and analysis
  • Affective computing
  • emotional corpora
  • multi-modal recognition

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
