Multisensory processing of naturalistic objects in motion: A high-density electrical mapping and source estimation study

Daniel Senkowski, Dave Saint-Amour, Simon P. Kelly, John J. Foxe

Research output: Contribution to journal › Article

62 Scopus citations

Abstract

In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of those vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. For multisensory stimuli, the visual clip onset preceded the "splash" sound onset by 100 ms. For naturalistic objects, early multisensory integration effects beginning 120-140 ms after sound onset were observed over the posterior scalp, with distributed sources localized to occipital cortex, the temporal lobe, the insula, and the medial frontal gyrus (MFG). These effects, together with longer-latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli. Unlike naturalistic objects, non-naturalistic objects showed no early interactions; their earliest integration effects were observed 210-250 ms after sound onset and included large portions of the inferior parietal cortex (IPC). As such, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.

Original language: English (US)
Pages (from-to): 877-888
Number of pages: 12
Journal: NeuroImage
Volume: 36
Issue number: 3
DOIs
State: Published - Jul 1 2007

Keywords

  • Auditory
  • Cross-modal
  • Distributed source modeling
  • ERP
  • Visual

ASJC Scopus subject areas

  • Neurology
  • Cognitive Neuroscience