Multisensory processing of naturalistic objects in motion

A high-density electrical mapping and source estimation study

Daniel Senkowski, Dave Saint-Amour, Simon P. Kelly, John J. Foxe

Research output: Contribution to journal › Article

59 Citations (Scopus)

Abstract

In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide complementary information about their location and motion. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. For multisensory stimuli, the onset of the visual clip preceded the "splash" sound by 100 ms. For naturalistic objects, early multisensory integration effects beginning 120-140 ms after sound onset were observed over posterior scalp, with distributed sources localized to occipital cortex, the temporal lobe, the insula, and the medial frontal gyrus (MFG). These effects, together with longer-latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli: unlike for naturalistic objects, no early interactions were found, and the earliest integration effects were observed 210-250 ms after sound onset, with sources including large portions of the inferior parietal cortex (IPC). Thus, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.
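
Integration effects of the kind described above are conventionally quantified in ERP research with an additive model, contrasting the multisensory response against the sum of the unisensory responses. The sketch below illustrates that comparison; the abstract does not spell out the analysis pipeline, so the AV - (A + V) formulation, the array shapes, the 500 Hz sampling rate, and the function name are illustrative assumptions, not the authors' exact method.

```python
# A minimal sketch, assuming the standard additive-model analysis of
# multisensory ERPs: interaction = AV - (A + V). Channel count, sampling
# rate, and names are illustrative, not taken from the paper.
import numpy as np

def interaction_effect(erp_av: np.ndarray, erp_a: np.ndarray, erp_v: np.ndarray) -> np.ndarray:
    """Return the additive-model interaction AV - (A + V).

    Each input is a (channels, timepoints) trial-averaged ERP,
    baseline-corrected and time-locked to sound onset. Nonzero values
    indicate super- or sub-additive multisensory integration.
    """
    return erp_av - (erp_a + erp_v)

# Toy data: a hypothetical 128-channel high-density montage,
# 500 ms epochs sampled at 500 Hz.
rng = np.random.default_rng(0)
erp_av, erp_a, erp_v = (rng.standard_normal((128, 250)) for _ in range(3))

effect = interaction_effect(erp_av, erp_a, erp_v)

# Average the effect over the early window reported for naturalistic
# stimuli (120-140 ms after sound onset; at 500 Hz, samples 60-70).
early_window = effect[:, 60:70].mean(axis=1)
print(early_window.shape)  # (128,) -> one interaction value per channel
```

In practice such channel-wise interaction values would be tested against zero across subjects within each latency window, and the significant topographies passed to a distributed source estimator such as LAURA.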

Original language: English (US)
Pages (from-to): 877-888
Number of pages: 12
Journal: NeuroImage
Volume: 36
Issue number: 3
DOI: 10.1016/j.neuroimage.2007.01.053
ISSN: 1053-8119
Publisher: Academic Press Inc.
State: Published - Jul 1 2007
Externally published: Yes

Keywords

  • Auditory
  • Cross-modal
  • Distributed source modeling
  • ERP
  • Visual

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Neurology

Cite this

Senkowski, Daniel; Saint-Amour, Dave; Kelly, Simon P.; Foxe, John J. Multisensory processing of naturalistic objects in motion: A high-density electrical mapping and source estimation study. In: NeuroImage, Vol. 36, No. 3, 01.07.2007, p. 877-888. https://doi.org/10.1016/j.neuroimage.2007.01.053
