TY - JOUR
T1 - Multisensory processing of naturalistic objects in motion
T2 - A high-density electrical mapping and source estimation study
AU - Senkowski, Daniel
AU - Saint-Amour, Dave
AU - Kelly, Simon P.
AU - Foxe, John J.
N1 - Funding Information:
This work was supported by grants from the U.S. National Institute of Mental Health (MH65350) and the National Institute on Aging (AG22696) to Dr. J.J. Foxe. We would like to express our sincere appreciation to Marina Shpaner and Jennifer Montesi for their technical assistance, and to two anonymous reviewers for their helpful comments on this article. The Cartool software (http://brainmapping.unige.ch/Cartool.php) was programmed by Denis Brunet of the Functional Brain Mapping Laboratory, Geneva, Switzerland, and is supported by the Center for Biomedical Imaging (CIBM) of Geneva and Lausanne. We are also very grateful to Martin Winters of the North Carolina School of Science and Mathematics for permission to use the splash clips for this experiment. The original clip and further clips can be found at: http://courses.ncssm.edu/hsi/splashes/video/video.htm
PY - 2007/7/1
Y1 - 2007/7/1
AB - In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of those vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. For multisensory stimuli, the onset of the visual clip preceded the onset of the "splash" sound by 100 ms. For naturalistic objects, early multisensory integration effects beginning 120-140 ms after sound onset were observed over posterior scalp, with distributed sources localized to occipital cortex, the temporal lobe, insula, and medial frontal gyrus (MFG). These effects, together with longer-latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli. Unlike naturalistic objects, no early interactions were found for non-naturalistic objects. The earliest integration effects for non-naturalistic stimuli were observed 210-250 ms after sound onset, with sources including large portions of the inferior parietal cortex (IPC). As such, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.
KW - Auditory
KW - Cross-modal
KW - Distributed source modeling
KW - ERP
KW - Visual
UR - http://www.scopus.com/inward/record.url?scp=34250335733&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34250335733&partnerID=8YFLogxK
U2 - 10.1016/j.neuroimage.2007.01.053
DO - 10.1016/j.neuroimage.2007.01.053
M3 - Article
C2 - 17481922
AN - SCOPUS:34250335733
SN - 1053-8119
VL - 36
SP - 877
EP - 888
JO - NeuroImage
JF - NeuroImage
IS - 3
ER -