Audio-visual multisensory integration in superior parietal lobule revealed by human intracranial recordings

Sophie Molholm, Pejman Sehatpour, Ashesh D. Mehta, Marina Shpaner, Manuel Gomez-Ramirez, Stephanie Ortigue, Jonathan P. Dyke, Theodore H. Schwartz, John J. Foxe

Research output: Contribution to journal › Article › peer-review

152 Scopus citations

Abstract

Intracranial recordings from three human subjects provide the first direct electrophysiological evidence for audio-visual multisensory processing in the human superior parietal lobule (SPL). Auditory and visual sensory inputs project to the same highly localized region of the parietal cortex with auditory inputs arriving considerably earlier (30 ms) than visual inputs (75 ms). Multisensory integration processes in this region were assessed by comparing the response to simultaneous audio-visual stimulation with the algebraic sum of responses to the constituent auditory and visual unisensory stimulus conditions. Significant integration effects were seen with almost identical morphology across the three subjects, beginning between 120 and 160 ms. These results are discussed in the context of the role of SPL in supramodal spatial attention and sensory-motor transformations.
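As a reading aid, the integration criterion described above is the standard additive model: the evoked response to simultaneous audio-visual (AV) stimulation is compared against the algebraic sum of the unisensory auditory (A) and visual (V) responses, and deviations from zero in the resulting difference wave are taken as integration effects. The sketch below illustrates that comparison on simulated data; the sampling rate, trial counts, and variable names are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

# Minimal sketch of the additive-model criterion: under independent
# processing, the AV response should equal A + V, so the difference
# wave AV - (A + V) indexes multisensory integration.
# All parameters below are assumptions for illustration only.

fs = 1000                       # assumed sampling rate, Hz
t = np.arange(-100, 400) / fs   # peristimulus time axis, seconds

rng = np.random.default_rng(0)
n_trials = 100

# Simulated single-trial responses (placeholders for real recordings).
erp_a = rng.normal(0.0, 1.0, (n_trials, t.size))    # auditory-alone
erp_v = rng.normal(0.0, 1.0, (n_trials, t.size))    # visual-alone
erp_av = rng.normal(0.0, 1.0, (n_trials, t.size))   # simultaneous AV

# Trial-averaged evoked responses per condition.
mean_a, mean_v, mean_av = erp_a.mean(0), erp_v.mean(0), erp_av.mean(0)

# Additive-model difference wave.
msi_difference = mean_av - (mean_a + mean_v)

# The abstract reports integration effects beginning between 120 and 160 ms.
window = (t >= 0.120) & (t <= 0.160)
print("Mean AV - (A + V) in 120-160 ms window:",
      msi_difference[window].mean())
```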

Original language: English (US)
Pages (from-to): 721-729
Number of pages: 9
Journal: Journal of Neurophysiology
Volume: 96
Issue number: 2
DOIs
State: Published - 2006
Externally published: Yes

ASJC Scopus subject areas

  • General Neuroscience
  • Physiology

