Distinct nature of directional signals among parietal cortical areas during visual guidance

Emad N. Eskandar, John A. Assad

Research output: Contribution to journal › Article


Abstract

We examined neuronal signals in the monkey medial superior temporal area (MST), the medial intraparietal area (MIP), and the lateral intraparietal area (LIP) during visually guided hand movements. Two animals were trained to use a joystick to guide a spot to a target. Many neurons responded in a direction-selective manner in this guidance task. We tested whether the direction selectivity depended on the direction of the stimulus spot or the direction of the hand movement. First, in some trials, the moving spot disappeared transiently. Second, the mapping between the hand direction and the spot direction was reversed on alternate blocks of trials. Third, we recorded the spot's movement while the animals moved the joystick and then played back that movement while the animals fixated without moving the joystick. Neurons in the three parietal areas conveyed distinct directional information. MST neurons were active and directional only on visible trials in both joystick-movement mode and playback mode and were not affected by the direction of hand movement. MIP neurons were mainly directional with respect to the hand movement, although some MIP neurons were also selective for stimulus direction. MIP neurons were much less active in playback mode. LIP neurons were active and directional in both joystick-movement mode and playback mode. Directional signals in LIP were unrelated to planning saccades. The selectivity of LIP neurons also became evident hundreds of milliseconds before the start of movement. Since the direction of movement was consistent throughout a block of trials, these signals could provide a prediction of the upcoming direction of motion. We tested this by alternating blocks of trials in which the direction was consistent or randomized. The direction selectivity developed earlier on trials in which the upcoming direction could be predicted. 
These results suggest that LIP neurons combine "bottom-up" visual motion signals with extraretinal, predictive signals about stimulus motion.

Original language: English (US)
Pages (from-to): 1777-1790
Number of pages: 14
Journal: Journal of Neurophysiology
Volume: 88
Issue number: 4
State: Published - Oct 1 2002
Externally published: Yes


ASJC Scopus subject areas

  • Neuroscience (all)
  • Physiology
