• Publications of Jennifer M. Groh

      • Journal Articles

          • JM Groh and J Lee.
          • 2014.
          • Different Stimuli, Different Spatial Codes: A Visual Map and an Auditory Rate Code for Oculomotor Space in the Primate Superior Colliculus.
          • PLoS One
          • 9:
          • .
          Publication Description

          Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
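
          A toy sketch of how such a combined read-out could work appears below. Everything in it (the tuning shapes, the grid of preferred amplitudes, and the peakedness-based weighting) is an illustrative assumption rather than the algorithm fitted in the paper: the site of a peaked response localizes a visual target, while the overall level of a flat response localizes an auditory one.

          ```python
          import numpy as np

          preferred = np.linspace(0, 40, 81)      # map of preferred saccade amplitudes (deg)

          def visual_activity(target_deg, width=4.0):
              """Circumscribed receptive fields: a local hill of activity on the map."""
              return np.exp(-0.5 * ((preferred - target_deg) / width) ** 2)

          def auditory_activity(target_deg, span=40.0):
              """Open-ended rate code: a flat profile whose level scales with location."""
              return np.full_like(preferred, target_deg / span)

          def read_out(activity):
              """Estimate location from both the SITE and the LEVEL of activity."""
              site = np.sum(activity * preferred) / np.sum(activity)  # centroid on the map
              level = np.mean(activity) * 40.0                        # level-of-activity code
              peakedness = np.max(activity) / np.mean(activity)       # hill vs. flat profile
              w = np.clip(peakedness - 1.0, 0.0, 1.0)                 # hill -> trust the site
              return w * site + (1.0 - w) * level

          for target in (10.0, 25.0):
              v = read_out(visual_activity(target))
              a = read_out(auditory_activity(target))
              print(f"target {target:.0f} deg -> visual read-out {v:.1f}, auditory read-out {a:.1f}")
          ```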

          • NL Zucker, RM Merwin, CM Bulik, A Moskovich, JE Wildes and J Groh.
          • 2013.
          • Subjective experience of sensation in anorexia nervosa.
          • Behav Res Ther
          • 51:
          • 256-265
          • .
          Publication Description

          The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e.g., to touch, motion) to body image disturbance and temperament in adult women currently diagnosed with AN (n = 20), women with a prior history of AN who were weight restored (n = 15), and healthy controls with no eating disorder history (n = 24). Levels of sensitivity to sensation and attempts to avoid sensory experience were significantly higher in both clinical groups relative to healthy controls. Sensory sensitivity was associated with body image disturbance (r(56) = .51, p 

          • DS Pages and JM Groh.
          • 2013.
          • Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.
          • PLoS One
          • 8:
          • e72562
          • .
          Publication Description

          A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
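
          A minimal sketch of the 'guess and check' account is given below. The learning rate and trial count are arbitrary choices that happen to land in the observed 22-28% range; nothing here is fitted to the data.

          ```python
          # Hypothetical delta-rule model: post-saccade visual feedback nudges the
          # mapping used for future auditory-only saccades toward the seen location.
          true_azimuth = 0.0       # actual sound location (deg)
          visual_offset = 6.0      # feedback light displaced 6 deg from the sound
          learning_rate = 0.01     # assumed per-trial update fraction
          bias = 0.0               # learned shift applied to auditory saccades

          for trial in range(25):
              endpoint = true_azimuth + bias                        # auditory saccade lands here
              error = (true_azimuth + visual_offset) - endpoint     # feedback seen afterward
              bias += learning_rate * error                         # 'check' updates the next 'guess'

          print(f"shift after 25 exposures: {bias:.2f} deg "
                f"({100 * bias / visual_offset:.0f}% of the 6 deg mismatch)")
          ```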

          • J Lee and JM Groh.
          • 2012.
          • Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus.
          • J Neurophysiol
          • 108:
          • 227-242
          • .
          Publication Description

          Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered,

          • KG Gruters and JM Groh.
          • 2012.
          • Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus.
          • Front Neural Circuits
          • 6:
          • 96
          • .
          Publication Description

          The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC "hears" would seem to be passed both "upward" to thalamus and thence to auditory cortex and beyond, as well as "downward" via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.

          • DA Bulkin and JM Groh.
          • 2012.
          • Distribution of visual and saccade-related information in the monkey inferior colliculus.
          • Frontiers in Neural Circuits
          • 6:
          • 61
          • .
          Publication Description

          The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor associated activity was found throughout the IC (overall, 84 of 199 sites tested or 42%), but with a far reduced prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.

          • DA Bulkin and JM Groh.
          • 2012.
          • Distribution of eye position information in the monkey inferior colliculus.
          • Journal of Neurophysiology
          • 107:
          • 785-795
          • .
          Publication Description

          The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33-43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.

          • DA Bulkin and JM Groh.
          • 2011.
          • Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation.
          • J Neurophysiol
          • 105:
          • 1785-1797
          • .
          Publication Description

          We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed locations and recording nonresponsive locations to define boundaries. The results show a modest tonotopically organized region (17 of 100 recording penetration locations in 4 of 6 monkeys) surrounded by a large mass of tissue that, although vigorously responsive, showed no clear topographic arrangement (68 of 100 penetration locations). Rather, most cells in these recordings responded best to frequencies at the low end of the macaque auditory range. The remaining 15 (of 100) locations exhibited auditory responses that were not sensitive to sound frequency. Potential anatomical correlates of functionally defined regions and implications for midbrain auditory prosthetic devices are discussed.

          • JM Groh.
          • 2011.
          • Effects of Initial Eye Position on Saccades Evoked by Microstimulation in the Primate Superior Colliculus: Implications for Models of the SC Read-Out Process.
          • Front Integr Neurosci
          • 4:
          • 130
          • .
          Publication Description

          The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In this study, we tested whether initial eye position has an effect on microstimulation-evoked saccade amplitude. High (>300 Hz) and low (

          • JX Maier and JM Groh.
          • 2010.
          • Comparison of gain-like properties of eye position signals in inferior colliculus versus auditory cortex of primates.
          • Front Integr Neurosci
          • 4:
          • 121-132
          • .
          Publication Description

          We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region of auditory cortex (AC) in primates, and found stronger evidence for gain field-like interactions in the IC than in AC. In the IC, eye position signals showed both multiplicative and additive interactions with auditory responses, whereas in AC the effects were not as well predicted by a gain field model.
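
          The functional form being tested can be written out explicitly. The sketch below assumes a Gaussian auditory tuning curve and made-up coefficients, purely to illustrate what a mixed multiplicative-plus-additive eye-position interaction looks like.

          ```python
          import numpy as np

          def ic_response(sound_azimuth, eye_position, mult_coef=0.02, add_coef=0.3):
              """Gain-field form: auditory drive times an eye-position-dependent gain,
              plus an additive eye-position term. All coefficients are illustrative."""
              drive = 20.0 * np.exp(-0.5 * ((sound_azimuth - 10.0) / 15.0) ** 2)
              return drive * (1.0 + mult_coef * eye_position) + add_coef * eye_position

          for eye in (-12.0, 0.0, 12.0):
              print(f"eye at {eye:+.0f} deg -> {ic_response(10.0, eye):.1f} spikes/s")
          ```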

          • JX Maier and JM Groh.
          • 2009.
          • Multisensory guidance of orienting behavior.
          • Hear Res
          • 258:
          • 106-112
          • .
          Publication Description

          We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-centered representation of space, whereas the visual system uses an eye-centered representation of space, based on the site of activation on the retina. In addition, the visual system employs a place-coded, retinotopic map of space, whereas the auditory system's representational format is characterized by broad spatial tuning and a lack of topographical organization. A common view is that the brain needs to reconcile these differences in order to control behavior, such as orienting gaze to the location of a sound source. To accomplish this, it seems that either auditory spatial information must be transformed from a head-centered rate code to an eye-centered map to match the frame of reference used by the visual system, or vice versa. Here, we review a number of studies that have focused on the neural basis underlying such transformations in the primate auditory system. Although these studies have found some evidence for such transformations, many differences in the way the auditory and visual systems encode space exist throughout the auditory pathway. We will review these differences at the neural level, and will discuss them in relation to differences in the way auditory and visual information is used in guiding orienting movements.

          • N Kopco, IF Lin, BG Shinn-Cunningham and JM Groh.
          • 2009.
          • Reference frame of the ventriloquism aftereffect.
          • J Neurosci
          • 29:
          • 13809-13814
          • .
          Publication Description

          Seeing the image of a newscaster on a television set causes us to think that the sound coming from the loudspeaker is actually coming from the screen. How images capture sounds is mysterious because the brain uses different methods for determining the locations of visual versus auditory stimuli. The retina senses the locations of visual objects with respect to the eyes, whereas differences in sound characteristics across the ears indicate the locations of sound sources referenced to the head. Here, we tested which reference frame (RF) is used when vision recalibrates perceived sound locations. Visually guided biases in sound localization were induced in seven humans and two monkeys who made eye movements to auditory or audiovisual stimuli. On audiovisual (training) trials, the visual component of the targets was displaced laterally by 5-6 degrees. Interleaved auditory-only (probe) trials served to evaluate the effect of experience with mismatched visual stimuli on auditory localization. We found that the displaced visual stimuli induced a ventriloquism aftereffect in both humans (approximately 50% of the displacement size) and monkeys (approximately 25%), but only for locations around the trained spatial region, showing that audiovisual recalibration can be spatially specific. We tested the reference frame in which the recalibration occurs. On probe trials, we varied eye position relative to the head to dissociate head- from eye-centered RFs. Results indicate that both humans and monkeys use a mixture of the two RFs, suggesting that the neural mechanisms involved in ventriloquism occur in brain region(s) using a hybrid RF for encoding spatial information.

          • OA Mullette-Gillman, YE Cohen and JM Groh.
          • 2009.
          • Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered, reference frame.
          • Cereb Cortex
          • 19:
          • 1761-1775
          • .
          Publication Description

          The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a combination of head- and eye-centered information) (Mullette-Gillman et al. 2005). One possible explanation for this discrepancy is that sensory-related activity, which we studied previously, is hybrid, whereas motor-related activity might be eye centered. Here, we examined the reference frame of visual and auditory saccade-related activity in the lateral and medial banks of the intraparietal sulcus (areas lateral intraparietal area [LIP] and medial intraparietal area [MIP]) of 2 rhesus monkeys. We recorded from 275 single neurons as monkeys performed visual and auditory saccades from different initial eye positions. We found that both visual and auditory signals reflected a hybrid of head- and eye-centered coordinates during both target and perisaccadic task periods rather than shifting to an eye-centered format as the saccade approached. This account differs from numerous previous recording studies. We suggest that the geometry of the receptive field sampling in prior studies was biased in favor of an eye-centered reference frame. Consequently, the overall hybrid nature of the reference frame was overlooked because the non-eye-centered response patterns were not fully characterized.

          • U Werner-Reiss and JM Groh.
          • 2008.
          • A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies.
          • J Neurosci
          • 28:
          • 3747-3758
          • .
          Publication Description

          Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, in which single-unit recording studies have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivity of the methods used for assessing the representation of sound location? The sensitivity of imaging methods such as functional magnetic resonance imaging depends on the following two key aspects of the underlying neuronal population: (1) what kind of spatial sensitivity individual neurons exhibit and (2) whether neurons with similar response preferences are clustered within the brain. To address this question, we conducted a single-unit recording study in monkeys. We investigated the nature of spatial sensitivity in individual auditory cortical neurons to determine whether they have receptive fields (place code) or monotonic (rate code) sensitivity to sound azimuth. Second, we tested how strongly the population of neurons favors contralateral locations. We report here that the majority of neurons show predominantly monotonic azimuthal sensitivity, forming a rate code for sound azimuth, but that at the population level the degree of contralaterality is modest. This suggests that the weakness of the evidence for spatial sensitivity in human neuroimaging studies of auditory cortex may be attributable to limited lateralization at the population level, despite what may be considerable spatial sensitivity in individual neurons.
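
          The imaging implication can be sanity-checked with a toy population simulation; the population size, tuning slopes, and the 60/40 contralateral split below are all assumptions for illustration. Individual monotonic neurons are strongly modulated by azimuth, yet the pooled total that a coarse measure like fMRI aggregates differs only modestly between hemifields.

          ```python
          import numpy as np

          rng = np.random.default_rng(0)
          n = 200
          prefers_contra = rng.random(n) < 0.6    # modest contralateral majority (assumed)
          slopes = rng.uniform(0.05, 0.2, n)      # monotonic rate sensitivity (assumed)

          def rates(azimuth_deg):
              signed = np.where(prefers_contra, azimuth_deg, -azimuth_deg)
              return 20.0 + slopes * signed       # each neuron strongly azimuth-dependent

          left, right = rates(-90.0), rates(+90.0)
          print(f"mean single-neuron modulation: {np.mean(np.abs(right - left)):.1f} spikes/s")
          print(f"pooled population totals: {left.sum():.0f} (left) vs {right.sum():.0f} (right)")
          ```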

          • KK Porter, RR Metzger and JM Groh.
          • 2007.
          • Visual- and saccade-related signals in the primate inferior colliculus.
          • Proc Natl Acad Sci U S A
          • 104:
          • 17855-17860
          • .
          Publication Description

          The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in awake monkeys carry visual- and/or saccade-related signals in addition to their auditory responses (P

          Press coverage of this work has appeared in Scientific American (ScientificAmerican.com), Fox News (foxnews.com), the CBC radio program “Quirks and Quarks”, the Radio New Zealand program “Nights”, the Telegraph, the Italian science magazine “Newton”, LiveScience.com, and numerous other online science news websites.

          • DA Bulkin and JM Groh.
          • 2006.
          • Seeing sounds: visual and auditory interactions in the brain.
          • Curr Opin Neurobiol
          • 16:
          • 415-419
          • .
          Publication Description

          Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of the perception. Recent studies involving visual-auditory interactions have highlighted the perceptual advantages of combining information from these two modalities and have suggested that predominantly unimodal brain regions play a role in multisensory processing.

          • KK Porter and JM Groh.
          • 2006.
          • Chapter 18 The "other" transformation required for visual-auditory integration: representational format.
          • Progress in Brain Research
          • 155 B:
          • 313-323
          • .
          Publication Description

          Multisensory integration of spatial signals requires not only that stimulus locations be encoded in the same spatial reference frame, but also that stimulus locations be encoded in the same representational format. Previous studies have addressed the issue of spatial reference frame, but representational format, particularly for sound location, has been relatively overlooked. We discuss here our recent findings that sound location in the primate inferior colliculus is encoded using a "rate" code, a format that differs from the place code used for representing visual stimulus locations. Possible mechanisms for transforming signals from rate-to-place or place-to-rate coding formats are considered.

          • RR Metzger, NT Greene, KK Porter and JM Groh.
          • 2006.
          • Effects of reward and behavioral context on neural activity in the primate inferior colliculus.
          • Journal of Neuroscience
          • 26:
          • 7468-7476
          • .
          Publication Description

          Neural activity in the inferior colliculus (IC) likely plays an integral role in the processing of various auditory parameters, such as sound location and frequency. However, little is known about the extent to which IC neural activity may be influenced by the context in which sounds are presented. In this study, we examined neural activity of IC neurons in the rhesus monkey during an auditory task in which a sound served as a localization target for a saccade. Correct performance was rewarded, and the magnitude of the reward was varied in some experiments. Neural activity was also assessed during a task in which the monkey maintained fixation of a light while ignoring the sound, as well as when sounds were presented in the absence of any task. We report that neural activity increased late in the trial in the saccade task in 58% of neurons and that the level of activity throughout the trials could be modulated by reward magnitude for many neurons. The late-trial neural activity similarly increased in the fixation task in 39% of the neurons tested for this task but was not observed when sounds were presented in the absence of a behavioral task and reward. Together, these results suggest that a reward-related signal influences neural activity in the IC.

          • KK Porter, RR Metzger and JM Groh.
          • 2006.
          • Representation of eye position in primate inferior colliculus.
          • Journal of Neurophysiology
          • 95:
          • 1826-1842
          • .
          Publication Description

          We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between -24 and 24° while sounds were presented from loudspeakers at locations within that same range. Approximately 40% of our sample of 153 neurons showed statistically significant sensitivity to eye position during either the presentation of an auditory stimulus or in the absence of sound (Bonferroni corrected P < 0.05). The representation for eye position was predominantly monotonic and favored contralateral eye positions. Eye-position sensitivity was more prevalent among neurons without sound-location sensitivity: about half of neurons that were insensitive to sound location were sensitive to eye position, whereas only about one-quarter of sound-location-sensitive neurons were also sensitive to eye position. Our findings suggest that sound location and eye position are encoded using independent but overlapping rate codes at the level of the IC. The use of a common format has computational advantages for integrating these two signals. The differential distribution of eye-position sensitivity and sound-location sensitivity suggests that this process has begun by the level of the IC but is not yet complete at this stage. We discuss how these signals might fit into Groh and Sparks' vector subtraction model for coordinate transformations.

          • U Werner-Reiss, KK Porter, AM Underhill and JM Groh.
          • 2006.
          • Long lasting attenuation by prior sounds in auditory cortex of awake primates.
          • Experimental Brain Research
          • 168:
          • 272-276
          • .
          Publication Description

          How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous sound. We found that responses to the second sound of a two-sound sequence were generally attenuated compared to the response that sound evoked when it was presented first. The attenuation remained evident at the population level even out to inter-stimulus intervals (ISIs) of 5 s, although it was of modest size for ISIs > 2 s. Behavioral context (performance versus non-performance of a visual fixation task during sound presentation) did not influence the results. The long time course of the first sound's influence suggests that, under natural conditions, neural responses in auditory cortex are rarely governed solely by the current sound.

          • OA Mullette-Gillman, YE Cohen and JM Groh.
          • 2005.
          • Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus.
          • Journal of Neurophysiology
          • 94:
          • 2331-2352
          • .
          Publication Description

          The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be modulated by the locations of visual targets, 19% by auditory targets, and 9% by both visual and auditory targets. The reference frame for both visual and auditory receptive fields ranged along a continuum between eye- and head-centered reference frames with ∼10% of auditory and 33% of visual neurons having receptive fields that were more consistent with an eye- than a head-centered frame of reference and 23 and 18% having receptive fields that were more consistent with a head- than an eye-centered frame of reference, leaving a large fraction of both visual and auditory response patterns inconsistent with both head- and eye-centered reference frames. The results were similar to the reference frame we have previously found for auditory stimuli in the inferior colliculus and core auditory cortex. The correspondence between the visual and auditory receptive fields of individual neurons was weak. Nevertheless, the visual and auditory responses were sufficiently well correlated that a simple one-layer network constructed to calculate target location from the activity of the neurons in our sample performed successfully for auditory targets even though the weights were fit based only on the visual responses. We interpret these results as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.
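
          The one-layer read-out can be sketched as a least-squares decoder. The response model below, in which visual and auditory responses share per-neuron tuning weights plus independent noise, is a stand-in assumption rather than the recorded data; the point is that weights fit only on visual responses still decode the auditory ones.

          ```python
          import numpy as np

          rng = np.random.default_rng(1)
          targets = np.linspace(-24, 24, 9)     # target locations (deg)
          n_neurons = 60

          # Assumed response model: both modalities share per-neuron tuning weights w,
          # with the auditory responses noisier than the visual ones.
          w = rng.normal(size=n_neurons)
          visual = np.outer(targets, w) + rng.normal(scale=1.0, size=(9, n_neurons))
          auditory = np.outer(targets, w) + rng.normal(scale=3.0, size=(9, n_neurons))

          # Fit one-layer read-out weights on VISUAL responses only...
          readout, *_ = np.linalg.lstsq(visual, targets, rcond=None)

          # ...then decode AUDITORY responses with the same weights.
          print(np.round(auditory @ readout, 1))   # approximately the true target locations
          ```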

          • RR Metzger, OA Mullette-Gillman, AM Underhill, YE Cohen and JM Groh.
          • 2004.
          • Auditory saccades from different eye positions in the monkey: Implications for coordinate transformations.
          • Journal of Neurophysiology
          • 92:
          • 2622-2627
          • .
          Publication Description

          Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC appears to be neither head- nor eye-centered but in a reference frame that is intermediate between both of these reference frames. This neurophysiological finding suggests that auditory saccades might not fully compensate for changes in initial eye position. Here, we investigated whether the accuracy of saccades to sounds is affected by initial eye position in rhesus monkeys. We found that, on average, a 12° horizontal shift in initial eye position produced only a 0.6 to 1.6° horizontal shift in the endpoints of auditory saccades made to targets at a range of locations along the horizontal meridian. This shift was similar in size to the modest influence of eye position on visual saccades. This virtually complete compensation for initial eye position implies that auditory activity in the SC is read out in a manner that is appropriate for generating accurate saccades to sounds.

          • U Werner-Reiss, KA Kelly, AS Trause, AM Underhill and JM Groh.
          • 2003.
          • Eye position affects activity in primary auditory cortex of primates.
          • Current Biology
          • 13:
          • 554-562
          • .
          Publication Description

          Background: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the visual pathway using an eye-centered reference frame. Reconciling these discrepant reference frames is therefore a critical component of multisensory integration. Results: We tested the reference frame of neurons in the auditory cortex of primates trained to fixate visual stimuli at different orbital positions. We found that eye position altered the activity of about one third of the neurons in this region (35 of 113, or 31%). Eye position affected not only the responses to sounds (26 of 113, or 23%), but also the spontaneous activity (14 of 113, or 12%). Such effects were also evident when monkeys moved their eyes freely in the dark. Eye position and sound location interacted to produce a representation for auditory space that was neither head- nor eye-centered in reference frame. Conclusions: Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.

          • JM Groh and MS Gazzaniga.
          • 2003.
          • How the brain keeps time.
          • Daedalus
          • 132:
          • 56-61
          • .
          • JM Groh, KA Kelly and AM Underhill.
          • 2003.
          • A Monotonic Code for Sound Azimuth in Primate Inferior Colliculus.
          • Journal of Cognitive Neuroscience
          • 15:
          • 1217-1231
          • .
          Publication Description

          We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in the frontal hemisphere. Such neurons nearly always responded monotonically as a function of sound azimuth, with stronger responses for more contralateral sound locations. Few, if any, neurons had circumscribed receptive fields. Spatial sensitivity was broad: the proportion of the total sample of neurons responding to a sound at a given location ranged from 30% for ipsilateral locations to 80% for contralateral locations. These findings suggest that sound azimuth is represented via a population rate code of very broadly responsive neurons in primate inferior colliculi. This representation differs in format from the place code used for encoding the locations of visual and tactile stimuli and poses problems for the eventual convergence of auditory and visual or somatosensory signals. Accordingly, models for converting this representation into a place code are discussed.

          • JM Groh, AS Trause, AM Underhill, KR Clark and S Inati.
          • 2001.
          • Eye position influences auditory responses in primate inferior colliculus.
          • Neuron
          • 29:
          • 509-518
          • .
          Publication Description

          We examined the frame of reference of auditory responses in the inferior colliculus in monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appear to shift their spatial tuning. The effect of eye position on auditory responses was substantial - comparable in magnitude to that of sound location. The eye position signal appeared to interact with the auditory responses in at least a partly multiplicative fashion. We conclude that the representation of sound location in primate IC is distributed and that the frame of reference is intermediate between head- and eye-centered coordinates. The information contained in these neurons appears to be sufficient for later neural stages to calculate the positions of sounds with respect to the eyes.

          • JM Groh.
          • 2001.
          • Converting neural signals from place codes to rate codes.
          • Biological Cybernetics
          • 85:
          • 159-165
          • .
          Publication Description

          The nervous system uses two basic types of formats for encoding information. The parameters of many sensory (and some premotor) signals are represented by the pattern of activity among an array of neurons each of which is optimally responsive to a different parameter value. This type of code is commonly referred to as a place code. Motor commands, in contrast, use rate coding: the desired force of a muscle is specified as a monotonic function of the aggregate rate of discharge across all of its motor neurons. Generating movements based on sensory information often requires converting signals from a place code to a rate code. In this paper I discuss three possible models for how the brain does this.
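
          One of the candidate schemes, graded synaptic weights, fits in a few lines. The Gaussian place-code tuning and the normalization step below are simplifying assumptions: each place-coded unit projects to the output with a weight proportional to its preferred value, so the output's aggregate drive varies monotonically with the encoded parameter.

          ```python
          import numpy as np

          preferred = np.linspace(-40, 40, 81)    # place code: one unit per preferred value

          def place_activity(value, width=5.0):
              """Assumed Gaussian hill of activity centered on the encoded value."""
              return np.exp(-0.5 * ((preferred - value) / width) ** 2)

          def rate_readout(activity):
              """Graded weights: each unit's synaptic weight equals its preferred value,
              so the (normalized) summed drive is a rate code for the same parameter."""
              return preferred @ activity / activity.sum()

          for value in (-20.0, 0.0, 30.0):
              print(f"place code for {value:+.0f} -> output rate {rate_readout(place_activity(value)):+.2f}")
          ```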

          • L Boucher, JM Groh and HC Hughes.
          • 2001.
          • Afferent delays and the mislocalization of perisaccadic stimuli.
          • Vision Research
          • 41:
          • 2631-2644
          • .
          Publication Description

          Determining the precise moment a visual stimulus appears is difficult because visual response latencies vary. This temporal uncertainty could cause localization errors to brief visual targets presented before and during eye movements if the oculomotor system cannot determine the position of the eye at the time the stimulus appeared. We investigated the effect of varying neural processing time on localization accuracy for perisaccadic visual targets that differed in luminance. Although systematic errors in localization were observed, the effect of luminance was surprisingly small. We explore several hypotheses that may explain why processing delays are not more disruptive to localization performance.

          • JM Groh.
          • 2000.
          • Predicting perception from population codes.
          • Nature Neuroscience
          • 3:
          • 201-202
          • .
          Publication Description

          Treue and colleagues use electrophysiological recordings in monkeys and psychophysical experiments in humans to suggest that the shape of a population response in a motion sensitive region of the brain (area MT), rather than the peak of the response, determines motion perception.

          • RT Born, JM Groh, R Zhao and SJ Lukasewycz.
          • 2000.
          • Segregation of object and background motion in visual area MT: Effects of microstimulation on eye movements.
          • Neuron
          • 26:
          • 725-734
          • .
          Publication Description

          To track a moving object, its motion must first be distinguished from that of the background. The center-surround properties of neurons in the middle temporal visual area (MT) may be important for signaling the relative motion between object and background. To test this, we microstimulated within MT and measured the effects on monkeys' eye movements to moving targets. We found that stimulation at 'local motion' sites, where receptive fields possessed antagonistic surrounds, shifted pursuit in the preferred direction of the neurons, whereas stimulation at 'wide-field motion' sites shifted pursuit in the opposite, or null, direction. We propose that activating wide-field sites simulated background motion, thus inducing a target motion signal in the opposite direction. Our results support the hypothesis that neuronal center-surround mechanisms contribute to the behavioral segregation of objects from the background.

          • I Wickersham and JM Groh.
          • 1998.
          • Electrically evoking sensory experience.
          • Current Biology
          • 8:
          • R412-R414
          • .
          Publication Description

          Monkeys trained to distinguish touch stimuli that 'flutter' with different frequencies can similarly distinguish electrical stimulation of the somatosensory cortex according to its frequency; the implication is that the electrically-evoked patterns of cortical activity cause flutter sensations similar to those induced by touch.

          • JM Groh.
          • 1998.
          • Reading neural representations.
          • Neuron
          • 21:
          • 661-664
          • .
          • JM Groh, RT Born and WT Newsome.
          • 1997.
          • How is a sensory map read out? Effects of microstimulation in visual area MT on saccades and smooth pursuit eye movements.
          • Journal of Neuroscience
          • 17:
          • 4312-4330
          • .
          Publication Description

          To generate behavioral responses based on sensory input, motor areas of the brain must interpret, or 'read out,' signals from sensory maps. Our experiments tested several algorithms for how the motor systems for smooth pursuit and saccadic eye movements might extract a usable signal of target velocity from the distributed representation of velocity in the middle temporal visual area (MT or V5). Using microstimulation, we attempted to manipulate the velocity information within MT while monkeys tracked a moving visual stimulus. We examined the effects of the microstimulation on smooth pursuit and on the compensation for target velocity shown by saccadic eye movements. Microstimulation could alter both the speed and direction of the motion estimates of both types of eye movements and could also cause monkeys to generate pursuit even when the visual target was actually stationary. The pattern of alterations suggests that microstimulation can introduce an additional velocity signal into MT and that the pursuit and saccadic systems usually compute a vector average of the visually evoked and microstimulation-induced velocity signals (pursuit, 55 of 122 experiments; saccades, 70 of 122). Microstimulation effects in a few experiments were consistent with vector summation of these two signals (pursuit, 6 of 122; saccades, 2 of 122). In the remainder of the experiments, microstimulation caused either an apparent impairment in motion processing (pursuit, 47 of 122; saccades, 41 of 122) or had no effect (pursuit, 14 of 122; saccades, 9 of 122). Within individual experiments, the effects on pursuit and saccades were usually similar, but the occasional striking exception suggests that the two eye movement systems may perform motion computations somewhat independently.
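
          The two read-out rules being compared are easy to state concretely; the velocity values below are arbitrary examples, with the second vector standing in for a hypothetical microstimulation-induced signal.

          ```python
          import numpy as np

          visual = np.array([10.0, 0.0])   # deg/s: rightward target motion (example value)
          stim = np.array([0.0, 8.0])      # deg/s: assumed microstimulation-induced signal

          print("vector average:", (visual + stim) / 2)   # rule matching most experiments
          print("vector sum:    ", visual + stim)         # rule matching only a few
          ```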

          • JM Groh, E Seidemann and WT Newsome.
          • 1996.
          • Neurophysiology: Neural fingerprints of visual attention.
          • Current Biology
          • 6:
          • 1406-1409
          • .
          Publication Description

          Pronounced effects of attention have been demonstrated in a region of visual cortex previously thought to be devoid of such influences; identifying the features critical for eliciting these effects should teach us a great deal about the neural underpinnings of visual attention.

          • JM Groh, RT Born and WT Newsome.
          • 1996.
          • Interpreting sensory maps in visual cortex.
          • IBRO News
          • 24:
          • 11-12
          • .
          • JM Groh and DL Sparks.
          • 1996.
          • Saccades to somatosensory targets. III. Eye-position-dependent somatosensory activity in primate superior colliculus.
          • Journal of Neurophysiology
          • 75:
          • 439-453
          • .
          Publication Description

          1. We recorded from cells with sensory responses to somatosensory stimuli in the superior colliculus (SC) of awake monkeys. Our goal was to determine the frame of reference of collicular somatosensory signals by seeing whether the positions of the eyes influenced the responses of cells to a given tactile stimulus. Somatosensory targets consisted of vibrotactile stimuli delivered to the hands, which were held in fixed spatial positions. Monkeys performed a delayed saccade task from different initial fixation positions to the locations of these tactile stimuli or to visual stimuli at approximately the same location. 2. The responses of a majority of somatosensory cells (25 of 34 or 74%) were significantly affected by eye position. Nearly all somatosensory cells also responded to visual targets (28 of 30, 93%). Cells whose somatosensory responses depended on eye position responded to visual and somatosensory targets located at approximately the same direction in space with respect to the eyes. 3. The activity of these cells exhibited both sensory and motor qualities. The discharge was more closely linked in time to stimulus onset than to the movement. Sensory features of the stimulus were reflected in the responses: the discharge of a number of cells was phase-locked to the pulses of vibration. The sensory responses occurred even if the animal's next saccade was not directed into the response field of the cell. However, two thirds of the cells also exhibited a burst of motor activity in conjunction with the saccade to the somatosensory target. Sensory and motor activity were not always spatially coextensive. When different, the tuning of motor activity was broader. 4. Cells with somatosensory responses to vibratory stimulation of the hands were found in a wide region of the SC, spanning a 40° range of movement amplitudes. 5. These data show that somatosensory signals in the SC are not purely somatotopic but are dependent on eye position. For stimuli at a fixed location, this eye position dependence allows somatosensory and visual signals to be in register and share a premotor circuitry for guiding saccadic eye movements. 6. The dependence of the somatosensory responses on eye position suggests that the somatosensory receptive fields may either shift on the body surface or they may be restricted to a limited region of the body surface but be gated by eye (and body) position. Future experiments varying body position and the location of the stimulus on the body surface are needed to determine which of these alternatives is correct. Cells with either type of receptive field could provide an unambiguous signal of the location of somatosensory saccade targets with respect to the eyes. The transformation of somatosensory signals from a body-centered frame of reference to a frame of reference that depends on the position of the stimulus with respect to the eyes is necessary for the correct activation of collicular neurons with motor activity, because this activity encodes saccades as desired changes in eye position.

          • JM Groh and DL Sparks.
          • 1996.
          • Saccades to somatosensory targets. II. Motor convergence in primate superior colliculus.
          • Journal of Neurophysiology
          • 75:
          • 428-438
          • .
          Publication Description

          1. We examined cells with saccade-related activity in the superior colliculus (SC) of monkeys performing saccades to both somatosensory and visual targets. Our goals were 1) to determine whether signals from these separate sensory systems have converged onto a common motor pathway by the level of the SC; 2) to determine the frame of reference of somatosensory saccade signals in the SC; and 3) to relate collicular motor activity to the behavioral characteristics of somatosensory saccades. 2. Somatosensory targets consisted of vibrotactile stimuli delivered to the hands, which were held in fixed spatial positions. Saccades of different directions and amplitudes were elicited from different initial eye positions. Of 86 cells with motor-related activity, 85 (99%) discharged for saccades to both visual and somatosensory targets. The remaining cell was active only for visual saccades. 3. Cells with saccade-related activity had movement fields representing the direction and amplitude of saccades to both visual and somatosensory targets. We found no cells that discharged for saccades to a particular somatosensory target regardless of the vector of the saccade. 4. Small modality-dependent differences in the spatial tuning of the movement fields were observed, but these variations formed no clear pattern. Given the large population of cells active in conjunction with each saccade, these small tuning differences may have no net effect. Because the visual and somatosensory movement fields of individual cells were similar to each other, the inaccuracy of somatosensory saccades is likely to be the result of inaccurate signals reaching the SC, rather than an error signal added downstream. 5. The peak discharge frequency of collicular motor cells was lower for somatosensory saccades than for visual saccades, although the number of spikes in the discharge was about the same. 6. The latency of the onset of the prelude of motor activity following the cue to initiate a saccade was about the same for somatosensory and visual trials, even though somatosensory saccades have longer reaction times than visual saccades. However, the peak of the motor activity was delayed on somatosensory trials such that the timing of the peak was the same with respect to the movement on somatosensory and visual trials. 7. We conclude that the same population of saccade-related neurons in the SC that represents saccades to visual targets also represents saccades to somatosensory targets. Somatosensory saccades are encoded by these cells as the change in eye position necessary to bring the target onto the fovea, rather than the location of the stimulus on the body surface. Modality-dependent differences in the frequency and timing of collicular motor activity may contribute to velocity and reaction time differences.

          • JM Groh and DL Sparks.
          • 1996.
          • Saccades to somatosensory targets. I. Behavioral characteristics.
          • Journal of Neurophysiology
          • 75:
          • 412-427
          • .
          Publication Description

          1. We compared the properties of saccades to somatosensory and visual targets. This comparison provides insight into the translation of sensory signals coding target location in different sensory coordinate frameworks into motor commands of a common format. Vibrotactile stimuli were delivered to the hands, which were fixed in position and concealed beneath a barrier. Saccades of different directions and amplitudes were elicited by the same somatosensory target from different initial eye positions. Both monkeys and humans served as subjects. 2. Somatosensory saccades were less accurate than visual saccades in both humans and monkeys. When the barrier concealing the hands was removed, somatosensory saccade accuracy improved. While the hands were concealed, the visual frame of reference provided by room illumination did not greatly affect saccade accuracy: accuracy was not degraded in complete darkness for two of three monkeys. 3. The endpoints of saccades to a single somatosensory target varied with initial eye position for the monkeys, but not for the human subjects. 4. We also found evidence of an effect of limb position on somatosensory saccades: when human subjects performed the task with crossed hands, the incidence of curved saccades increased. Saccades often began in the direction of the unstimulated hand and curved markedly toward the stimulated hand. When one subject was required to delay the saccade by 600-1,000 ms after target onset (the delayed saccade task), the saccades were straight. Somatosensory saccades were also straight when the hands were not crossed. 5. The reaction times of somatosensory saccades were longer than the reaction times of visual saccades, and they decreased as a function of saccade amplitude. The delayed saccade task reduced the differences between somatosensory and visual saccade reaction times. The reaction times of saccades to very dim visual targets increased into the range found for saccades to somatosensory targets. When the saccade target was the combination of somatosensory and visual stimuli at the same location, the reaction time was slightly lower than for visual targets alone. 6. The peak velocities of somatosensory saccades were lower than those of visual saccades of the same amplitude. The velocities of saccades to combined somatosensory and visual targets were indistinguishable from those of saccades to visual targets alone. The differences between somatosensory and visual saccade velocity were maintained in the delayed trial type. These differences suggest that the main sequence, or velocity-amplitude relationship, characteristic of saccades depends on the modality of the target. 7. The implications of these modality-dependent differences in accuracy, reaction time, and saccade velocity are discussed with regard to models of the saccade generator and the coordinate transformation necessary for somatosensory saccades.

          • JM Groh and DL Sparks.
          • 1992.
          • Two models for transforming auditory signals from head-centered to eye-centered coordinates.
          • Biological Cybernetics
          • 67:
          • 291-302
          • .
          Publication Description

          Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a rate-code of target location with respect to the eye. The rate-code is converted into a place-code through a graded synaptic weighting scheme and inhibition. The dendrite model performs a mapping of head-centered auditory space onto the dendrites of eye-centered units. Individual dendrites serve as logical comparators of target location and eye position. Both models produce a topographic map of auditory space in eye-centered coordinates like that found in the primate superior colliculus. Either type can be converted into a model for transforming visual signals from retinal to head-centered coordinates.
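
          The vector subtraction model lends itself to a compact numerical illustration. The sketch below is only that: the Gaussian activity profile, the 5° tuning widths, and all names (`site_azimuths`, `head_to_eye_centered`) are assumptions chosen to make the place-to-rate-to-place chain concrete, not the paper's implementation.

          ```python
          import numpy as np

          # Illustrative sketch of the vector subtraction model
          # (assumed parameters, not the published implementation).
          site_azimuths = np.linspace(-40, 40, 81)   # preferred azimuths (deg) of map sites

          def head_to_eye_centered(target_head_deg, eye_pos_deg):
              # Topographic (place-coded) auditory map: a hill of activity at the target.
              auditory_map = np.exp(-0.5 * ((site_azimuths - target_head_deg) / 5.0) ** 2)
              # A topographically weighted sum converts the place code into a
              # head-centered rate code of target azimuth.
              target_rate = auditory_map @ site_azimuths / auditory_map.sum()
              # Vector subtraction: remove the rate-coded eye position signal.
              eye_centered_rate = target_rate - eye_pos_deg
              # Graded synaptic weights plus threshold (inhibition) re-expand the
              # rate code into a place code: each unit fires most when the
              # eye-centered location matches its preferred azimuth.
              place_code = np.maximum(0.0, 1.0 - np.abs(site_azimuths - eye_centered_rate) / 5.0)
              return eye_centered_rate, place_code

          rate, place = head_to_eye_centered(target_head_deg=10.0, eye_pos_deg=-15.0)
          print(round(rate, 1))                      # ~25.0 deg with respect to the eye
          print(site_azimuths[np.argmax(place)])     # peak of the eye-centered map: 25.0
          ```

          The point the abstract stresses survives in the sketch: the eye-centered quantity exists only transiently as a rate code, and the final weighting-plus-inhibition stage restores a topographic map like that found in the primate SC.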

      • Books

          • JM Groh.
          • 2014.
          • Making Space: How the Brain Knows Where Things Are.
          • Harvard University Press.
          Publication Description

          The book is about how the brain creates our sense of spatial location from a variety of sensory and motor sources, and how this spatial sense in turn shapes our cognitive abilities. Knowing where things are is effortless. But “under the hood”, your brain devotes a tremendous amount of computational power to figuring out even the simplest of details about the world around you and your position in it. Recognizing your mother, finding your phone, going to the grocery store, playing the banjo – these require careful sleuthing and coordination across different sensory and motor domains. The book traces the brain’s detective work to create this sense of space and argues that the brain’s spatial focus permeates our cognitive abilities, affecting the way we think and remember. The book begins by tracing the link from patterns of light in the world to the brain's deductions regarding objects, boundaries, and visual space in three dimensions. Later chapters outline similar neural deductions regarding sound location, body posture, balance, and movement, and how such information is synthesized to allow us to navigate through space. The final two chapters of the book consider how the brain's spatial representations do "double duty" to aid in memory and other types of mental activity such as thinking and reasoning.

      • Chapters in Books

          • JM Groh and DK Pai.
          • 2010.
          • Looking at Sounds: Neural Mechanisms in the Primate Brain.
          • .
          Publication Description

          When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both sensorimotor and multisensory integration: a sound elicits a movement of the visual sense organ, the eye, to bring the source of the sound under visual scrutiny. How are auditory signals converted into oculomotor commands? This chapter describes recent work concerning the necessary computational steps between sound and eye movement, and how they may be implemented in neural populations in the primate brain.

          • Groh, JM and Pai, D.
          • 2008.
          • Looking at sounds: neural mechanisms in the primate brain.
          • .
          Publication Description

          When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both sensorimotor and multisensory integration: a sound elicits a movement of the visual sense organ, the eye, to bring the source of the sound under visual scrutiny. How are auditory signals converted into oculomotor commands? This chapter describes our recent work concerning the necessary computational steps between sound and eye movement, and how they may be implemented in neural populations in the primate brain.

          • Kelly, KA, Metzger, RR, Mullette-Gillman, OA, Werner-Reiss, U, Groh, JM.
          • 2003.
          • Representation of sound location in the primate brain.
          • .
          • Groh, JM and Werner-Reiss, U.
          • 2002.
          • Visual and auditory integration.
          • .
          • Sparks, DL and Groh, JM.
          • 1995.
          • The superior colliculus: a window to problems in integrative neuroscience.
          • .
      • Commentaries/Book Reviews

          • J.M. Groh.
          • 2011.
          • Book Review: The Tell-Tale Brain.
          • Journal of Clinical Investigation
          • 121:
          • 2953
          • .
      • Other

          • J.M. Groh, O. A. Mullette-Gillman, Y. E. Cohen.
          • 2006.
          • Auditory and visual reference frames in the intraparietal sulcus.
          • Cosyne workshop
          • .
          • J.M. Groh.
          • 2006.
          • Hybrid reference frames: why?
          • Neural Control of Movement.
          • .
          • Werner-Reiss, U, Greene, NT, Underhill, AM, Metzger, RR, Groh, JM.
          • 2005.
          • The representation of sound frequency in the primate inferior colliculus.
          • Association for Research in Otolaryngology Abstr.
          • .
          • Werner-Reiss, U, Porter, KK, Greene, NT, Larue, DT, Winer, JA and Groh, JM.
          • 2005.
          • Eye position signals are distributed throughout the primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • .
          • Porter, KK, Metzger, RR, Werner-Reiss, U, Underhill, AM, Groh, JM.
          • 2005.
          • Visual responses in auditory neurons of the primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • .
          • Mullette-Gillman, OA, Cohen, YE, Groh, JM.
          • 2004.
          • Reference frame of auditory and visual signals in bimodal neurons of the primate lateral intraparietal area (LIP).
          • Soc. Neurosci. Abstr.
          • .
          • Metzger, RR, Kelly, KA, Groh, JM.
          • 2004.
          • Sensitivity to eye position in the inferior colliculus of the monkey during an auditory saccade task.
          • Soc. Neurosci. Abstr.
          • .
          • Werner-Reiss, U, Underhill, AM, and Groh, JM.
          • 2004.
          • The representation of auditory space in core auditory cortex of primates maintaining fixation.
          • Soc. Neurosci. Abstr.
          • .
          • Kelly, KA, Werner-Reiss, U, Underhill, AM and Groh, JM.
          • 2003.
          • Eye position signals change shape along the primate auditory pathway.
          • Soc. Neurosci. Abstr.
          • .
          • Metzger, RR, Mullette-Gillman, OA, Underhill, AM, Cohen, YE and Groh, JM.
          • 2003.
          • Effect of initial eye position on saccades to auditory targets in monkeys.
          • Soc. Neurosci. Abstr.
          • .
          • Mullette-Gillman, OA, Cohen, YE and Groh, JM.
          • 2003.
          • Similar eye position influences on auditory and visual responses in the lateral intraparietal area, LIP, of primates.
          • Soc. Neurosci. Abstr.
          • .
          • Werner-Reiss, U, Kelly, KA, Underhill, AM and Groh, JM.
          • 2003.
          • Long inter-stimulus intervals affect responses in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • .
          • Kelly KA, Werner-Reiss U, Underhill AM, and Groh JM.
          • 2002.
          • Eye position affects a wide range of auditory cortical neurons in primates.
          • Soc. Neurosci. Abstr.
          • 845.1:
          • .
          • Mullette-Gillman OA, Cohen YE, and Groh JM.
          • 2002.
          • Assessing the spatial alignment of auditory and visual responses in the inferior parietal sulcus.
          • Soc. Neurosci. Abstr.
          • 57.19
          • .
          • Metzger RR and Groh JM.
          • 2002.
          • Linking primate inferior colliculus neural activity to sound localization performance.
          • Soc. Neurosci. Abstr.
          • 845.2
          • .
          • Groh, JM, Underhill, AM.
          • 2001.
          • Coding of sound location in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.1
          • .
          • Metzger, R R, Underhill, AM and Groh, JM.
          • 2001.
          • Time course of eye position influence in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.3
          • .
          • Werner-Reiss, U, Kelly, KA, Underhill, AM and Groh, JM.
          • 2001.
          • Eye position tuning in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.2
          • .
          • Trause, AS, Werner-Reiss, U, Underhill, AM, Groh, JM.
          • 2000.
          • Effects of eye position on auditory signals in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1977
          • .
          • Clark, KR, Trause, AS, Underhill, AM, Groh, JM.
          • 2000.
          • Effects of eye position on auditory signals in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1977
          • .
          • Boucher, L, Groh JM, Hughes, HC.
          • 2000.
          • Oculomotor localization of perisaccadic auditory targets.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1329
          • .
          • Boucher, L, Groh, JM, and Hughes, HC.
          • 1999.
          • Contributions of visual processing delays to mislocalization of perisaccadic stimuli.
          • Soc. Neurosci. Abstr.
          • 29:
          • .
          • Born, RT, Zhao, R, Lukasewycz, SJ, and Groh, JM.
          • 1999.
          • Representation of figure and ground in visual area MT.
          • Soc. Neurosci. Abstr.
          • .
          • Groh, JM.
          • 1997.
          • A model for transforming signals from a place code to a rate code.
          • Soc. Neurosci. Abstr.
          • 23:
          • 1560
          • .
          • JM Groh, RT Born and WT Newsome.
          • 1996.
          • A comparison of the effects of microstimulation in area MT on saccades and smooth pursuit eye movements.
          • Investigative Ophthalmology and Visual Science
          • 37:
          • S472
          • .
          Publication Description

          Purpose. Tracking a moving target that appears in the periphery involves two kinds of eye movements, saccades and smooth pursuit. Both kinds of movements require a signal of target velocity. The saccadic system must use a target velocity signal to compensate for the motion of the target, and the pursuit system must use such a signal to match eye velocity to target velocity. We conducted a microstimulation experiment in two rhesus monkeys to compare the contribution of motion signals in MT to saccades and pursuit. Methods. We stimulated extrafoveal MT while monkeys performed a tracking task in which a target appeared in the multiunit receptive field of the site and moved either in the preferred direction of the site, the null direction, or not at all (saccade only trials). We stimulated (20-80 μamps, 200 Hz) from target onset until the monkey made a saccade to the target. We examined the effects of microstimulation on the endpoint of the saccade and on the average eye velocity over the first 60 ms of pursuit after the saccade. Results. Stimulation altered the velocity of pursuit in 71 out of 123 experiments, and altered the endpoints of saccades in 78 experiments. Both types of effects occurred in 50 experiments. The effects on saccades were quantified in terms of the velocity compensation, or the target velocity for which the saccade would be appropriate. For both pursuit and saccades, the effects of microstimulation were suggestive of vector averaging of the visual target and an electrically induced velocity signal. The nature of the effects on saccades and pursuit within a given experiment could be quite dissimilar, but across the whole population of experiments, the two types of effects were weakly correlated. Conclusions. The weak correlation between effects on pursuit and saccades suggests that the pathways from MT to the pursuit and saccade systems overlap to some extent, but the fact that the effects can occur independently and can be dissimilar to each other suggests a surprising degree of segregation of these pathways.
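
          The vector-averaging account in the Results can be stated in one line: the evoked movement behaves like a weighted average of the visually signaled velocity and the electrically induced velocity signal. A minimal sketch follows; the weight and vectors are illustrative assumptions, not values fitted to the data.

          ```python
          import numpy as np

          # Illustrative vector averaging of a visual motion signal and a
          # microstimulation-induced signal (assumed weight, not a fitted value).
          def vector_average(visual_vec, stim_vec, w_stim=0.4):
              visual_vec, stim_vec = np.asarray(visual_vec), np.asarray(stim_vec)
              return (1.0 - w_stim) * visual_vec + w_stim * stim_vec

          target_vel = [10.0, 0.0]                        # deg/s, target moving in preferred direction
          print(vector_average(target_vel, [10.0, 0.0]))  # stim in preferred direction: unchanged
          print(vector_average(target_vel, [-10.0, 0.0])) # stim in null direction: pulled toward zero
          ```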

          • Groh, JM, Born, RT, and Newsome, WT.
          • 1995.
          • Microstimulation of area MT affects both saccades and smooth pursuit eye movements.
          • Soc. Neurosci. Abstr.
          • 21:
          • 281
          • .
          • Born, RT, Groh, JM, and Newsome, WT.
          • 1995.
          • Functional architecture of primate area MT probed with microstimulation: effects on eye movements.
          • Soc. Neurosci. Abstr.
          • 21:
          • 281
          • .
          • Shadlen, MN, Groh, JM, Salzman, CD and Newsome, WT.
          • 1994.
          • Responses of LIP neurons during a motion discrimination task: a decision process in action?
          • Soc. Neurosci. Abstr.
          • 20:
          • 1279
          • .
          • Groh, JM.
          • 1993.
          • Coordinate transformations, sensorimotor integration, and the neural basis of saccades to somatosensory targets.
          • .
          • Groh, JM and Sparks, DL.
          • 1993.
          • Motor activity in the primate superior colliculus (SC) during saccades to somatosensory and visual targets.
          • Invest. Ophthal. Vis. Sci.
          • 34:
          • 1137
          • .
          • Glimcher, PW, Groh, JM and Sparks, DL.
          • 1993.
          • Low-frequency collicular stimulation specifies saccadic amplitude gradually.
          • Invest. Ophthal. Vis. Sci.
          • 34:
          • 1137
          • .
          • Groh, JM and Sparks, DL.
          • 1993.
          • Somatosensory activity in the superior colliculus (SC) influenced by eye position.
          • Soc. Neurosci. Abstr.
          • 19:
          • 858
          • .
          • Groh, JM and Sparks, DL.
          • 1992.
          • Characteristics of saccades to somatosensory targets.
          • Soc. Neurosci. Abstr.
          • 18:
          • 701
          • .
          • Groh, JM and Sparks, DL.
          • 1991.
          • A model for transforming auditory signals from head-centered to eye-centered coordinates.
          • Soc. Neurosci. Abstr.
          • 17:
          • 458
          • .
          • Aldridge, JW, Thompson, JF, Walters, EA, Groh, JM and Gilman, S.
          • 1991.
          • Neostriatal unit activity related to movement preparation in a go/no-go task in the cat.
          • Soc. Neurosci. Abstr.
          • 17:
          • 1217
          • .
          • Groh, JM.
          • 1988.
          • Bachelor male feral horses: characteristics of group living and aggression.
          • .