• Publications of Jennifer M Groh

      • Journal Articles

          • DS Pages, JM Groh.
          • 2013.
          • Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.
          • PLOS ONE
          • 8:
          • e72562
          • .
          Publication Description

          A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.

          • NL Zucker, RM Merwin, CM Bulik, A Moskovich, JE Wildes, J Groh.
          • 2013.
          • Subjective experience of sensation in anorexia nervosa.
          • Behaviour Research and Therapy
          • 51:
          • 256-65
          • .
          Publication Description

          The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e.g., to touch, motion) to body image disturbance and temperament in adult women currently diagnosed with AN (n = 20), women with a prior history of AN who were weight restored (n = 15), and healthy controls with no eating disorder history (n = 24). Levels of sensitivity to sensation and attempts to avoid sensory experience were significantly higher in both clinical groups relative to healthy controls. Sensory sensitivity was associated with body image disturbance (r(56) = .51).

          • J. Lee, J.M. Groh.
          • 2013.
          • Different stimuli, different spatial codes: A visual map and an auditory rate code for oculomotor space in the primate superior colliculus.
          • PLOS ONE
          • In press:
          • .
          Publication Description

          Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
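          The read-out idea in this abstract can be illustrated with a toy model. The sketch below is hypothetical, not the paper's actual algorithm: an activity-weighted centroid depends on both the site of activity (which neurons fire) and its level (how much they fire), so related machinery can be applied to a map-like visual code and a rate-like auditory code.

```python
import numpy as np

def read_out(preferred_locs, rates):
    """Activity-weighted centroid: the estimate depends on both the
    site of activity (the preferred locations of active neurons) and
    its level (the firing rates themselves)."""
    rates = np.asarray(rates, dtype=float)
    return np.sum(rates * preferred_locs) / np.sum(rates)

# Hypothetical population tiling -40..+40 degrees of oculomotor space
sites = np.linspace(-40, 40, 81)
target = 10.0

# Visual-like map code: circumscribed Gaussian receptive fields
visual_rates = np.exp(-0.5 * ((sites - target) / 5.0) ** 2)
visual_estimate = read_out(sites, visual_rates)    # recovers ~10 deg

# Auditory-like rate code: every site fires, and the overall *level*
# of activity grows with azimuth (open-ended response pattern)
gain = 0.05                                        # arbitrary units/deg
auditory_rates = gain * (target + 40) * np.ones_like(sites)
auditory_level = auditory_rates.mean()             # level carries location
auditory_estimate = auditory_level / gain - 40     # decode via calibration
```

          Note that the centroid alone would fail on the uniform rate code (it would return the grid center regardless of azimuth), which is why the sketch reads out the activity level directly in the auditory case; the paper's point is precisely that site and level must both contribute.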

          • KG Gruters, JM Groh.
          • 2012.
          • Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus.
          • Frontiers in Neural Circuits
          • 6:
          • 96
          • .
          Publication Description

          The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC "hears" would seem to be passed both "upward" to thalamus and thence to auditory cortex and beyond, as well as "downward" via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.

          • DA Bulkin, JM Groh.
          • 2012.
          • Distribution of visual and saccade related information in the monkey inferior colliculus.
          • Frontiers in Neural Circuits
          • 6:
          • 61
          • .
          Publication Description

          The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor associated activity was found throughout the IC (overall, 84 of 199 sites tested or 42%), but with a far reduced prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.

          • J Lee, JM Groh.
          • 2012.
          • Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus.
          • Journal of Neurophysiology
          • 108:
          • 227-42
          • .
          Publication Description

          Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156), of which 86% were eye centered.

          • DA Bulkin, JM Groh.
          • 2012.
          • Distribution of eye position information in the monkey inferior colliculus.
          • Journal of Neurophysiology
          • 107:
          • 785-95
          • .
          Publication Description

          The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33-43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.

          • DA Bulkin, JM Groh.
          • 2011.
          • Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation.
          • Journal of Neurophysiology
          • 105:
          • 1785-97
          • .
          Publication Description

          We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed locations and recording nonresponsive locations to define boundaries. The results show a modest tonotopically organized region (17 of 100 recording penetration locations in 4 of 6 monkeys) surrounded by a large mass of tissue that, although vigorously responsive, showed no clear topographic arrangement (68 of 100 penetration locations). Rather, most cells in these recordings responded best to frequencies at the low end of the macaque auditory range. The remaining 15 (of 100) locations exhibited auditory responses that were not sensitive to sound frequency. Potential anatomical correlates of functionally defined regions and implications for midbrain auditory prosthetic devices are discussed.

          • JM Groh.
          • 2011.
          • Effects of Initial Eye Position on Saccades Evoked by Microstimulation in the Primate Superior Colliculus: Implications for Models of the SC Read-Out Process.
          • Frontiers in Integrative Neuroscience
          • 4:
          • 130
          • .
          Publication Description

          The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In this study, we tested whether initial eye position has an effect on microstimulation-evoked saccade amplitude. High (>300 Hz) and low (<300 Hz) stimulation frequencies were tested.

          • J. X. Maier and J.M. Groh.
          • 2010.
          • Comparison of gain-like properties of eye position signals in inferior colliculus versus auditory cortex of primates.
          • Frontiers in Integrative Neuroscience
          • 4:
          • 121-132
          • .
          Publication Description

          We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region of auditory cortex (AC) in primates, and found stronger evidence for gain field-like interactions in the IC than in AC. In the IC, eye position signals showed both multiplicative and additive interactions with auditory responses, whereas in AC the effects were not as well predicted by a gain field model.
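          The gain-field idea in this abstract can be written as a one-line model. The formulation below is a generic textbook-style sketch with made-up parameter values, not the fitted model from the paper: eye position both scales the auditory response (multiplicative term) and shifts the overall firing rate (additive term).

```python
def gain_field_rate(sound_response, eye_pos_deg,
                    mult_gain=0.02, add_gain=0.3, baseline=5.0):
    """Generic gain-field model (hypothetical parameters): eye position
    multiplicatively scales the auditory response and additively shifts
    the overall firing rate."""
    return (baseline
            + sound_response * (1.0 + mult_gain * eye_pos_deg)
            + add_gain * eye_pos_deg)

# The same sound (20 units of auditory drive) at three eye positions
rates = [gain_field_rate(20.0, e) for e in (-10.0, 0.0, 10.0)]
```

          Fitting such a model and asking how much of the eye-position effect is captured by the multiplicative versus additive terms is one way to quantify how "gain-field-like" a neuron's behavior is.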

          • JX Maier, JM Groh.
          • 2009.
          • Multisensory guidance of orienting behavior.
          • Hearing Research
          • 258:
          • 106-12
          • .
          Publication Description

          We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-centered representation of space, whereas the visual system uses an eye-centered representation of space, based on the site of activation on the retina. In addition, the visual system employs a place-coded, retinotopic map of space, whereas the auditory system's representational format is characterized by broad spatial tuning and a lack of topographical organization. A common view is that the brain needs to reconcile these differences in order to control behavior, such as orienting gaze to the location of a sound source. To accomplish this, it seems that either auditory spatial information must be transformed from a head-centered rate code to an eye-centered map to match the frame of reference used by the visual system, or vice versa. Here, we review a number of studies that have focused on the neural basis underlying such transformations in the primate auditory system. Although, these studies have found some evidence for such transformations, many differences in the way the auditory and visual system encode space exist throughout the auditory pathway. We will review these differences at the neural level, and will discuss them in relation to differences in the way auditory and visual information is used in guiding orienting movements.
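          In the simplest one-dimensional case, the coordinate transformation discussed above is just a subtraction. The sketch below is that textbook linear approximation, not a model of how neurons actually implement it:

```python
def head_to_eye_centered(sound_az_head_deg, eye_az_head_deg):
    """Eye-centered azimuth of a sound = its head-centered azimuth
    minus the current eye position (1-D linear approximation)."""
    return sound_az_head_deg - eye_az_head_deg

# A sound 20 deg right of the head, with the eyes deviated 5 deg right,
# lands 15 deg right of the line of sight:
az_eye = head_to_eye_centered(20.0, 5.0)
```

          The hard problem reviewed in the article is not this arithmetic but how a brain that encodes sound azimuth in a broadly tuned rate code, and eye position in yet another format, can carry out the equivalent computation.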

          • N Kopco, IF Lin, BG Shinn-Cunningham, JM Groh.
          • 2009.
          • Reference frame of the ventriloquism aftereffect.
          • Journal of Neuroscience
          • 29:
          • 13809-14
          • .
          Publication Description

          Seeing the image of a newscaster on a television set causes us to think that the sound coming from the loudspeaker is actually coming from the screen. How images capture sounds is mysterious because the brain uses different methods for determining the locations of visual versus auditory stimuli. The retina senses the locations of visual objects with respect to the eyes, whereas differences in sound characteristics across the ears indicate the locations of sound sources referenced to the head. Here, we tested which reference frame (RF) is used when vision recalibrates perceived sound locations. Visually guided biases in sound localization were induced in seven humans and two monkeys who made eye movements to auditory or audiovisual stimuli. On audiovisual (training) trials, the visual component of the targets was displaced laterally by 5-6 degrees. Interleaved auditory-only (probe) trials served to evaluate the effect of experience with mismatched visual stimuli on auditory localization. We found that the displaced visual stimuli induced ventriloquism aftereffect in both humans (approximately 50% of the displacement size) and monkeys (approximately 25%), but only for locations around the trained spatial region, showing that audiovisual recalibration can be spatially specific. We tested the reference frame in which the recalibration occurs. On probe trials, we varied eye position relative to the head to dissociate head- from eye-centered RFs. Results indicate that both humans and monkeys use a mixture of the two RFs, suggesting that the neural mechanisms involved in ventriloquism occur in brain region(s) using a hybrid RF for encoding spatial information.

          • Mullette-Gillman, O. A., Cohen, Y. E. and Groh, JM.
          • 2009.
          • Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered, reference frame.
          • Cerebral Cortex
          • 19:
          • 1761-1775
          • .
          Publication Description

          The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a combination of head- and eye-centered information) (Mullette-Gillman et al. 2005). One possible explanation for this discrepancy is that sensory-related activity, which we studied previously, is hybrid, whereas motor-related activity might be eye centered. Here, we examined the reference frame of visual and auditory saccade-related activity in the lateral and medial banks of the intraparietal sulcus (areas lateral intraparietal area [LIP] and medial intraparietal area [MIP]) of 2 rhesus monkeys. We recorded from 275 single neurons as monkeys performed visual and auditory saccades from different initial eye positions. We found that both visual and auditory signals reflected a hybrid of head- and eye-centered coordinates during both target and perisaccadic task periods rather than shifting to an eye-centered format as the saccade approached. This account differs from numerous previous recording studies. We suggest that the geometry of the receptive field sampling in prior studies was biased in favor of an eye-centered reference frame. Consequently, the overall hybrid nature of the reference frame was overlooked because the non-eye-centered response patterns were not fully characterized.

          • U. Werner-Reiss, J.M. Groh.
          • 2008.
          • A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies.
          • Journal of Neuroscience
          • 28:
          • 3747-3758
          • .
          Publication Description

          Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, in which single-unit recording studies have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivity of the methods used for assessing the representation of sound location? The sensitivity of imaging methods such as functional magnetic resonance imaging depends on the following two key aspects of the underlying neuronal population: (1) what kind of spatial sensitivity individual neurons exhibit and (2) whether neurons with similar response preferences are clustered within the brain. To address this question, we conducted a single-unit recording study in monkeys. We investigated the nature of spatial sensitivity in individual auditory cortical neurons to determine whether they have receptive fields (place code) or monotonic (rate code) sensitivity to sound azimuth. Second, we tested how strongly the population of neurons favors contralateral locations. We report here that the majority of neurons show predominantly monotonic azimuthal sensitivity, forming a rate code for sound azimuth, but that at the population level the degree of contralaterality is modest. This suggests that the weakness of the evidence for spatial sensitivity in human neuroimaging studies of auditory cortex may be attributable to limited lateralization at the population level, despite what may be considerable spatial sensitivity in individual neurons.
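          The population argument in this abstract can be illustrated numerically. All numbers below are invented for illustration: even when individual neurons are strongly lateralized, if contralateral- and ipsilateral-preferring neurons are intermixed within a voxel with only a modest bias in their proportions, the pooled signal difference between sound locations is small.

```python
import numpy as np

n = 1000
# Hypothetical voxel: 60% of neurons prefer contralateral sounds
prefers_contra = np.arange(n) < 600

# Each neuron individually shows a strong (3x) rate difference
rate_contra_sound = np.where(prefers_contra, 30.0, 10.0)
rate_ipsi_sound = np.where(prefers_contra, 10.0, 30.0)

# Pooled (voxel-like) contrast between contra and ipsi sounds:
# single neurons have contrast (30-10)/(30+10) = 0.5, but the
# population sum shows only a small difference
pooled = (rate_contra_sound.sum() - rate_ipsi_sound.sum()) / \
         (rate_contra_sound.sum() + rate_ipsi_sound.sum())
```

          In this toy voxel the pooled contrast is 0.1, one-fifth of the single-neuron contrast, which is the sense in which strong single-unit spatial sensitivity can coexist with weak neuroimaging signals.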

          • Porter, KK, Metzger, RR, and Groh, JM.
          • 2007.
          • Visual- and saccade-related signals in the primate inferior colliculus.
          • Proceedings of the National Academy of Sciences
          • 104:
          • 17855-60
          • .
          Publication Description

          The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in awake monkeys carry visual- and/or saccade-related signals in addition to their auditory responses.

          Press coverage of this work has appeared in Scientific American (ScientificAmerican.com), Fox News (foxnews.com), the CBC radio program “Quirks and Quarks”, the Radio New Zealand program “Nights”, the Telegraph, the Italian science magazine “Newton”, LiveScience.com, and numerous other online science news websites.

          • Werner-Reiss U, Porter KK, Underhill AM, Groh JM..
          • 2006.
          • Long lasting attenuation by prior sounds in auditory cortex of awake primates.
          • Exp Brain Res
          • 168:
          • 272-6
          • .
          Publication Description

          How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous sound. We found that responses to the second sound of a two-sound sequence were generally attenuated compared to the response that sound evoked when it was presented first. The attenuation remained evident at the population level even out to inter-stimulus intervals (ISIs) of 5 s, although it was of modest size for ISIs >2 s. Behavioral context (performance versus non-performance of a visual fixation task during sound presentation) did not influence the results. The long time course of the first sound's influence suggests that, under natural conditions, neural responses in auditory cortex are rarely governed solely by the current sound.
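          Time courses like the one described here are often summarized with an exponential-recovery model. The function and parameter values below are purely illustrative, not fitted values from the study:

```python
import math

def second_sound_response(first_response, isi_s, depth=0.5, tau_s=2.0):
    """Illustrative forward-attenuation model: the response to the
    second sound recovers exponentially toward the first-sound
    response as the inter-stimulus interval (ISI) grows."""
    return first_response * (1.0 - depth * math.exp(-isi_s / tau_s))

# With these made-up parameters, attenuation is strong at short ISIs
# and modest, but still present, at 5 s:
r_short = second_sound_response(100.0, 0.5)
r_long = second_sound_response(100.0, 5.0)
```

          Under such a model, attenuation never fully disappears at any finite ISI; it simply falls below practical detectability, consistent with the modest residual effect reported beyond 2 s.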

          • Metzger, RR.,Greene, NT, Porter, KK and Groh, JM.
          • 2006.
          • Effects of reward and behavioral context on neural activity in the primate inferior colliculus.
          • Journal of Neuroscience
          • 26:
          • 7468-7476
          • .
          • Bulkin, DA., Groh, JM.
          • 2006.
          • Seeing sounds: Visual and auditory interactions in the brain.
          • Current Opinion in Neurobiology
          • 16:
          • 415-419
          • .
          • Porter, KK., Groh, JM.
          • 2006.
          • The "other" transformation required for visual-auditory integration: representational format.
          • Progress in Brain Research
          • 155:
          • 313-323
          • .
          • Porter, KK., Metzger, RR Groh, JM.
          • 2006.
          • The representation of eye position in primate inferior colliculus.
          • Journal of Neurophysiology
          • 95:
          • 1826-42
          • .
          • Mullette-Gillman, OA., Cohen, YE, Groh, JM.
          • 2005.
          • Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus.
          • Journal of Neurophysiology
          • 94:
          • 2331-52
          • .
          • Metzger RR, Mullette-Gillman OA, Underhill AM, Cohen YE, Groh JM.
          • 2004.
          • Auditory saccades from different eye positions in the monkey: implications for coordinate transformations.
          • Journal of Neurophysiology
          • 92:
          • 2622-7
          • .
          • Groh, JM, Kelly KA and Underhill, AM.
          • 2003.
          • A monotonic code for sound azimuth in primate inferior colliculus.
          • Journal of Cognitive Neuroscience
          • 15:
          • 1217-1231
          • .
          • Groh, JM and Gazzaniga, MS.
          • 2003.
          • How the brain keeps time.
          • Daedalus
          • Spring:
          • 56-61
          • .
          • Werner-Reiss, U, Kelly, KA, Trause, AS, Underhill, AM and Groh, JM.
          • 2003.
          • Eye position affects activity in primary auditory cortex of primates.
          • Current Biology
          • 13:
          • 554-562
          • .
          • Groh JM, Trause, A. S., Underhill, A. M., Clark, K. R, Inati, S.
          • 2001.
          • Eye position influences auditory responses in primate inferior colliculus.
          • Neuron
          • 29:
          • 509-518
          • .
          • J.M. Groh.
          • 2001.
          • Converting neural signals from place codes to rate codes.
          • Biol. Cybern
          • 85:
          • 159-65
          • .
          • Boucher, L, Groh JM, Hughes HC.
          • 2001.
          • Visual latency and the mislocalization of perisaccadic stimuli.
          • Vision Research
          • 41:
          • 2631-2644
          • .
          • Born, RT, Groh, JM, Zhao, R, and Lukasewycz, SJ.
          • 2000.
          • Segregation of object and background motion in visual area MT: effects of microstimulation on eye movements.
          • Neuron
          • 26:
          • 725-734
          • .
          • Groh, JM.
          • 2000.
          • Predicting perception from population codes.
          • Nature Neuroscience
          • 3:
          • 201-202
          • .
          • Groh, JM.
          • 1998.
          • Reading neural representations.
          • Neuron
          • 21:
          • 661-664
          • .
          • Wickersham, I. and Groh, JM.
          • 1998.
          • Electrically evoking sensory experience.
          • Current Biology
          • 8:
          • R412-R414
          • .
          • Groh, JM, Born, RT, and Newsome, WT.
          • 1997.
          • How is a sensory map read out? Effects of microstimulation in area MT on smooth pursuit and saccadic eye movements.
          • J. Neurosci.
          • 17:
          • 4312-4330
          • .
          • Groh, JM and Sparks, DL.
          • 1996.
          • Saccades to somatosensory targets: I. Behavioral characteristics.
          • J. Neurophysiol.
          • 75:
          • 412-427
          • .
          • Groh, JM and Sparks, DL.
          • 1996.
          • Saccades to somatosensory targets: II. Motor convergence in primate superior colliculus.
          • J. Neurophysiol.
          • 75:
          • 428-438
          • .
          • Groh, JM and Sparks, DL.
          • 1996.
          • Saccades to somatosensory targets: III. Influence of eye position on somatosensory activity in primate superior colliculus.
          • J. Neurophysiol.
          • 75:
          • 439-453
          • .
          • Groh, JM, Born, RT, and Newsome, WT.
          • 1996.
          • Interpreting sensory maps in visual cortex.
          • IBRO News
          • 24:
          • 11-12
          • .
          • Groh, JM, Seidemann, E, and Newsome, WT.
          • 1996.
          • Neural fingerprints of visual attention.
          • Current Biol.
          • 6:
          • 1406-1409
          • .
          • Groh, JM and Sparks, DL.
          • 1992.
          • Two models for transforming auditory signals from head-centered to eye-centered coordinates.
          • Biol. Cybern.
          • 67:
          • 291-302
          • .
      • Books

          • Groh, JM.
          • 2013.
          • Making Space: How the Brain Knows Where Things Are.
          • Harvard University Press.
          Publication Description

          The book is about how the brain creates our sense of spatial location from a variety of sensory and motor sources, and how this spatial sense in turn shapes our cognitive abilities. Knowing where things are is effortless. But “under the hood”, your brain devotes a tremendous amount of computational power to figuring out even the simplest of details about the world around you and your position in it. Recognizing your mother, finding your phone, going to the grocery store, playing the banjo – these require careful sleuthing and coordination across different sensory and motor domains. The book traces the brain’s detective work to create this sense of space and argues that the brain’s spatial focus permeates our cognitive abilities, affecting the way we think and remember. The book begins by tracing the link from patterns of light in the world to the brain's deductions regarding objects, boundaries, and visual space in three dimensions. Later chapters outline similar neural deductions regarding sound location, body posture, balance, and movement, and how such information is synthesized to allow us to navigate through space. The final two chapters of the book consider how the brain's spatial representations do "double duty" to aid in memory and other types of mental activity such as thinking and reasoning.

      • Chapters in Books

          • Groh, JM and Pai, D.
          • 2008.
          • Looking at sounds: neural mechanisms in the primate brain.
          • .
          Publication Description

          When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both sensorimotor and multisensory integration: a sound elicits a movement of the visual sense organ, the eye, to bring the source of the sound under visual scrutiny. How are auditory signals converted into oculomotor commands? This chapter describes our recent work concerning the necessary computational steps between sound and eye movement, and how they may be implemented in neural populations in the primate brain.

          • Kelly, KA, Metzger, RR, Mullette-Gillman, OA, Werner-Reiss, U, and Groh, JM.
          • 2003.
          • Representation of sound location in the primate brain.
          • .
          • Groh, JM and Werner-Reiss, U.
          • 2002.
          • Visual and auditory integration.
          • .
          • Sparks, DL and Groh, JM.
          • 1995.
          • The superior colliculus: a window to problems in integrative neuroscience.
          • .
      • Commentaries/Book Reviews

          • Groh, JM.
          • 2011.
          • Book Review: The Tell-Tale Brain.
          • Journal of Clinical Investigation
          • 121:
          • 2953
          • .
      • Other

          • Groh, JM, Mullette-Gillman, OA, and Cohen, YE.
          • 2006.
          • Auditory and visual reference frames in the intraparietal sulcus.
          • Cosyne workshop
          • .
          • Groh, JM.
          • 2006.
          • Hybrid reference frames: why?
          • Neural Control of Movement.
          • .
          • Werner-Reiss, U, Greene, NT, Underhill, AM, Metzger, RR, Groh, JM.
          • 2005.
          • The representation of sound frequency in the primate inferior colliculus.
          • Association for Research in Otolaryngology Abstr.
          • .
          • Werner-Reiss, U, Porter, KK, Greene, NT, Larue, DT, Winer, JA and Groh, JM.
          • 2005.
          • Eye position signals are distributed throughout the primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • .
          • Porter, KK, Metzger, RR, Werner-Reiss, U, Underhill, AM, Groh, JM.
          • 2005.
          • Visual responses in auditory neurons of the primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • .
          • Mullette-Gillman, OA, Cohen, YE, and Groh, JM.
          • 2004.
          • Reference frame of auditory and visual signals in bimodal neurons of the primate lateral intraparietal area (LIP).
          • Soc. Neurosci. Abstr.
          • .
          • Metzger, RR, Kelly, KA, Groh, JM.
          • 2004.
          • Sensitivity to eye position in the inferior colliculus of the monkey during an auditory saccade task.
          • Soc. Neurosci. Abstr.
          • .
          • Werner-Reiss, U, Underhill, AM, and Groh, JM.
          • 2004.
          • The representation of auditory space in core auditory cortex of primates maintaining fixation.
          • Soc. Neurosci. Abstr.
          • .
          • Kelly, KA, Werner-Reiss, U, Underhill, AM and Groh, JM.
          • 2003.
          • Eye position signals change shape along the primate auditory pathway.
          • Soc. Neurosci. Abstr.
          • .
          • Metzger, RR, Mullette-Gillman, OA, Underhill, AM, Cohen, YE and Groh, JM.
          • 2003.
          • Effect of initial eye position on saccades to auditory targets in monkeys.
          • Soc. Neurosci. Abstr.
          • .
          • Mullette-Gillman, OA, Cohen, YE and Groh, JM.
          • 2003.
          • Similar eye position influences on auditory and visual responses in the lateral intraparietal area, LIP, of primates.
          • Soc. Neurosci. Abstr.
          • .
          • Werner-Reiss, U, Kelly, KA, Underhill, AM and Groh, JM.
          • 2003.
          • Long inter-stimulus intervals affect responses in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • .
          • Kelly, KA, Werner-Reiss, U, Underhill, AM, and Groh, JM.
          • 2002.
          • Eye position affects a wide range of auditory cortical neurons in primates.
          • Soc. Neurosci. Abstr.
          • 845.1
          • .
          • Mullette-Gillman, OA, Cohen, YE, and Groh, JM.
          • 2002.
          • Assessing the spatial alignment of auditory and visual responses in the intraparietal sulcus.
          • Soc. Neurosci. Abstr.
          • 57.19
          • .
          • Metzger, RR and Groh, JM.
          • 2002.
          • Linking primate inferior colliculus neural activity to sound localization performance.
          • Soc. Neurosci. Abstr.
          • 845.2
          • .
          • Groh, JM, Underhill, AM.
          • 2001.
          • Coding of sound location in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.1
          • .
          • Metzger, RR, Underhill, AM and Groh, JM.
          • 2001.
          • Time course of eye position influence in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.3
          • .
          • Werner-Reiss, U, Kelly, KA, Underhill, AM and Groh, JM.
          • 2001.
          • Eye position tuning in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • 27:
          • 60.2
          • .
          • Trause, AS, Werner-Reiss, U, Underhill, AM, Groh, JM.
          • 2000.
          • Effects of eye position on auditory signals in primate auditory cortex.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1977
          • .
          • Clark, KR, Trause, AS, Underhill, AM, Groh, JM.
          • 2000.
          • Effects of eye position on auditory signals in primate inferior colliculus.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1977
          • .
          • Boucher, L, Groh, JM, and Hughes, HC.
          • 2000.
          • Oculomotor localization of perisaccadic auditory targets.
          • Soc. Neurosci. Abstr.
          • 26:
          • 1329
          • .
          • Boucher, L, Groh, JM, and Hughes, HC.
          • 1999.
          • Contributions of visual processing delays to mislocalization of perisaccadic stimuli.
          • Soc. Neurosci. Abstr.
          • 25:
          • .
          • Born, RT, Zhao, R, Lukasewycz, SJ, and Groh, JM.
          • 1999.
          • Representation of figure and ground in visual area MT.
          • Soc. Neurosci. Abstr.
          • .
          • Groh, JM.
          • 1997.
          • A model for transforming signals from a place code to a rate code.
          • Soc. Neurosci. Abstr.
          • 23:
          • 1560
          • .
          • Groh, JM, Born, RT, and Newsome, WT.
          • 1996.
          • A comparison of the effects of microstimulation in area MT on saccades and smooth pursuit eye movements.
          • Invest. Ophthal. Vis. Sci.
          • 37:
          • S472
          • .
          • Groh, JM, Born, RT, and Newsome, WT.
          • 1995.
          • Microstimulation of area MT affects both saccades and smooth pursuit eye movements.
          • Soc. Neurosci. Abstr.
          • 21:
          • 281
          • .
          • Born, RT, Groh, JM, and Newsome, WT.
          • 1995.
          • Functional architecture of primate area MT probed with microstimulation: effects on eye movements.
          • Soc. Neurosci. Abstr.
          • 21:
          • 281
          • .
          • Shadlen, MN, Groh, JM, Salzman, CD and Newsome, WT.
          • 1994.
          • Responses of LIP neurons during a motion discrimination task: a decision process in action?
          • Soc. Neurosci. Abstr.
          • 20:
          • 1279
          • .
          • Groh, JM.
          • 1993.
          • Coordinate transformations, sensorimotor integration, and the neural basis of saccades to somatosensory targets.
          • .
          • Groh, JM and Sparks, DL.
          • 1993.
          • Motor activity in the primate superior colliculus (SC) during saccades to somatosensory and visual targets.
          • Invest. Ophthal. Vis. Sci.
          • 34:
          • 1137
          • .
          • Glimcher, PW, Groh, JM and Sparks, DL.
          • 1993.
          • Low-frequency collicular stimulation specifies saccadic amplitude gradually.
          • Invest. Ophthal. Vis. Sci.
          • 34:
          • 1137
          • .
          • Groh, JM and Sparks, DL.
          • 1993.
          • Somatosensory activity in the superior colliculus (SC) influenced by eye position.
          • Soc. Neurosci. Abstr.
          • 19:
          • 858
          • .
          • Groh, JM and Sparks, DL.
          • 1992.
          • Characteristics of saccades to somatosensory targets.
          • Soc. Neurosci. Abstr.
          • 18:
          • 701
          • .
          • Groh, JM and Sparks, DL.
          • 1991.
          • A model for transforming auditory signals from head-centered to eye-centered coordinates.
          • Soc. Neurosci. Abstr.
          • 17:
          • 458
          • .
          • Aldridge, JW, Thompson, JF, Walters, EA, Groh, JM and Gilman, S.
          • 1991.
          • Neostriatal unit activity related to movement preparation in a go/no-go task in the cat.
          • Soc. Neurosci. Abstr.
          • 17:
          • 1217
          • .
          • Groh, JM.
          • 1988.
          • Bachelor male feral horses: characteristics of group living and aggression.
          • .