Functional magnetic resonance imaging (fMRI) is a technique that allows human brain activity to be measured non-invasively with high spatial and temporal precision. Using fMRI, we study the neural mechanisms underlying cognition and perception. The two main themes of our research are visual motion perception and multisensory integration. Our visual system is remarkably adept at extracting information from the moving objects that surround us every day.
|A lateral view of a human brain showing the results of an fMRI experiment in which the subject viewed moving and static figures. Orange regions were more active when the subject viewed moving figures, while green and blue regions responded similarly to moving and static figures.|
For instance, we must calculate the speed and direction of an incoming ball in order to catch it, or determine whether the driver in the car next to ours is waving at us in greeting or shaking their fist in anger. Different regions of human lateral temporal cortex are important for processing different kinds of visual motion. Area MT serves as a first stage of motion processing. Higher areas in the superior temporal sulcus are particularly important for processing biological motion, while areas in the middle temporal gyrus are most responsive to the motions of man-made manipulable objects, such as hammering or sawing.
|The visual cortex (located in the back of the head, illustrated in red) responds to the image of the bell, while auditory cortex (in blue) responds to the sound of the bell. Neurons in the superior temporal sulcus (in green) integrate auditory and visual information.|
The goal of our research is to determine how the brain translates rapidly changing visual information into meaningful, actionable concepts such as "wave" or "fist-shake", "hammer" or "saw". This research is important for understanding the difficulties faced by patients who have trouble interpreting biological motion, such as those with autism spectrum disorder, and may also have implications for patients with language-learning impairments, such as those who struggle with the rapid processing required for reading.
In addition to visual information, our brain receives input from other sensory modalities. For instance, even if we cannot see our mobile phone blinking, we can hear its ring or feel its vibration in our pocket. These different modalities are encoded by the brain in very different ways. The auditory system is organized primarily by frequency (high- vs. low-pitched sounds), as is the tactile system (slow strokes vs. quick vibrations). In contrast, the visual system is organized by the spatial location of stimuli: a stroke may destroy our ability to see objects on the left side of a room but not the right. Although each sensory modality is organized fundamentally differently, the brain must integrate the information the modalities provide in order to make decisions, such as whether the phone is ringing or not. Our research has shown that regions of the superior temporal sulcus are especially important for this process of multisensory integration. In the superior temporal sulcus, inputs from the different senses converge onto patches of cortex, allowing multisensory integration to occur.
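A common way to formalize this kind of integration is the standard reliability-weighted (maximum-likelihood) cue-combination model, in which each sensory estimate is weighted by its inverse variance. The sketch below illustrates that general principle only; it is not this lab's specific analysis, and the numbers are hypothetical.

```python
def combine_cues(est_a, var_a, est_b, var_b):
    """Combine two noisy sensory estimates by weighting each
    in proportion to its reliability (inverse variance)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    combined_est = w_a * est_a + w_b * est_b
    # The fused estimate is more reliable than either cue alone:
    combined_var = (var_a * var_b) / (var_a + var_b)
    return combined_est, combined_var

# Hypothetical example: visual and tactile estimates of an object's position.
visual = (10.0, 1.0)   # (estimate, variance): the more reliable cue
tactile = (14.0, 4.0)  # (estimate, variance): the noisier cue
est, var = combine_cues(*visual, *tactile)
# est = 0.8 * 10 + 0.2 * 14 = 10.8; var = 4/5 = 0.8
```

Note that the combined variance (0.8) is smaller than that of either cue alone, which is the behavioral signature of statistically optimal multisensory integration.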
Beauchamp MS, Lee KE, Haxby JV, Martin A. (2002) Parallel visual motion processing streams for manipulable objects and human movements. Neuron 34: 149-159.
Beauchamp MS. (2003) Detection of eye movements from fMRI data. Magn Reson Med 49: 376-380.
Beauchamp MS, Lee KE, Haxby JV, Martin A. (2003) fMRI responses to video and point-light displays of moving humans and manipulable objects. J Cogn Neurosci 15: 991-1001.
Petit L, Beauchamp MS. (2003) Neural basis of visually guided head movements studied with fMRI. J Neurophysiol 89: 2516-2527.
Beauchamp MS, Lee KE, Argall BD, Martin A. (2004) Integration of auditory and visual information about objects in superior temporal sulcus. Neuron 41: 809-823.
Beauchamp MS, Argall BD, Bodurka J, Duyn JH, Martin A. (2004) Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat Neurosci 7: 1190-1192.
Beauchamp MS. (2005) Statistical criteria in fMRI studies of multisensory integration. Neuroinformatics 3: 93-114.
Beauchamp MS. (2005) See me, hear me, touch me: multisensory integration in lateral occipital-temporal cortex. Curr Opin Neurobiol 15: 145-153.
Argall BD, Saad ZS, Beauchamp MS. (2006) Simplified intersubject averaging on the cortical surface using SUMA. Hum Brain Mapp 27: 14-27.
Beauchamp MS, Martin A. (2007) Grounding object concepts in perception and action: evidence from fMRI studies of tools. Cortex 43: 461-468.
Simmons WK, Ramjee V, Beauchamp MS, McRae K, Martin A, Barsalou LW. (2007) A common neural substrate for perceiving and knowing about color. Neuropsychologia 45: 2802-2810.
Beauchamp MS, Yasar NE, Kishan N, Ro T. (2007) Human MST but not MT responds to tactile stimulation. J Neurosci 27: 8261-8267.
Ro T, Farne A, Johnson R, Wedeen V, Chu Z, Wang Z, Hunter J, Beauchamp MS. (2007) Feeling sounds after a thalamic lesion. Ann Neurol 62: 433-441.
Murphey DK, Yoshor D, Beauchamp MS. (2008) Perception matches selectivity in the human anterior color center. Curr Biol 18: 216-220.
Beauchamp MS, Yasar NE, Frye RE, Ro T. (2008) Touch, sound and vision in human superior temporal sulcus. Neuroimage 41: 1011-1020.
Beauchamp MS, Ro T. (2008) Neural substrates of sound-touch synesthesia after a thalamic lesion. J Neurosci 28: 13696-13702.
Saad ZS, Glen DR, Chen G, Beauchamp MS, Desai R, Cox RW. (2009) A new method for improving functional-to-structural MRI alignment using local Pearson correlation. Neuroimage 44: 839-848.
Frye RE, Beauchamp MS. (2009) Receptive language organization in high-functioning autism. J Child Neurol 24: 231-236.
Murphey DK, Maunsell JH, Beauchamp MS, Yoshor D. (2009) Perceiving electrical stimulation of identified human visual areas. Proc Natl Acad Sci U S A 106: 5389-5393.
Ro T, Hsu J, Yasar NE, Elmore LC, Beauchamp MS. (2009) Sound enhances touch perception. Exp Brain Res 195: 135-143.
Dulay MF, Murphey DK, Sun P, David YB, Maunsell JH, Beauchamp MS, Yoshor D. (2009) Computer-controlled electrical stimulation for quantitative mapping of human cortical function. J Neurosurg 110: 1300-1303.
Ellmore TM, Beauchamp MS, O'Neill TJ, Dreyer S, Tandon N. (2009) Relationships between essential cortical language sites and subcortical pathways. J Neurosurg 111: 755-766.
Beauchamp MS, Laconte S, Yasar N. (2009) Distributed representation of single touches in somatosensory and visual cortex. Hum Brain Mapp 30: 3163-3171.
Ellmore TM, Beauchamp MS, Breier JI, Slater JD, Kalamangalam GP, O'Neill TJ, Disano MA, Tandon N. (2010) Temporal lobe white matter asymmetry and language laterality in epilepsy patients. Neuroimage 49: 2033-2044.
Frye RE, Liederman J, Malmberg B, McLean J, Strickland D, Beauchamp MS. (2010) Surface area accounts for the relation of gray matter volume to reading-related skills and history of dyslexia. Cereb Cortex (e-pub Feb 12).
Beauchamp MS, Nath AR, Pasalar S. (2010) fMRI-guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. J Neurosci 30: 2414-2417.
Pasalar S, Ro T, Beauchamp MS. (2010) TMS of posterior parietal cortex disrupts visual-tactile multisensory integration. Eur J Neurosci 31: 1783-1790.
Beauchamp MS, Pasalar S, Ro T. (2010) Neural substrates of reliability-weighted visual-tactile multisensory integration. Front Syst Neurosci 4: 25.
Sevy ABG, Bortfeld H, Huppert TJ, Beauchamp MS, Tonini RE, Oghalai JS. (2010) Neuroimaging with near-infrared spectroscopy demonstrates speech-evoked activity in the auditory cortex of deaf children following cochlear implantation. Hear Res 270: 39-47.
Nath AR, Beauchamp MS. (2011) Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech. J Neurosci 31: 1704-1714.
Nath AR, Beauchamp MS. (2011) A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion. Neuroimage (e-pub Jul 20).
Beauchamp MS, Beurlot MR, Fava EE, Nath AR, Parikh NA, Saad ZS, Bortfeld H, Oghalai JS. (2011) The developmental trajectory of brain-scalp distance from birth through childhood: implications for functional neuroimaging. PLoS ONE, doi:10.1371/journal.pone.0024981.
Nath AR, Fava EE, Beauchamp MS. (2011) Neural correlates of interindividual differences in children's audiovisual speech perception. J Neurosci 31: 13963-13971.
Search PubMed for additional articles.