
Queen's University
Speech Perception and Production Lab


Audiovisual Speech Perception

Our work on audiovisual speech perception focuses on three aspects of face-to-face communication:

1. Studies of the visual information for speech. In these studies we analyze facial dynamics and the role they play in speech perception. This work involves detailed kinematic analysis of facial motion and the psychophysics of face perception.

  • Lucero, J., Maciel, S., Johns, D., & Munhall, K.G. (2005). Empirical modeling of human face kinematics during speech using motion clustering. Journal of the Acoustical Society of America, 118, 405-409.
  • Munhall, K.G., Jones, J.A., Callan, D., Kuratate, T., & Vatikiotis-Bateson, E. (2004). Visual prosody and speech intelligibility: Head movement improves auditory speech perception. Psychological Science, 15, 133-137.
  • Munhall, K.G., Kroos, C., Jozan, G. & Vatikiotis-Bateson, E. (2004). Spatial frequency requirements for audiovisual speech perception. Perception and Psychophysics, 66, 574-583.
  • Campbell, R., Zihl, J., Massaro, D., Munhall, K., & Cohen, M. (1997). Speechreading in a patient with severe impairment in visual motion perception (Akinetopsia). Brain, 120, 1793-1803.
  • Munhall, K.G., Gribble, P., Sacco, L., & Ward, M. (1996). Temporal constraints on the McGurk Effect. Perception and Psychophysics, 58, 351-362.


2. Eye movements of perceivers during audiovisual speech perception. In these studies we examine the patterns of eye movements subjects produce while watching and listening to another person speak.

  • Buchan, J.N., Paré, M., & Munhall, K.G. (in press). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience.
  • Paré, M., Richler, R., ten Hove, M., & Munhall, K.G. (2003). Gaze behavior in audiovisual speech perception: The influence of ocular fixations on the McGurk Effect. Perception and Psychophysics, 65, 553-567.
  • Vatikiotis-Bateson, E., Eigsti, I.M., Yano, S., & Munhall, K.G. (1998). Eye movement of perceivers during audiovisual speech perception. Perception and Psychophysics, 60(6), 926-940.


3. The mechanisms underlying cross-modal integration. To study how the perceptual system combines information from different sensory modalities, we make use of an audiovisual illusion called the McGurk Effect. The McGurk Effect (McGurk & MacDonald, 1976) occurs when conflicting consonant information is presented simultaneously to the visual and auditory modalities; the result is the perception of a third, distinct consonant. In our studies, an audio /aba/ dubbed onto a visual /aga/ typically yields the percept /ada/. Our lab has manipulated timing and spatial variables within the McGurk paradigm.

  • Munhall, K.G. & Vatikiotis-Bateson, E. (2004). Spatial and temporal constraints on audiovisual speech perception. In G. Calvert, C. Spence, & B. Stein (Eds.), The Handbook of Multisensory Processes. Cambridge, MA: MIT Press.
  • Callan, D., Jones, J.A., Munhall, K.G., Kroos, C., Callan, A. & Vatikiotis-Bateson, E. (2004). Multisensory-integration sites identified by perception of spatial wavelet filtered visual speech gesture information. Journal of Cognitive Neuroscience, 16, 805-816.
  • Munhall, K.G., Gribble, P., Sacco, L., & Ward, M. (1996). Temporal constraints on the McGurk Effect. Perception and Psychophysics, 58, 351-362.
  • Jones, J.A. & Munhall, K.G. (1997). The effects of separating auditory and visual sources on audiovisual integration of speech. Canadian Acoustics, 25(4), 13-19.
  • Munhall, K.G. & Tohkura, Y. (1998). Audiovisual gating and the time course of speech perception. Journal of the Acoustical Society of America, 104, 530-539.


Kingston, Ontario, Canada. K7L 3N6. 613.533.2000