The effects of auditory, visual, and gestural information on the perception of Mandarin tones

Resource type: Thesis
Thesis type: (Thesis) M.A.
Date created: 2017-08-04
Authors/Contributors
Abstract
In multimodal speech perception, strategic connections between auditory and visual-spatial events can aid in the disambiguation of speech sounds. This study examines how co-speech hand gestures mimicking pitch contours in space affect non-native Mandarin tone perception. Native English and native Mandarin perceivers identified tones presented with either congruent (C) or incongruent (I) Audio+Face (AF) and Audio+Face+Gesture (AFG) input. Mandarin perceivers performed at ceiling in the congruent conditions but gave partially gesture-based responses in AFG-I, revealing that gestures were perceived as valid cues for tone. The English group performed better in congruent than in incongruent AF and AFG conditions, and their identification rates were highly skewed towards the visually cued tone when gesture was present (AFG) compared with the AF conditions. These results indicate positive effects of facial and especially gestural input on non-native tone perception, suggesting that crossmodal resources can be recruited to aid auditory perception when phonetic demands are high.
Document
Identifier: etd10284
Copyright statement: Copyright is held by the author.
Permissions: This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Scholarly level
Supervisor or Senior Supervisor: Wang, Yue (Thesis advisor)
Member of collection
Download file: etd10284_BHannah.pdf (1.35 MB)
