Is there a McGurk effect for tongue reading?
2010 (English). In: Proceedings of AVSP: International Conference on Audio-Visual Speech Processing, 2010. Conference paper (Refereed).
Previous studies on tongue reading, i.e., speech perception of degraded audio supported by animations of tongue movements, have indicated that the support is weak initially and that subjects need training to learn to interpret the movements. This paper investigates whether the learning is of the animation templates as such, or whether subjects learn to retrieve articulatory knowledge that they already have. Matching and conflicting animations of tongue movements were presented randomly together with the auditory speech signal at three different levels of noise in a consonant identification test. The average recognition rate over the three noise levels was significantly higher for the matched audiovisual condition than for the conflicting and the auditory-only conditions. Audiovisual integration effects were also found for conflicting stimuli. However, the visual modality is given much less weight in perception than for a normal face view, and inter-subject differences in the use of visual information are large.
Keywords: McGurk, audiovisual speech perception, augmented reality
Research subject: Computer Science; Language Technology (Computational Linguistics)
Identifiers: URN: urn:nbn:se:kth:diva-52167; OAI: oai:DiVA.org:kth-52167; DiVA: diva2:465462
Conference: Auditory-Visual Speech Processing (AVSP) 2010, Hakone, Kanagawa, Japan, September 30 - October 3, 2010.
QC 20120111. Bibliographically approved.