Are real tongue movements easier to speech read than synthesized?
2009 (English). In: INTERSPEECH 2009: 10th Annual Conference of the International Speech Communication Association, Baixas: ISCA - Inst. Speech Communication Assoc., 2009, p. 824-827. Conference paper, published paper (refereed)
Abstract [en]
Speech perception studies using augmented reality displays in talking heads have shown that tongue-reading abilities are weak initially, but that subjects become able to extract some information from intra-oral visualizations after a short training session. In this study, we investigate how the nature of the tongue movements influences the results by comparing synthetic rule-based movements with actual, measured movements. The subjects were significantly better at perceiving sentences accompanied by real movements, indicating that the current coarticulation model, developed for facial movements, is not optimal for the tongue.
Place, publisher, year, edition, pages
Baixas: ISCA - Inst. Speech Communication Assoc., 2009. p. 824-827
Keywords [en]
multimodal speech perception, augmented reality, visual speech synthesis
National Category
Computer and Information Sciences; Communication Studies; General Language Studies and Linguistics
Identifiers
URN: urn:nbn:se:kth:diva-29881
ISI: 000276842800206
Scopus ID: 2-s2.0-70450207970
OAI: oai:DiVA.org:kth-29881
DiVA id: diva2:399049
Conference
10th INTERSPEECH Conference (INTERSPEECH 2009), Brighton, England, September 6-10, 2009
Note
QC 20110221