Looking at tongues – can it help in speech perception?
2008 (English). In: Proceedings of FONETIK 2008, The XXIst Swedish Phonetics Conference, 2008, pp. 57–60. Conference paper (Other academic).
This paper examines the contribution of animated intra-oral articulations to speech perception. Eighteen subjects were asked to identify the words in acoustically degraded sentences under three presentation modes: acoustic signal only; audiovisual with a front view of a synthetic face; and audiovisual with both the front view and a side view in which tongue movements were made visible by rendering parts of the cheek transparent. The augmented-reality side view did not improve overall performance compared with the front view alone, but it appears to have benefited the perception of palatal plosives, liquids and rhotics, especially in clusters.
Research subject: Computer Science; Language Technology (Computational Linguistics)
Identifiers: URN: urn:nbn:se:kth:diva-52040; OAI: oai:DiVA.org:kth-52040; DiVA: diva2:465334
Conference: FONETIK 2008, The XXIst Swedish Phonetics Conference, University of Gothenburg, Sweden, June 11–13, 2008
QC 20120109. Bibliographically approved.