An Investigation on Mutual Information for the Linear Predictive System and the Extrapolation of Speech Signals.
2012 (English). In: Speech Communication; 10. ITG Symposium; Proceedings of, 2012, pp. 1-4. Conference paper (Refereed).
Mutual information (MI) is an important information theoretic concept which has many applications in telecommunications, in blind source separation, and in machine learning. More recently, it has also been employed for the instrumental assessment of speech intelligibility, where traditionally correlation-based measures are used. In this paper, we address the difference between MI and correlation from the viewpoint of discovering dependencies between variables in the context of speech signals. We perform our investigation by considering the linear predictive approximation and the extrapolation of speech signals as examples. We compare a parametric MI estimation approach based on a Gaussian mixture model (GMM) with the k-nearest neighbor (KNN) approach, a well-known non-parametric method for estimating the MI. We show that the GMM-based MI estimator leads to more consistent results.
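The two estimator families contrasted in the abstract can be illustrated on toy data. The sketch below is not the authors' setup; it is a minimal, hedged example that compares a non-parametric KNN-based MI estimate (scikit-learn's `mutual_info_regression`, which uses a KNN estimator) against a simple GMM-based parametric estimate, on jointly Gaussian data where the true MI has the closed form MI = -0.5 * log(1 - rho^2). The GMM estimate here approximates MI as the sample mean of log p(x,y) - log p(x) - log p(y) under fitted mixtures; component counts and sample sizes are arbitrary choices.

```python
# Toy comparison of KNN vs GMM-based MI estimation (illustrative only,
# not the paper's experimental setup).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
rho, n = 0.8, 5000

# Jointly Gaussian data with known correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
x, y = xy[:, 0], xy[:, 1]

# Closed-form MI for a bivariate Gaussian (in nats).
true_mi = -0.5 * np.log(1.0 - rho ** 2)

# Non-parametric estimate: k-nearest-neighbor based estimator.
knn_mi = mutual_info_regression(
    x.reshape(-1, 1), y, n_neighbors=3, random_state=0)[0]

# Parametric estimate: fit GMMs to the joint and the two marginals,
# then approximate MI = E[log p(x,y) - log p(x) - log p(y)] by a
# Monte Carlo average over the observed samples.
joint = GaussianMixture(n_components=2, random_state=0).fit(xy)
marg_x = GaussianMixture(n_components=2, random_state=0).fit(x.reshape(-1, 1))
marg_y = GaussianMixture(n_components=2, random_state=0).fit(y.reshape(-1, 1))
gmm_mi = np.mean(joint.score_samples(xy)
                 - marg_x.score_samples(x.reshape(-1, 1))
                 - marg_y.score_samples(y.reshape(-1, 1)))

print(f"true MI = {true_mi:.3f} nats")
print(f"KNN MI  = {knn_mi:.3f} nats")
print(f"GMM MI  = {gmm_mi:.3f} nats")
```

Both estimates should land near the analytic value; on more complex, multimodal speech-derived quantities the paper's point is that the parametric GMM route can behave more consistently than the KNN estimator.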
Place, publisher, year, edition, pages
2012. 1-4 p.
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-105282
OAI: oai:DiVA.org:kth-105282
DiVA: diva2:570594
ITG Conference on Speech Communication
QC 20130524. Available from: 2012-11-20. Created: 2012-11-20. Last updated: 2013-05-24. Bibliographically approved.