Bayesian Estimation of Beta Mixture Models with Variational Inference
Ma, Zhanyu; Leijon, Arne (KTH, School of Electrical Engineering (EES), Sound and Image Processing)
2011 (English). In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539, Vol. 33, no. 11, pp. 2160-2173. Article in journal (Refereed). Published.
Abstract [en]

Bayesian estimation of the parameters in beta mixture models (BMM) is analytically intractable. Numerical solutions to simulate the posterior distribution are available, but incur high computational cost. In this paper, we introduce an approximation to the prior/posterior distribution of the parameters in the beta distribution and propose an analytically tractable (closed-form) Bayesian approach to the parameter estimation. The approach is based on the variational inference (VI) framework. Following the principles of the VI framework and utilizing the relative convexity bound, the extended factorized approximation method is applied to approximate the distribution of the parameters in BMM. In a fully Bayesian model where all the parameters of the BMM are considered as variables and assigned proper distributions, our approach can asymptotically find the optimal estimate of the parameters' posterior distribution. Also, the model complexity can be determined based on the data. The closed-form solution is proposed so that no iterative numerical calculation is required. Meanwhile, our approach avoids the drawback of overfitting in the conventional expectation maximization algorithm. The good performance of this approach is verified by experiments with both synthetic and real data.
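
For a concrete handle on the model being estimated, the short Python sketch below evaluates a beta mixture model density and the per-component responsibilities that both the conventional EM algorithm and a VI-style estimator operate on. It is an illustration only, not the paper's closed-form variational solution; the two-component weights and shape parameters are arbitrary placeholders.

    # Minimal sketch (not the paper's VI algorithm): a beta mixture model (BMM)
    # density and the per-component responsibilities used by EM/VI estimators.
    import numpy as np
    from scipy.stats import beta

    # Hypothetical 2-component BMM: mixing weights and beta shape parameters.
    weights = np.array([0.4, 0.6])
    shapes = [(2.0, 5.0), (8.0, 2.0)]          # (a_k, b_k) for each component

    def bmm_pdf(x, weights, shapes):
        """Mixture density p(x) = sum_k w_k * Beta(x; a_k, b_k)."""
        comps = np.stack([beta.pdf(x, a, b) for a, b in shapes])
        return weights @ comps

    def responsibilities(x, weights, shapes):
        """Posterior probability of each component given x (the E-step quantity)."""
        comps = np.stack([w * beta.pdf(x, a, b)
                          for w, (a, b) in zip(weights, shapes)])
        return comps / comps.sum(axis=0, keepdims=True)

    x = np.linspace(0.01, 0.99, 5)
    print(bmm_pdf(x, weights, shapes))
    print(responsibilities(x, weights, shapes))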

Place, publisher, year, edition, pages
2011. Vol. 33, no. 11, pp. 2160-2173.
Keyword [en]
Bayesian Estimation, Maximum Likelihood Estimation, Beta Distribution, Mixture Modeling, Variational Inference, Factorized Approximation
National Category
Computer and Information Science
Research subject
SRA - ICT
Identifiers
URN: urn:nbn:se:kth:diva-33677
DOI: 10.1109/TPAMI.2011.63
ISI: 000294910000004
Scopus ID: 2-s2.0-80053127168
OAI: oai:DiVA.org:kth-33677
DiVA: diva2:416992
Note
QC 20110929. Available from: 2011-05-16. Created: 2011-05-13. Last updated: 2017-12-11. Bibliographically approved.
In thesis
1. Non-Gaussian Statistical Models and Their Applications
2011 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Statistical modeling plays an important role in various research areas. It provides a way to connect the data with the statistics. Based on the statistical properties of the observed data, an appropriate model can be chosen that leads to a promising practical performance. The Gaussian distribution is the most popular and dominant probability distribution used in statistics, since it has an analytically tractable Probability Density Function (PDF) and analysis based on it can be derived in an explicit form. However, various data in real applications have bounded support or semi-bounded support. As the support of the Gaussian distribution is unbounded, such data are obviously not Gaussian distributed. Thus we can apply non-Gaussian distributions, e.g., the beta distribution or the Dirichlet distribution, to model the distribution of this type of data. The choice of a suitable distribution is favorable for modeling efficiency. Furthermore, the practical performance based on the statistical model can also be improved by better modeling.
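
As a small, self-contained illustration of the bounded-support point above (synthetic data, not taken from the thesis), the Python snippet below fits both a Gaussian and a beta distribution to samples confined to [0, 1] and reports how much probability mass the Gaussian fit places outside that interval.

    # Illustration with assumed synthetic data: a Gaussian fitted to data on
    # [0, 1] leaks probability mass outside the support, while a beta fit does not.
    import numpy as np
    from scipy.stats import beta, norm

    rng = np.random.default_rng(0)
    data = rng.beta(2.0, 8.0, size=2000)           # bounded-support samples in (0, 1)

    mu, sigma = norm.fit(data)                     # ML Gaussian fit
    a, b, _, _ = beta.fit(data, floc=0, fscale=1)  # ML beta fit on fixed support [0, 1]

    leak = norm.cdf(0.0, mu, sigma) + (1.0 - norm.cdf(1.0, mu, sigma))
    print(f"Gaussian mass outside [0, 1]: {leak:.3f}")
    print(f"Beta shape estimates: a={a:.2f}, b={b:.2f}")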

An essential part of statistical modeling is to estimate the values of the parameters in the distribution, or to estimate the distribution of the parameters if we consider them as random variables. Unlike the Gaussian distribution or the corresponding Gaussian Mixture Model (GMM), a non-Gaussian distribution or a mixture of non-Gaussian distributions does not, in general, have an analytically tractable solution. In this dissertation, we study several estimation methods for non-Gaussian distributions. For the Maximum Likelihood (ML) estimation, a numerical method is utilized to search for the optimal solution in the estimation of the Dirichlet Mixture Model (DMM). For the Bayesian analysis, we utilize some approximations to derive an analytically tractable solution for approximating the distribution of the parameters. The Variational Inference (VI) framework based method has been shown by several researchers to be efficient for approximating the parameter distribution. Under this framework, we adapt the conventional Factorized Approximation (FA) method to the Extended Factorized Approximation (EFA) method and use it to approximate the parameter distribution of the beta distribution. Also, the Local Variational Inference (LVI) method is applied to approximate the predictive distribution of the beta distribution. Finally, by assigning a beta distribution to each element in the matrix, we propose a variational Bayesian Nonnegative Matrix Factorization (NMF) for bounded support data.
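
The Python sketch below illustrates only the simplest instance of the "numerical search for the ML solution" idea mentioned above: maximum-likelihood fitting of a single beta distribution by generic numerical optimization. It is not the thesis's DMM procedure or its variational derivations; function names and settings are illustrative.

    # Sketch of ML estimation by numerical optimization for one beta distribution.
    # Parameters are optimized on the log scale to keep a, b > 0.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    def neg_log_lik(log_params, x):
        """Negative beta log-likelihood in terms of log(a), log(b)."""
        a, b = np.exp(log_params)
        return -np.sum((a - 1) * np.log(x) + (b - 1) * np.log1p(-x)
                       + gammaln(a + b) - gammaln(a) - gammaln(b))

    rng = np.random.default_rng(1)
    x = rng.beta(3.0, 7.0, size=1000)              # synthetic bounded-support data

    res = minimize(neg_log_lik, x0=np.zeros(2), args=(x,), method="Nelder-Mead")
    a_hat, b_hat = np.exp(res.x)
    print(f"ML estimates: a={a_hat:.2f}, b={b_hat:.2f}")   # should be close to (3, 7)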

The performance of the proposed non-Gaussian model based methods is evaluated in several experiments. The beta distribution and the Dirichlet distribution are applied to model the Line Spectral Frequency (LSF) representation of the Linear Prediction (LP) model for statistical model based speech coding. The beta distribution is also applied in some image processing applications. The proposed beta distribution based variational Bayesian NMF is applied to image restoration and collaborative filtering. Compared to some conventional statistical model based methods, the non-Gaussian model based methods show a promising improvement.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2011. xii, 49 p.
Series
Trita-EE, ISSN 1653-5146
National Category
Telecommunications; Computer and Information Science
Identifiers
urn:nbn:se:kth:diva-47408 (URN)
978-91-7501-158-5 (ISBN)
Public defence
2011-12-05, E1, Lindstedsvägen 3, KTH, Stockholm, 09:00 (English)
Note
QC 20111115. Available from: 2011-11-15. Created: 2011-11-08. Last updated: 2011-11-15. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus
