Minimum entropy rate simplification of stochastic processes
KTH, School of Electrical Engineering (EES), Automatic Control. KTH, School of Electrical Engineering (EES), Centres, ACCESS Linnaeus Centre. University of Edinburgh, United Kingdom.
2016 (English) In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539, Vol. PP, no. 99, article id 7416224. Article in journal (Refereed) Published
Abstract [en]

We propose minimum entropy rate simplification (MERS), an information-theoretic, parameterization-independent framework for simplifying generative models of stochastic processes. Applications include improving model quality for sampling tasks by concentrating the probability mass on the most characteristic and accurately described behaviors while de-emphasizing the tails, and obtaining clean models from corrupted data (nonparametric denoising). This is the opposite of the smoothing step commonly applied to classification models. Drawing on rate-distortion theory, MERS seeks the minimum entropy-rate process under a constraint on the dissimilarity between the original and simplified processes. We particularly investigate the Kullback-Leibler divergence rate as a dissimilarity measure, where, compatible with our assumption that the starting model is disturbed or inaccurate, the simplification rather than the starting model is used for the reference distribution of the divergence. This leads to analytic solutions for stationary and ergodic Gaussian processes and Markov chains. The same formulas are also valid for maximum-entropy smoothing under the same divergence constraint. In experiments, MERS successfully simplifies and denoises models from audio, text, speech, and meteorology.
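As a reading aid only (not part of the record), the rate-distortion-style problem the abstract describes can be sketched as a constrained optimization. The notation below (original process P, candidate simplification Q, entropy rate \bar{H}, divergence rate \bar{D}, budget d) is introduced here for illustration and is not taken from the paper itself:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative sketch of the MERS objective as described in the abstract.
% The simplified process Q with the lowest entropy rate is sought, subject to
% the Kullback-Leibler divergence rate from the original process P staying
% within a budget d; per the abstract, the simplification Q serves as the
% reference distribution of the divergence.
\begin{equation*}
  Q^{\star} \;=\; \operatorname*{arg\,min}_{Q} \; \bar{H}(Q)
  \qquad \text{subject to} \qquad
  \bar{D}\bigl(P \,\|\, Q\bigr) \;\le\; d .
\end{equation*}
\end{document}

Under this reading, the analytic solutions the abstract reports for stationary, ergodic Gaussian processes and Markov chains solve this constrained problem, and the abstract notes that the same formulas also cover maximum-entropy smoothing under the same divergence constraint (i.e., maximizing \bar{H}(Q) instead of minimizing it).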

Place, publisher, year, edition, pages
IEEE, 2016. Vol. PP, no. 99, article id 7416224
Keyword [en]
G.3.e Markov processes, G.3.p Stochastic processes, H.1.1.b Information theory, H.5.5.c Signal analysis, synthesis, and processing, I.2.7.b Language generation, I.5.1.e Statistical models, Electric distortion, Entropy, Image coding, Information theory, Markov processes, Random processes, Signal distortion, Signal theory, Stochastic systems, Analytic solution, Classification models, Dissimilarity measures, Divergence constraint, Gaussian processes, Kullback-Leibler divergence, Language generation, Rate-distortion theory, Stochastic models
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-195490
DOI: 10.1109/TPAMI.2016.2533382
Scopus ID: 2-s2.0-84988424295
OAI: oai:DiVA.org:kth-195490
DiVA: diva2:1049786
Note

QC 20161125

Available from: 2016-11-25 Created: 2016-11-03 Last updated: 2016-11-25. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Henter, Gustav Eje
By organisation
Automatic Control
ACCESS Linnaeus Centre
In the same journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
On the subject
Electrical Engineering, Electronic Engineering, Information Engineering
