
Link to record
http://kth.diva-portal.org/smash/person.jsf?pid=authority-person:32352

# Olsson, Jimmy (orcid.org/0000-0003-0772-846X)

## Particle-based online estimation of tangent filters with application to parameter estimation in nonlinear state-space models

### Olsson, Jimmy

### Westerborn, Johan

2020 (English). In: Annals of the Institute of Statistical Mathematics, ISSN 0020-3157, E-ISSN 1572-9052, Vol. 72, no. 2, p. 545-576. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

SPRINGER HEIDELBERG, 2020
##### Keywords

Parameter estimation, Recursive maximum likelihood, State-space models, Tangent filter, Sequential Monte Carlo methods, Central limit theorem, Particle filters
##### National Category

Control Engineering
##### Identifiers

urn:nbn:se:kth:diva-270878 (URN)
10.1007/s10463-018-0698-1 (DOI)
000515352900009 ()
2-s2.0-85056818631 (Scopus ID)

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

##### Abstract [en]

This paper presents a novel algorithm for efficient online estimation of the filter derivatives in general hidden Markov models. The algorithm, which has a linear computational complexity and very limited memory requirements, is furnished with a number of convergence results, including a central limit theorem with an asymptotic variance that can be shown to be uniformly bounded in time. Using the proposed filter derivative estimator, we design a recursive maximum likelihood algorithm updating the parameters according to the gradient of the one-step predictor log-likelihood. The efficiency of this online parameter estimation scheme is illustrated in a simulation study.

##### Note

QC 20200325
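The abstract above concerns online estimation of filter derivatives for recursive maximum likelihood. As a rough, hypothetical illustration of the setting only (not the paper's tangent-filter estimator), the sketch below runs a bootstrap particle filter on a toy linear-Gaussian state-space model and forms a naive finite-difference score of the log-likelihood; the paper's contribution is precisely to replace such costly offline differencing with an online particle estimate of the filter derivative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(a, sigma, n):
    """Toy linear-Gaussian model: X_t = a X_{t-1} + sigma U_t, Y_t = X_t + V_t."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + sigma * rng.standard_normal()
    return x + rng.standard_normal(n)

def particle_loglik(y, a, sigma, n_part=200):
    """Bootstrap particle filter estimate of the log-likelihood."""
    x = np.zeros(n_part)
    ll = 0.0
    for t in range(len(y)):
        x = a * x + sigma * rng.standard_normal(n_part)   # mutate
        logw = -0.5 * (y[t] - x) ** 2                     # N(x, 1) observation density (up to a constant)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                        # one-step predictor likelihood
        x = x[rng.choice(n_part, n_part, p=w / w.sum())]  # multinomial resampling
    return ll

y = simulate(0.7, 1.0, 100)
ll = particle_loglik(y, 0.7, 1.0)
# Naive finite-difference score in a: noisy and offline; the tangent filter
# of the paper targets this quantity recursively instead.
score = (particle_loglik(y, 0.71, 1.0) - particle_loglik(y, 0.69, 1.0)) / 0.02
```

The model, parameter values, and function names are illustrative assumptions; only the bootstrap-filter mechanics are standard.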

Available from: 2020-03-25. Created: 2020-03-25. Last updated: 2020-03-25. Bibliographically approved.

## Posterior consistency for partially observed Markov models

### Douc, Randal

### Olsson, Jimmy

### Roueff, Francois

2020 (English). In: Stochastic Processes and their Applications, ISSN 0304-4149, E-ISSN 1879-209X, Vol. 132, no. 2, p. 733-759. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

ELSEVIER, 2020
##### National Category

Probability Theory and Statistics
##### Identifiers

urn:nbn:se:kth:diva-267740 (URN)
10.1016/j.spa.2019.03.012 (DOI)
000509814500014 ()
2-s2.0-85064636272 (Scopus ID)

##### Affiliations

Univ Paris Saclay, Telecom SudParis, CNRS, UMR 5157, 9 Rue Charles Fourier, F-91000 Evry, France.

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

Univ Paris Saclay, Telecom ParisTech, LTCI, 46 Rue Barrault, F-75013 Paris, France.

##### Abstract [en]

We establish the posterior consistency for parametric, partially observed, fully dominated Markov models. The prior is assumed to assign positive probability to all neighborhoods of the true parameter, for a distance induced by the expected Kullback-Leibler divergence between the parametric family members' Markov transition densities. This assumption is easily checked in general. In addition, we show that the posterior consistency is implied by the consistency of the maximum likelihood estimator. The result is extended to possibly improper priors and non-stationary observations. Finally, we check our assumptions on a linear Gaussian model and a well-known stochastic volatility model.

##### Note

QC 20200218
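The key assumption in the abstract above, that the prior puts positive mass on every Kullback-Leibler neighborhood of the true parameter, can be made concrete on a toy AR(1) model, where the expected KL divergence between transition densities is available in closed form. This is a hypothetical worked example, not taken from the paper:

```python
import numpy as np

def expected_kl(a_star, a):
    """Expected KL divergence between AR(1) transition densities N(a* x, 1)
    and N(a x, 1), averaged over the stationary law N(0, 1/(1 - a*^2)) of
    the true chain (|a*| < 1).  Since KL(N(m1, 1) || N(m2, 1)) = (m1 - m2)^2 / 2,
    this equals (a* - a)^2 / (2 (1 - a*^2))."""
    return (a_star - a) ** 2 / (2.0 * (1.0 - a_star ** 2))

def uniform_prior_mass_of_kl_ball(a_star, eps, n_grid=200001):
    """Mass that a uniform prior on (-1, 1) assigns to the KL ball
    {a : expected_kl(a_star, a) < eps}; it is positive for every eps > 0,
    which is exactly the prior-support condition the theorem requires."""
    a = np.linspace(-0.999, 0.999, n_grid)
    return (expected_kl(a_star, a) < eps).mean() * 0.999  # grid length / prior length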

Available from: 2020-02-18. Created: 2020-02-18. Last updated: 2020-02-18. Bibliographically approved.

## Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods

### Olsson, Jimmy

### Pavlenko, Tatjana

### Rios, Felix Leopoldo

2019 (English). In: Electronic Journal of Statistics, ISSN 1935-7524, E-ISSN 1935-7524, Vol. 13, no. 2, p. 2865-2897. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

INST MATHEMATICAL STATISTICS, 2019
##### Keywords

Structure learning, sequential sampling, decomposable graphical models, particle Gibbs
##### National Category

Probability Theory and Statistics
##### Identifiers

urn:nbn:se:kth:diva-266907 (URN)
10.1214/19-EJS1585 (DOI)
000505695800015 ()
2-s2.0-85073362032 (Scopus ID)

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

##### Abstract [en]

We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle MCMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.

##### Note

QC 20200322
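The particle Gibbs component in the abstract above can be illustrated generically. The sketch below is a hypothetical toy: it uses a scalar AR(1)-plus-noise model rather than the paper's junction-tree space, and omits the backward-kernel refreshment step. It shows the conditional SMC move at the heart of PG, where one reference trajectory is kept alive through resampling and a new trajectory is then drawn from the resulting particle approximation.

```python
import numpy as np

rng = np.random.default_rng(1)

def csmc(y, ref, a=0.7, n_part=50):
    """One conditional-SMC (particle Gibbs) move on a toy AR(1)+noise model:
    slot 0 carries the reference trajectory `ref`; a fresh trajectory is
    traced back through the recorded ancestor indices."""
    T = len(y)
    x = np.zeros((T, n_part))
    anc = np.zeros((T, n_part), dtype=int)
    x[0] = rng.standard_normal(n_part)
    x[0, 0] = ref[0]                       # reference state is forced
    logw = -0.5 * (y[0] - x[0]) ** 2
    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        anc[t] = rng.choice(n_part, n_part, p=w)
        anc[t, 0] = 0                      # reference ancestry is forced
        x[t] = a * x[t - 1, anc[t]] + rng.standard_normal(n_part)
        x[t, 0] = ref[t]
        logw = -0.5 * (y[t] - x[t]) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(n_part, p=w)            # draw one terminal index
    out = np.zeros(T)
    for t in range(T - 1, -1, -1):         # ancestral backward trace
        out[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return out

y_obs = rng.standard_normal(30)
path = np.zeros(30)
for _ in range(5):                         # a few particle Gibbs sweeps
    path = csmc(y_obs, path)
```

All model choices and names here are assumptions for illustration; the paper's sampler operates on junction trees and adds a refreshment step via direct backward-kernel sampling.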

Available from: 2020-03-22. Created: 2020-03-22. Last updated: 2020-03-22. Bibliographically approved.

## Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods

### Olsson, Jimmy

### Pavlenko, Tatjana

### Rios, Felix Leopoldo

2019 (English). In: Electronic Journal of Statistics, ISSN 1935-7524, E-ISSN 1935-7524, Vol. 13, no. 2, p. 2865-2897. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

Institute of Mathematical Statistics, 2019
##### Keywords

Decomposable graphical models, Particle gibbs, Sequential sampling, Structure learning
##### National Category

Mathematics
##### Identifiers

urn:nbn:se:kth:diva-268610 (URN)
10.1214/19-EJS1585 (DOI)
000505695800015 ()
2-s2.0-85073362032 (Scopus ID)

##### Funder

Swedish Research Council, C0595201, 2018-05230
##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

##### Abstract [en]

We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle MCMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.

##### Note

QC 20200505

Available from: 2020-05-05. Created: 2020-05-05. Last updated: 2020-05-05. Bibliographically approved.

## Numerically stable online estimation of variance in particle filters

### Olsson, Jimmy

### Douc, Randal

2019 (English). In: Bernoulli, ISSN 1350-7265, E-ISSN 1573-9759, Vol. 25, no. 2, p. 1504-1535. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

INT STATISTICAL INST, 2019
##### Keywords

asymptotic variance, Feynman-Kac models, hidden Markov models, particle filters, sequential Monte Carlo methods, state-space models, variance estimation
##### National Category

Mathematics
##### Identifiers

urn:nbn:se:kth:diva-247800 (URN)
10.3150/18-BEJ1028 (DOI)
000460420300024 ()
2-s2.0-85064056057 (Scopus ID)

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).

TELECOM SudParis, Dept CITI, 9 Rue Charles Fourier, F-91000 Evry, France.

##### Abstract [en]

This paper discusses variance estimation in sequential Monte Carlo methods, alternatively termed particle filters. The variance estimator that we propose is a natural modification of that suggested by H.P. Chan and T.L. Lai [Ann. Statist. 41 (2013) 2877-2904], which allows the variance to be estimated in a single run of the particle filter by tracing the genealogical history of the particles. However, due to particle lineage degeneracy, the estimator of the mentioned work becomes numerically unstable as the number of sequential particle updates increases. Thus, by tracing only a part of the particles' genealogy rather than the full one, our estimator gains long-term numerical stability at the cost of a bias. The scope of the genealogical tracing is regulated by a lag, and under mild, easily checked model assumptions, we prove that the bias tends to zero geometrically fast as the lag increases. As confirmed by our numerical results, this allows the bias to be tightly controlled also for moderate particle sample sizes.

##### Note

QC 20190401
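The lag-truncated genealogy tracing described above can be sketched in a few lines. The toy below is a simplified, hypothetical variant, not the paper's exact estimator: it runs a bootstrap filter on an AR(1)-plus-noise model and, at each step, groups particles by their ancestor a fixed number of steps back; the sum of squared within-group weighted fluctuations yields a variance estimate whose genealogy never degenerates.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

def pf_lagged_variance(y, lag=5, n_part=200, a=0.7):
    """Bootstrap particle filter returning, for each time step, a
    lag-truncated, genealogy-based estimate of the variance of the
    filter mean (a simplified Chan-Lai-type construction)."""
    x = rng.standard_normal(n_part)
    history = deque(maxlen=lag)                 # last `lag` ancestor maps
    out = []
    for t in range(len(y)):
        logw = -0.5 * (y[t] - x) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est = np.sum(w * x)                     # filter mean estimate
        eve = np.arange(n_part)                 # truncated Eve indices:
        for idx in reversed(history):           # compose maps, newest first
            eve = idx[eve]
        contrib = np.bincount(eve, weights=w * (x - est), minlength=n_part)
        out.append(n_part * np.sum(contrib ** 2))
        idx = rng.choice(n_part, n_part, p=w)   # resample; record the map
        history.append(idx)
        x = a * x[idx] + rng.standard_normal(n_part)
    return np.array(out)

v = pf_lagged_variance(rng.standard_normal(50))
```

The model and the exact normalization are illustrative assumptions; the point is the fixed-size ancestry buffer, which keeps the estimator numerically stable for arbitrarily long runs.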

Available from: 2019-04-01. Created: 2019-04-01. Last updated: 2019-05-16. Bibliographically approved.

## Particle-Based Adaptive-Lag Online Marginal Smoothing in General State-Space Models

### Alenlov, Johan

### Olsson, Jimmy

2019 (English). In: IEEE Transactions on Signal Processing, ISSN 1053-587X, E-ISSN 1941-0476, Vol. 67, no. 21, p. 5571-5582. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

IEEE, 2019
##### Keywords

Smoothing methods, Approximation algorithms, Markov processes, Signal processing algorithms, Monte Carlo methods, Hidden Markov models, Biological system modeling, Sequential Monte Carlo methods, state-space models, marginal smoothing, PaRIS, particle filters, state estimation
##### National Category

Mathematics
##### Identifiers

urn:nbn:se:kth:diva-263673 (URN)
10.1109/TSP.2019.2941066 (DOI)
000492374000002 ()
2-s2.0-85077749795 (Scopus ID)

##### Affiliations

Uppsala Univ, Dept Informat Technol, S-75236 Uppsala, Sweden.

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

##### Abstract [en]

We present a novel algorithm, an adaptive-lag smoother, approximating efficiently, in an online fashion, sequences of expectations under the marginal smoothing distributions in general state-space models. The algorithm evolves recursively a bank of estimators, one for each marginal, in resemblance with the so-called particle-based, rapid incremental smoother (PaRIS). Each estimator is propagated until a stopping criterion, measuring the fluctuations of the estimates, is met. The presented algorithm is furnished with theoretical results describing its asymptotic limit and memory usage.

##### Note

QC 20191108
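A minimal sketch of the adaptive-lag idea described above, under toy assumptions: a scalar AR(1)-plus-noise model, the marginal functional f(x) = x, and a crude one-step fluctuation rule standing in for the paper's stopping criterion.

```python
import numpy as np

rng = np.random.default_rng(3)

def adaptive_lag_smoother(y, tol=1e-4, n_part=200, a=0.7):
    """Each marginal estimate E[X_k | y_0:t] is propagated through the
    particle genealogy until its change between consecutive updates drops
    below `tol`, after which it is frozen and dropped from the active bank."""
    T = len(y)
    x = rng.standard_normal(n_part)
    active = {}                            # k -> per-particle statistic f(X_k)
    frozen = np.full(T, np.nan)
    current = np.full(T, np.nan)
    for t in range(T):
        logw = -0.5 * (y[t] - x) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        active[t] = x.copy()               # new marginal enters the bank
        for k in list(active):
            est = np.sum(w * active[k])
            if not np.isnan(current[k]) and abs(est - current[k]) < tol:
                frozen[k] = est            # stopping criterion met
                del active[k]
            else:
                current[k] = est
        idx = rng.choice(n_part, n_part, p=w)
        x = a * x[idx] + rng.standard_normal(n_part)
        for k in active:                   # propagate surviving statistics
            active[k] = active[k][idx]
    for k in active:                       # freeze whatever remains
        frozen[k] = current[k]
    return frozen

s = adaptive_lag_smoother(rng.standard_normal(40))
```

The stopping rule and tolerance are hypothetical; the paper's criterion and its memory guarantees are more refined, but the bank-of-estimators mechanism is the same.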

Available from: 2019-11-08. Created: 2019-11-08. Last updated: 2020-03-09. Bibliographically approved.

## Efficient particle-based online smoothing in general hidden Markov models: The PaRIS algorithm

### Olsson, Jimmy

### Westerborn, Johan

2017 (English). In: Bernoulli, ISSN 1350-7265, E-ISSN 1573-9759, Vol. 23, no. 3, p. 1951-1996. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

INT STATISTICAL INST, 2017
##### Keywords

central limit theorem, general hidden Markov models, Hoeffding-type inequality, online estimation, particle filter, particle path degeneracy, sequential Monte Carlo, smoothing
##### National Category

Mathematics
##### Identifiers

urn:nbn:se:kth:diva-206226 (URN)
10.3150/16-BEJ801 (DOI)
000398013100017 ()
2-s2.0-85016201786 (Scopus ID)

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.

##### Abstract [en]

This paper presents a novel algorithm, the particle-based, rapid incremental smoother (PaRIS), for efficient online approximation of smoothed expectations of additive state functionals in general hidden Markov models. The algorithm, which has a linear computational complexity under weak assumptions and very limited memory requirements, is furnished with a number of convergence results, including a central limit theorem. An interesting feature of PaRIS, which samples on-the-fly from the retrospective dynamics induced by the particle filter, is that it requires two or more backward draws per particle in order to cope with degeneracy of the sampled trajectories and to stay numerically stable in the long run with an asymptotic variance that grows only linearly with time.

##### Note

QC 20170517
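The PaRIS recursion summarized above can be sketched as follows, for the additive functional h = sum of states in a toy AR(1)-plus-noise model. Direct categorical backward sampling is used here for brevity; the paper's accept-reject implementation of the same draw is what makes the cost linear in the particle number.

```python
import numpy as np

rng = np.random.default_rng(4)

def paris(y, n_tilde=2, n_part=100, a=0.7, sigma=1.0):
    """PaRIS-style online smoother for E[sum_t X_t | y_0:T]: each new
    particle draws `n_tilde` >= 2 backward indices from the retrospective
    dynamics and averages the inherited statistics."""
    x = rng.standard_normal(n_part)
    logw = -0.5 * (y[0] - x) ** 2
    tau = x.copy()                          # tau_0^i = x_0^i
    for t in range(1, len(y)):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_part, n_part, p=w)
        x_new = a * x[idx] + sigma * rng.standard_normal(n_part)
        tau_new = np.empty(n_part)
        for i in range(n_part):
            # backward kernel: proportional to w_j * q(x_j, x_new_i)
            logb = np.log(np.maximum(w, 1e-300)) \
                - 0.5 * ((x_new[i] - a * x) / sigma) ** 2
            b = np.exp(logb - logb.max())
            b /= b.sum()
            js = rng.choice(n_part, n_tilde, p=b)
            tau_new[i] = np.mean(tau[js] + x_new[i])   # additive update
        x, tau = x_new, tau_new
        logw = -0.5 * (y[t] - x) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return float(np.sum(w * tau))

est = paris(rng.standard_normal(30))
```

Model, seed, and parameter values are illustrative assumptions; note how setting `n_tilde=1` would collapse the scheme to a degenerate path-space smoother, which is exactly the instability the two-or-more-draws requirement avoids.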

Available from: 2017-05-17. Created: 2017-05-17. Last updated: 2017-10-06. Bibliographically approved.

## On the use of Markov chain Monte Carlo methods for the sampling of mixture models: a statistical perspective

### Douc, Randal

### Maire, Florian

### Olsson, Jimmy

2015 (English). In: Statistics and Computing, ISSN 0960-3174, E-ISSN 1573-1375, Vol. 25, no. 1, p. 95-110. Article in journal (Refereed). Published.
##### Keywords

Asymptotic variance, Carlin & Chib's pseudo-prior method, Inhomogeneous Markov chains, Metropolisation, Mixture models, Peskun ordering
##### National Category

Computer Sciences; Mathematics
##### Identifiers

urn:nbn:se:kth:diva-161641 (URN)
10.1007/s11222-014-9526-5 (DOI)
000349028500015 ()
2-s2.0-84925482882 (Scopus ID)

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).

##### Abstract [en]

In this paper we study asymptotic properties of different data-augmentation-type Markov chain Monte Carlo algorithms sampling from mixture models comprising discrete as well as continuous random variables. Of particular interest to us is the situation where sampling from the conditional distribution of the continuous component given the discrete component is infeasible. In this context, we advance Carlin & Chib's pseudo-prior method as an alternative way of inferring mixture models and discuss and compare different algorithms based on this scheme. We propose a novel algorithm, the Frozen Carlin & Chib sampler, which is computationally less demanding than any Metropolised Carlin & Chib-type algorithm. The significant gain of computational efficiency is however obtained at the cost of some asymptotic variance. The performance of the algorithm vis-à-vis alternative schemes is, using some recent results obtained in Maire et al. (Ann Stat 42:1483-1510, 2014) for inhomogeneous Markov chains evolving alternatingly according to two different π-reversible Markov transition kernels, investigated theoretically as well as numerically.

##### Note

QC 20150317
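As a toy, hypothetical illustration of Carlin & Chib's pseudo-prior device (not the Frozen sampler proposed in the paper): take two "models" k in {0, 1} with within-model targets N(mu_k, 1) and Gaussian pseudo-priors for the inactive coordinate. The model indicator then has marginal probability 1/2 for each model, and the Gibbs sweep below preserves that law.

```python
import numpy as np

rng = np.random.default_rng(5)

MUS = (-2.0, 2.0)   # within-model means (hypothetical toy setup)
PP_SD = 2.0         # pseudo-prior standard deviation

def sweep(k, xs):
    """One Gibbs sweep of a toy Carlin & Chib sampler on (k, x_0, x_1):
    refresh the active coordinate from its within-model target, the
    inactive one from its pseudo-prior, then the model indicator from
    its full conditional."""
    xs = list(xs)
    xs[k] = MUS[k] + rng.standard_normal()            # active conditional
    j = 1 - k
    xs[j] = MUS[j] + PP_SD * rng.standard_normal()    # pseudo-prior draw
    def logp(m):
        active = -0.5 * (xs[m] - MUS[m]) ** 2
        inactive = -0.5 * ((xs[1 - m] - MUS[1 - m]) / PP_SD) ** 2 - np.log(PP_SD)
        return active + inactive
    p1 = 1.0 / (1.0 + np.exp(logp(0) - logp(1)))      # P(k = 1 | x_0, x_1)
    return int(rng.random() < p1), xs

k, xs = 0, [0.0, 0.0]
ks = []
for _ in range(4000):
    k, xs = sweep(k, xs)
    ks.append(k)
freq = float(np.mean(ks))   # should hover near 1/2
```

All constants are made up for the example; the sketch only shows why pseudo-priors make the indicator update tractable when the within-model conditional cannot absorb the inactive coordinates.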

Available from: 2015-03-17. Created: 2015-03-13. Last updated: 2018-01-11. Bibliographically approved.

## Comparison of asymptotic variances of inhomogeneous Markov chains with application to Markov chain Monte Carlo methods

### Maire, Florian

### Douc, Randal

### Olsson, Jimmy

2014 (English). In: Annals of Statistics, ISSN 0090-5364, E-ISSN 2168-8966, Vol. 42, no. 4, p. 1483-1510. Article in journal (Refereed). Published.
##### Place, publisher, year, edition, pages

Institute of Mathematical Statistics, 2014
##### Keywords

Markov chain Monte Carlo, asymptotic variance, Peskun ordering, inhomogeneous Markov chains, pseudo-marginal algorithms
##### National Category

Probability Theory and Statistics
##### Identifiers

urn:nbn:se:kth:diva-154769 (URN)
10.1214/14-AOS1209 (DOI)
000342481700009 ()
2-s2.0-84987968040 (Scopus ID)

##### Funder

Swedish Research Council, 2011-5577
##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).

##### Abstract [en]

In this paper, we study the asymptotic variance of sample path averages for inhomogeneous Markov chains that evolve alternatingly according to two different π-reversible Markov transition kernels P and Q. More specifically, our main result allows us to compare directly the asymptotic variances of two inhomogeneous Markov chains associated with different kernels P_i and Q_i, i ∈ {0, 1}, as soon as the kernels of each pair (P_0, P_1) and (Q_0, Q_1) can be ordered in the sense of lag-one autocovariance. As an important application, we use this result for comparing different data-augmentation-type Metropolis-Hastings algorithms. In particular, we compare some pseudo-marginal algorithms and propose a novel exact algorithm, referred to as the random refreshment algorithm, which is more efficient, in terms of asymptotic variance, than the Grouped Independence Metropolis-Hastings algorithm and has a computational complexity that does not exceed that of the Monte Carlo Within Metropolis algorithm.

##### Note

QC 20141103
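The objects compared in the abstract above can be simulated directly. A hypothetical toy, not from the paper: a chain that alternates between two random-walk Metropolis kernels, both reversible with respect to N(0, 1), with the asymptotic variance of its sample mean estimated by batch means.

```python
import numpy as np

rng = np.random.default_rng(6)

def rwmh_step(x, step):
    """One random-walk Metropolis step targeting N(0, 1); the kernel is
    reversible with respect to the target for any step size."""
    prop = x + step * rng.standard_normal()
    if np.log(rng.random()) < 0.5 * (x ** 2 - prop ** 2):
        return prop
    return x

def alternating_chain(n, step0, step1):
    """Inhomogeneous chain evolving alternately under two reversible
    kernels -- the setting whose asymptotic variance the paper compares."""
    xs = np.empty(n)
    x = 0.0
    for i in range(n):
        x = rwmh_step(x, step0 if i % 2 == 0 else step1)
        xs[i] = x
    return xs

def batch_means_var(xs, n_batch=50):
    """Batch-means estimate of the asymptotic variance of the sample mean."""
    b = len(xs) // n_batch
    means = xs[: b * n_batch].reshape(n_batch, b).mean(axis=1)
    return b * means.var(ddof=1)

xs = alternating_chain(20000, 2.4, 0.5)
avar = batch_means_var(xs)
```

Step sizes and chain length are arbitrary choices; rerunning with different `(step0, step1)` pairs gives a crude empirical counterpart of the lag-one-autocovariance orderings the theorem turns into variance orderings.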

Available from: 2014-11-03. Created: 2014-10-27. Last updated: 2019-09-06. Bibliographically approved.

## Efficient particle-based online smoothing in general hidden Markov models

### Westerborn, Johan

### Olsson, Jimmy

2014 (English). In: Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, ISSN 1520-6149. Article in journal (Refereed). Published.
##### Keywords

Hidden Markov models, particle filters, smoothing methods, Monte Carlo methods, state estimation
##### National Category

Fluid Mechanics and Acoustics
##### Identifiers

urn:nbn:se:kth:diva-158346 (URN)
10.1109/ICASSP.2014.6855159 (DOI)
000343655308009 ()
2-s2.0-84905270424 (Scopus ID)
##### Conference

IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), MAY 04-09, 2014, Florence, ITALY

##### Affiliations

KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).

##### Abstract [en]

This paper deals with the problem of estimating expectations of sums of additive functionals under the joint smoothing distribution in general hidden Markov models. Computing such expectations is a key ingredient in any kind of expectation-maximization-based parameter inference in models of this sort. The paper presents a computationally efficient algorithm for online estimation of these expectations in a forward manner. The proposed algorithm has a linear computational complexity in the number of particles and does not require old particles and weights to be stored during the computations. The algorithm avoids completely the well-known particle path degeneracy problem of the standard forward smoother. This makes it highly applicable within the framework of online expectation-maximization methods. The simulations show that the proposed algorithm provides the same precision as existing algorithms at a considerably lower computational cost.

##### Note

QC 20150121

Available from: 2015-01-21. Created: 2015-01-07. Last updated: 2017-12-05. Bibliographically approved.