Rao-Blackwellization of Particle Markov Chain Monte Carlo Methods Using Forward Filtering Backward Sampling
2011 (English). In: IEEE Transactions on Signal Processing, ISSN 1053-587X, E-ISSN 1941-0476, Vol. 59, no. 10, pp. 4606-4619. Article in journal (Refereed). Published.
Smoothing in state-space models amounts to computing the conditional distribution of the latent state trajectory given the observations, or expectations of functionals of the state trajectory with respect to this distribution. In recent years there has been increased interest in Monte Carlo-based methods, often involving particle filters, for approximate smoothing in nonlinear and/or non-Gaussian state-space models. One such method is to approximate the filter distributions using a particle filter and then, using backward kernels, to simulate a state trajectory backwards over the set of particles. We show that by simulating multiple realizations of the particle filter and adding a Metropolis-Hastings step, one obtains a Markov chain Monte Carlo scheme whose stationary distribution is the exact smoothing distribution. This procedure expands upon a similar one recently proposed by Andrieu, Doucet, Holenstein, and Whiteley. We also show that simulating multiple trajectories from each realization of the particle filter can be beneficial from the perspective of variance versus computation time, and we illustrate this idea using two examples.
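The forward filtering backward sampling idea described in the abstract can be sketched as follows, here for a hypothetical scalar linear-Gaussian model chosen purely for illustration (the model, parameter values, and particle counts below are assumptions, not taken from the paper). A bootstrap particle filter is run forward in time while storing all particles and weights; a smoothed trajectory is then drawn backwards using backward kernels whose weights combine the stored filter weights with the transition density. This shows only the backward-simulation building block, not the full Metropolis-Hastings scheme of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar linear-Gaussian state-space model (illustrative only):
#   x_t = a * x_{t-1} + v_t,  v_t ~ N(0, q)
#   y_t = x_t + e_t,          e_t ~ N(0, r)
a, q, r = 0.9, 0.5, 1.0
T, N = 50, 200  # number of time steps and particles (arbitrary choices)

# Simulate synthetic data from the model
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x_true + rng.normal(0.0, np.sqrt(r), T)

def log_gauss(x, mean, var):
    """Log-density of N(mean, var) evaluated at x (elementwise)."""
    return -0.5 * (np.log(2.0 * np.pi * var) + (x - mean) ** 2 / var)

# --- Forward pass: bootstrap particle filter, storing particles and weights ---
particles = np.zeros((T, N))
logw = np.zeros((T, N))
particles[0] = rng.normal(0.0, 1.0, N)          # draw from a prior on x_0
logw[0] = log_gauss(y[0], particles[0], r)      # observation weights
for t in range(1, T):
    w = np.exp(logw[t - 1] - logw[t - 1].max())
    w /= w.sum()
    idx = rng.choice(N, N, p=w)                 # multinomial resampling
    particles[t] = a * particles[t - 1][idx] + rng.normal(0.0, np.sqrt(q), N)
    logw[t] = log_gauss(y[t], particles[t], r)

# --- Backward pass: sample one trajectory using the backward kernels ---
def backward_sample():
    traj = np.zeros(T)
    w = np.exp(logw[-1] - logw[-1].max())
    w /= w.sum()
    traj[-1] = particles[-1, rng.choice(N, p=w)]
    for t in range(T - 2, -1, -1):
        # Backward weight: filter weight times transition density f(x_{t+1}|x_t)
        lw = logw[t] + log_gauss(traj[t + 1], a * particles[t], q)
        w = np.exp(lw - lw.max())
        w /= w.sum()
        traj[t] = particles[t, rng.choice(N, p=w)]
    return traj

# Drawing multiple trajectories from one filter realization, as the abstract
# suggests, amortizes the cost of the forward pass.
trajs = np.array([backward_sample() for _ in range(10)])
smooth_mean = trajs.mean(axis=0)
```

In the full method of the paper, trajectories drawn this way from independent particle filter realizations are combined with a Metropolis-Hastings accept/reject step so that the chain targets the exact smoothing distribution.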
Place, publisher, year, edition, pages
2011. Vol. 59, no. 10, pp. 4606-4619.
Keywords
Computational efficiency, Monte Carlo methods, nonlinear filters, particle filters, state estimation
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-52557
DOI: 10.1109/TSP.2011.2161296
ISI: 000297111500009
ScopusID: 2-s2.0-80052890727
OAI: oai:DiVA.org:kth-52557
DiVA: diva2:467403
QC 20111219. 2011-12-19. Bibliographically approved.