BR-SNIS: Bias Reduced Self-Normalized Importance Sampling
2022 (English). In: Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022. Neural Information Processing Systems Foundation, 2022. Conference paper, published paper (refereed).
Abstract [en]
Importance Sampling (IS) is a method for approximating expectations under a target distribution using independent samples from a proposal distribution and the associated importance weights. In many applications, the target distribution is known only up to a normalization constant, in which case self-normalized IS (SNIS) can be used. While the use of self-normalization can have a positive effect on the dispersion of the estimator, it introduces bias. In this work, we propose a new method, BR-SNIS, whose complexity is essentially the same as that of SNIS and which significantly reduces bias without increasing the variance. This method is a wrapper in the sense that it uses the same proposal samples and importance weights as SNIS, but makes clever use of iterated sampling-importance resampling (i-SIR) to form a bias-reduced version of the estimator. We furnish the proposed algorithm with rigorous theoretical results, including new bias, variance and high-probability bounds, and these are illustrated by numerical examples.
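For context, and with notation chosen here rather than taken from the paper: given i.i.d. proposal draws $X_1,\dots,X_N \sim \lambda$ and unnormalized importance weights $w(X_i) \propto \mathrm{d}\pi/\mathrm{d}\lambda(X_i)$, the standard self-normalized IS estimator of $\pi(f)$ is
$$\widehat{\pi}_N^{\mathrm{SNIS}}(f) \;=\; \sum_{i=1}^{N} \frac{w(X_i)}{\sum_{j=1}^{N} w(X_j)}\, f(X_i).$$
The random normalizer in the denominator makes this estimator consistent but biased, with bias of order $O(1/N)$; this is the bias that BR-SNIS is designed to reduce while reusing the same samples and weights.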
Place, publisher, year, edition, pages
Neural Information Processing Systems Foundation, 2022.
National Category
Probability Theory and Statistics; Neurosciences
Identifiers
URN: urn:nbn:se:kth:diva-335770
Scopus ID: 2-s2.0-85142392511
OAI: oai:DiVA.org:kth-335770
DiVA, id: diva2:1795509
Conference
36th Conference on Neural Information Processing Systems, NeurIPS 2022, New Orleans, United States of America, Nov 28 2022 - Dec 9 2022
Note
Part of ISBN 9781713871088
QC 20230908
Available from: 2023-09-08 Created: 2023-09-08 Last updated: 2023-09-08 Bibliographically approved