BR-SNIS: Bias Reduced Self-Normalized Importance Sampling
Centre de Mathématiques Appliquées, Ecole Polytechnique; IHU Liryc, Fondation Bordeaux Université, Univ Bordeaux; CRCTB U4045, INSERM.
HSE University.
UMR MIA, AgroParisTech.
Centre de Mathématiques Appliquées, Ecole Polytechnique.
2022 (English). In: Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022. Neural Information Processing Systems Foundation, 2022. Conference paper, published paper (refereed).
Abstract [en]

Importance Sampling (IS) is a method for approximating expectations under a target distribution using independent samples from a proposal distribution and the associated importance weights. In many applications, the target distribution is known only up to a normalization constant, in which case self-normalized IS (SNIS) can be used. While the use of self-normalization can have a positive effect on the dispersion of the estimator, it introduces bias. In this work, we propose a new method, BR-SNIS, whose complexity is essentially the same as that of SNIS and which significantly reduces bias without increasing the variance. This method is a wrapper in the sense that it uses the same proposal samples and importance weights as SNIS, but makes clever use of iterated sampling-importance resampling (i-SIR) to form a bias-reduced version of the estimator. We furnish the proposed algorithm with rigorous theoretical results, including new bias, variance and high-probability bounds, and these are illustrated by numerical examples.
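The record itself carries no code. Purely as a minimal illustration of the plain SNIS estimator that the abstract starts from (not of the paper's BR-SNIS procedure), a sketch in Python follows; the unnormalized target, the Gaussian proposal, and all parameter values are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target_unnorm(x):
    # Hypothetical target known only up to a constant: a standard
    # Gaussian density with its normalizer dropped.
    return -0.5 * x**2

def snis_estimate(f, n_samples=10_000, sigma=2.0):
    # Independent draws from the proposal q = N(0, sigma^2).
    x = rng.normal(0.0, sigma, size=n_samples)
    # Log importance weights log(pi_unnorm / q); additive constants
    # cancel after self-normalization, so q's normalizer is dropped too.
    log_w = log_target_unnorm(x) + 0.5 * (x / sigma) ** 2 + np.log(sigma)
    w = np.exp(log_w - log_w.max())   # stabilize before exponentiating
    w /= w.sum()                      # self-normalization: this step biases the estimator
    return np.sum(w * f(x))

# E[X^2] under the standard Gaussian target is 1.
print(snis_estimate(lambda x: x**2))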

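The abstract also invokes iterated sampling-importance resampling (i-SIR). Again only as a hedged sketch of that generic mechanism, with the same hypothetical target and proposal as above (how BR-SNIS combines such iterates into a bias-reduced estimator is detailed in the paper itself):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target_unnorm(x):
    return -0.5 * x**2   # same hypothetical unnormalized target as above

def isir_step(x_current, n_candidates=64, sigma=2.0):
    # One i-SIR transition: refresh a candidate pool from the proposal
    # N(0, sigma^2), keep the current state as one candidate, then
    # resample the next state in proportion to the importance weights.
    pool = rng.normal(0.0, sigma, size=n_candidates)
    pool[0] = x_current
    log_w = log_target_unnorm(pool) + 0.5 * (pool / sigma) ** 2 + np.log(sigma)
    w = np.exp(log_w - log_w.max())
    return rng.choice(pool, p=w / w.sum())

# Iterating the kernel gives a chain whose marginal law approaches the target.
x = 0.0
for _ in range(1_000):
    x = isir_step(x)
print(x)
```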
Place, publisher, year, edition, pages
Neural Information Processing Systems Foundation, 2022.
National Category
Probability Theory and Statistics; Neurosciences
Identifiers
URN: urn:nbn:se:kth:diva-335770
Scopus ID: 2-s2.0-85142392511
OAI: oai:DiVA.org:kth-335770
DiVA, id: diva2:1795509
Conference
36th Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans, United States of America, November 28 - December 9, 2022
Note

Part of ISBN 9781713871088

QC 20230908

Available from: 2023-09-08. Created: 2023-09-08. Last updated: 2023-09-08. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Scopus

Authority records

Olsson, Jimmy

Search in DiVA

By author/editor
Olsson, Jimmy
By organisation
Mathematical Statistics
Probability Theory and Statistics; Neurosciences

Search outside of DiVA

Google, Google Scholar
