Large deviations for weighted empirical measures and processes arising in importance sampling
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. ORCID iD: 0000-0001-8702-2293
2013 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

This thesis consists of two papers related to large deviation results associated with importance sampling algorithms. As the need for efficient computational methods increases, so does the need for theoretical analysis of simulation algorithms. This thesis is mainly concerned with algorithms using importance sampling. Both papers make theoretical contributions to the development of a new approach for analyzing efficiency of importance sampling algorithms by means of large deviation theory.

In the first paper of the thesis, the efficiency of an importance sampling algorithm is studied using a large deviation result for the sequence of weighted empirical measures that represent the output of the algorithm. The main result is stated in terms of the Laplace principle for the weighted empirical measure arising in importance sampling and it can be viewed as a weighted version of Sanov's theorem. This result is used to quantify the performance of an importance sampling algorithm over a collection of subsets of a given target set as well as quantile estimates. The method of proof is the weak convergence approach to large deviations developed by Dupuis and Ellis.

The second paper studies moderate deviations of the empirical process analogue of the weighted empirical measure arising in importance sampling. Using moderate deviation results for empirical processes, the moderate deviation principle is proved for weighted empirical processes that arise in importance sampling. This result can be thought of as the empirical process analogue of the main result of the first paper, and the proof is established using standard techniques for empirical processes and Banach space valued random variables. The moderate deviation principle for the importance sampling estimator of the tail of a distribution follows as a corollary. From this, moderate deviation results are established for importance sampling estimators of two risk measures: the quantile process and Expected Shortfall. The results are proved using a delta method for large deviations established by Gao and Zhao (2011) together with more classical results from the theory of large deviations.

The thesis begins with an informal discussion of stochastic simulation, in particular importance sampling, followed by short mathematical introductions to large deviations and importance sampling.
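The weighted empirical measure at the heart of both papers has a direct computational counterpart. As a minimal illustrative sketch (not code from the thesis), the following Python snippet estimates the Gaussian tail probability P(X > 4) by drawing from a shifted proposal N(4, 1) and weighting each sample with the likelihood ratio between the target N(0, 1) and the proposal; the threshold, proposal, and sample size are arbitrary choices for illustration.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)

c = 4.0        # rare-event threshold: estimate P(X > c) for X ~ N(0, 1)
n = 100_000    # number of samples

# Draw from the shifted proposal N(c, 1) instead of the target N(0, 1).
y = rng.normal(loc=c, scale=1.0, size=n)

# Likelihood ratio dP/dQ(y) = phi(y) / phi(y - c) = exp(-c*y + c^2/2),
# where phi denotes the standard normal density.
w = np.exp(-c * y + 0.5 * c**2)

# The weighted empirical measure evaluated on the indicator of {y > c}
# gives the importance sampling estimate of the tail probability.
p_hat = np.mean(w * (y > c))

# Exact tail probability for comparison, via the complementary error function.
p_true = 0.5 * erfc(c / sqrt(2.0))
```

Under plain Monte Carlo the event {X > 4} occurs only about three times per 100,000 samples, whereas the shifted proposal places roughly half its samples in the target set, giving a much smaller relative error at the same cost. Quantifying this kind of efficiency gain is exactly what the large deviation analysis in the thesis makes precise.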

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2013, p. iv, 20
Series
Trita-MAT, ISSN 1401-2286; 13:01
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-117810
ISBN: 978-91-7501-644-3 (print)
OAI: oai:DiVA.org:kth-117810
DiVA, id: diva2:603126
Presentation
2013-02-25, 3721, Lindstedtsvägen 25, KTH, Stockholm, 15:15 (English)
Note

QC 20130205

Available from: 2013-02-05. Created: 2013-02-05. Last updated: 2022-06-24. Bibliographically approved.
List of papers
1. Moderate deviation principles for importance sampling estimators of risk measures
2017 (English). In: Journal of Applied Probability, ISSN 0021-9002, E-ISSN 1475-6072. Article in journal (Refereed). Accepted.
Abstract [en]

Importance sampling has become an important tool for the computation of tail-based risk measures. Since such quantities are often determined mainly by rare events, standard Monte Carlo can be inefficient and importance sampling provides a way to speed up computations. This paper considers moderate deviations for the weighted empirical process, the process analogue of the weighted empirical measure, arising in importance sampling. The moderate deviation principle is established as an extension of existing results. Using a delta method for large deviations established by Gao and Zhao (Ann. Statist., 2011) together with classical large deviation techniques, the moderate deviation principle for the weighted empirical process is extended to functionals of the weighted empirical process which correspond to risk measures. The main results are moderate deviation principles for importance sampling estimators of the quantile function of a distribution and Expected Shortfall.
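As a concrete, hypothetical illustration of such estimators, the sketch below builds the weighted empirical tail from importance sampling output and inverts it to estimate a high quantile (Value-at-Risk) and Expected Shortfall of a standard normal distribution. The quantile level, proposal shift, and sample size are assumptions made for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

p = 0.9999     # quantile level for VaR / Expected Shortfall
mu = 3.7       # proposal shift, chosen near the quantile being estimated
n = 200_000    # number of samples

# Sample from the proposal N(mu, 1); the target distribution is N(0, 1).
y = rng.normal(loc=mu, scale=1.0, size=n)

# Likelihood ratio dN(0,1)/dN(mu,1) evaluated at the samples.
w = np.exp(-mu * y + 0.5 * mu**2)

# Weighted empirical tail: 1 - F_hat(x) = (1/n) * sum of w_i * 1{y_i > x}.
order = np.argsort(y)[::-1]           # samples sorted from largest to smallest
w_sorted = w[order]
tail_mass = np.cumsum(w_sorted) / n   # running estimate of P(X > y_(k))

# VaR estimate: smallest sorted sample whose estimated tail mass reaches 1 - p.
k = int(np.searchsorted(tail_mass, 1.0 - p))
var_hat = float(y[order][k])

# Expected Shortfall estimate: weighted average of the samples beyond the VaR.
tail = y[order][: k + 1]
es_hat = float(np.sum(w_sorted[: k + 1] * tail) / np.sum(w_sorted[: k + 1]))
```

For N(0, 1) the true 0.9999-quantile is about 3.719 and the corresponding Expected Shortfall about 3.96, so both estimates can be checked directly; the moderate deviation principles in the paper describe the decay of the probability that such estimators deviate from the true values.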

Keywords
Large deviations, moderate deviations, empirical processes, importance sampling, risk measures
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:kth:diva-117808 (URN)
Note

QCR 20161219

Available from: 2013-02-05. Created: 2013-02-05. Last updated: 2024-03-15. Bibliographically approved.
2. Large deviations for weighted empirical measures arising in importance sampling
2016 (English). In: Stochastic Processes and their Applications, ISSN 0304-4149, E-ISSN 1879-209X, Vol. 126, no. 1. Article in journal (Refereed). Published.
Abstract [en]

Importance sampling is a popular method for efficient computation of various properties of a distribution, such as probabilities, expectations, and quantiles. The output of an importance sampling algorithm can be represented as a weighted empirical measure, where the weights are given by the likelihood ratio between the original distribution and the sampling distribution. In this paper the efficiency of an importance sampling algorithm is studied by means of large deviations for the weighted empirical measure. The main result, which is stated as a Laplace principle for the weighted empirical measure arising in importance sampling, can be viewed as a weighted version of Sanov's theorem. The main theorem is applied to quantify the performance of an importance sampling algorithm over a collection of subsets of a given target set as well as quantile estimates. The proof of the main theorem relies on the weak convergence approach to large deviations developed by Dupuis and Ellis.

Place, publisher, year, edition, pages
Elsevier, 2016
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:kth:diva-117805 (URN)
10.1016/j.spa.2015.08.002 (DOI)
000366535500006 ()
2-s2.0-84948440031 (Scopus ID)
Note

QC 20160115

Available from: 2013-02-05. Created: 2013-02-05. Last updated: 2024-03-15. Bibliographically approved.

Open Access in DiVA

fulltext (201 kB)
File information
File name: FULLTEXT01.pdf
File size: 201 kB
Checksum (SHA-512): d7d07cd3bb01ea6413f52d0e1e99ccbe4738e70ff03733c7300c574d79b61e5790e607c925deac3138865af18a96a0144f7e94d9b5f86fb3c7392c8faed8772e
Type: fulltext
Mimetype: application/pdf

By author/editor
Nyquist, Pierre
