Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods
Olsson, Jimmy. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. ORCID iD: 0000-0003-0772-846X
Pavlenko, Tatjana. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. ORCID iD: 0000-0002-0633-5579
Rios, Felix Leopoldo. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. ORCID iD: 0000-0002-6886-5436
2019 (English). In: Electronic Journal of Statistics, ISSN 1935-7524, E-ISSN 1935-7524, Vol. 13, no. 2, p. 2865-2897. Article in journal (Refereed). Published.
Abstract [en]

We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks a natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle MCMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of the asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.

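The abstract outlines a sequential Monte Carlo (SMC) construction: a flow of distributions over spaces of increasing dimension, with particles extended, reweighted, and resampled at each step. As an illustration only, the sketch below shows a generic bootstrap-style SMC skeleton in Python. It is not the authors' junction-tree sampler and contains no particle Gibbs or backward-kernel refreshment step; the functions init, extend, and log_weight are hypothetical placeholders that a concrete model (for instance, one over junction trees) would have to supply.

import math
import random

def smc(n_particles, n_steps, init, extend, log_weight):
    # Propagate a weighted particle system through a sequence of
    # targets of increasing dimension.
    #   init()        -- draw an initial particle
    #   extend(x)     -- randomly extend particle x by one dimension
    #   log_weight(x) -- log incremental importance weight of x
    particles = [init() for _ in range(n_particles)]
    for _ in range(n_steps):
        # Extend every particle and compute incremental log-weights.
        particles = [extend(x) for x in particles]
        logw = [log_weight(x) for x in particles]
        # Normalise the weights (log-sum-exp for numerical stability).
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        total = sum(w)
        probs = [wi / total for wi in w]
        # Multinomial resampling: duplicate high-weight particles.
        particles = random.choices(particles, weights=probs, k=n_particles)
    return particles

# Toy usage: particles are growing 0/1 tuples, reweighted toward a
# trailing 1 at every step, so the final fraction of 1s exceeds 0.5.
out = smc(
    n_particles=500,
    n_steps=10,
    init=lambda: (),
    extend=lambda x: x + (random.randint(0, 1),),
    log_weight=lambda x: 0.5 * x[-1],
)
print(sum(x[-1] for x in out) / len(out))

In the paper's setting the particles would be junction trees rather than tuples, and particle Gibbs wraps such an SMC pass inside an MCMC iteration with one retained reference trajectory; see the article for the actual construction and the refreshment step.
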
Place, publisher, year, edition, pages
Institute of Mathematical Statistics, 2019. Vol. 13, no. 2, p. 2865-2897
Keywords [en]
Decomposable graphical models, Particle Gibbs, Sequential sampling, Structure learning
National Category
Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-268610
DOI: 10.1214/19-EJS1585
ISI: 000505695800015
Scopus ID: 2-s2.0-85073362032
OAI: oai:DiVA.org:kth-268610
DiVA, id: diva2:1428233
Funder
Swedish Research Council, C0595201, 2018-05230
Note

QC 20200505

Available from: 2020-05-05. Created: 2020-05-05. Last updated: 2020-05-05. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records BETA

Olsson, Jimmy; Pavlenko, Tatjana; Rios, Felix Leopoldo
