Bayesian model selection for change point detection and clustering
KTH, School of Electrical Engineering and Computer Science (EECS), Automatic Control.
KTH, School of Electrical Engineering and Computer Science (EECS), Automatic Control. ORCID iD: 0000-0003-0355-2663
KTH, School of Electrical Engineering and Computer Science (EECS), Network and Systems engineering. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, ACCESS Linnaeus Centre. ORCID iD: 0000-0001-9810-3478
KTH, School of Electrical Engineering and Computer Science (EECS), Electric Power and Energy Systems.
2018 (English). In: 35th International Conference on Machine Learning, ICML 2018, International Machine Learning Society (IMLS), 2018, p. 5497-5520. Conference paper, Published paper (Refereed)
Abstract [en]

We address a generalization of change point detection whose purpose is to detect both the change locations and the levels of clusters of a piecewise constant signal. Our approach is to model it as a nonparametric penalized least squares model selection problem over a family of models indexed by the collection of partitions of the design points, and to propose a computationally efficient algorithm that approximately solves it. Statistically, minimizing such a penalized criterion yields an approximation to the maximum a posteriori (MAP) estimator. The criterion is then analyzed and an oracle inequality is derived using a Gaussian concentration inequality. The oracle inequality is used to derive, on the one hand, conditions for consistency and, on the other, an adaptive upper bound on the expected square risk of the estimator, which statistically motivates our approximation. Finally, we apply our algorithm to simulated data to experimentally validate the statistical guarantees and illustrate its behavior.
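The penalized least squares criterion described in the abstract can be illustrated with a generic dynamic-programming sketch for segmenting a piecewise constant signal. This is not the authors' clustering algorithm (which additionally groups segment levels and uses a MAP-derived penalty); the function names and the constant per-segment penalty here are illustrative assumptions:

```python
import numpy as np

def segment_cost(prefix, prefix_sq, i, j):
    # Residual sum of squares of y[i:j] around its own mean,
    # computed in O(1) from prefix sums.
    n = j - i
    s = prefix[j] - prefix[i]
    sq = prefix_sq[j] - prefix_sq[i]
    return sq - s * s / n

def penalized_least_squares(y, penalty):
    """Exact DP: minimize total RSS + penalty * (number of segments).

    Returns the list of change point indices (start of each new segment).
    """
    n = len(y)
    prefix = np.concatenate(([0.0], np.cumsum(y)))
    prefix_sq = np.concatenate(([0.0], np.cumsum(np.square(y))))
    best = np.full(n + 1, np.inf)   # best[j]: optimal cost of y[:j]
    best[0] = 0.0
    back = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + segment_cost(prefix, prefix_sq, i, j) + penalty
            if c < best[j]:
                best[j] = c
                back[j] = i
    # Recover change points by backtracking through the optimal partition.
    cps = []
    j = n
    while j > 0:
        i = int(back[j])
        if i > 0:
            cps.append(i)
        j = i
    return sorted(cps)

# Example: one level shift at index 20 is recovered exactly.
y = np.array([0.0] * 20 + [5.0] * 20)
print(penalized_least_squares(y, penalty=1.0))  # [20]
```

The O(n^2) double loop is exact; the paper's contribution is precisely an efficient approximation to this kind of exhaustive search, with the penalty chosen so that the minimizer approximates the MAP estimator.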

Place, publisher, year, edition, pages
International Machine Learning Society (IMLS), 2018, p. 5497-5520
Keywords [en]
Artificial intelligence, Bayesian networks, Least squares approximations, Probability distributions, Bayesian model selection, Change point detection, Computationally efficient, Concentration inequality, Maximum A posteriori probabilities, Penalized least-squares, Piece-wise constants, Statistical guarantee, Learning systems
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-247470
Scopus ID: 2-s2.0-85057285830
ISBN: 9781510867963 (print)
OAI: oai:DiVA.org:kth-247470
DiVA, id: diva2:1302686
Conference
35th International Conference on Machine Learning, ICML 2018, 10 July 2018 through 15 July 2018
Note

QC20190405

Available from: 2019-04-05. Created: 2019-04-05. Last updated: 2019-04-05. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Scopus

Authority records

Mazhar, Othmane; Rojas, Cristian R.; Fischione, Carlo; Hesamzadeh, Mohammad Reza
