Bayesian model selection for change point detection and clustering
KTH, School of Electrical Engineering and Computer Science (EECS), Automatic Control.
KTH, School of Electrical Engineering and Computer Science (EECS), Automatic Control. ORCID iD: 0000-0003-0355-2663
KTH, School of Electrical Engineering and Computer Science (EECS), Network and Systems Engineering. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, ACCESS Linnaeus Centre. ORCID iD: 0000-0001-9810-3478
KTH, School of Electrical Engineering and Computer Science (EECS), Electric Power Engineering.
2018 (English). In: 35th International Conference on Machine Learning, ICML 2018, International Machine Learning Society (IMLS), 2018, pp. 5497-5520. Conference paper, published paper (peer-reviewed)
Abstract [en]

We address a generalization of change point detection with the purpose of detecting the change locations and the levels of clusters of a piecewise constant signal. Our approach is to model it as a nonparametric penalized least squares model selection on a family of models indexed over the collection of partitions of the design points, and to propose a computationally efficient algorithm to approximately solve it. Statistically, minimizing such a penalized criterion yields an approximation to the maximum a posteriori probability (MAP) estimator. The criterion is then analyzed and an oracle inequality is derived using a Gaussian concentration inequality. The oracle inequality is used to derive, on the one hand, conditions for consistency and, on the other hand, an adaptive upper bound on the expected square risk of the estimator, which statistically motivates our approximation. Finally, we apply our algorithm to simulated data to experimentally validate the statistical guarantees and illustrate its behavior.
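To make the penalized least-squares idea concrete, here is a minimal sketch of the classic segmentation baseline the abstract builds on: fit each contiguous segment by its mean and minimize the residual sum of squares plus a penalty per segment, solved exactly by an O(n²) dynamic program. This is not the paper's algorithm (which additionally clusters segment levels and derives a specific penalty from the MAP criterion); the constant per-segment penalty here is an assumption for illustration.

```python
import numpy as np

def detect_change_points(y, penalty):
    """Exact penalized least-squares segmentation of a 1-D signal.

    Minimizes (sum of per-segment squared errors) + penalty * (#segments)
    over all partitions of y into contiguous segments, each fitted by its
    mean. Returns the indices where a new segment begins (excluding 0).
    """
    n = len(y)
    y = np.asarray(y, dtype=float)
    # Prefix sums give the residual sum of squares of any segment in O(1):
    # RSS(i, j) = sum(y[i:j]**2) - (sum(y[i:j]))**2 / (j - i).
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

    def rss(i, j):
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    best = np.full(n + 1, np.inf)      # best[j]: optimal cost of y[:j]
    best[0] = 0.0
    prev = np.zeros(n + 1, dtype=int)  # prev[j]: start of the last segment
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + rss(i, j) + penalty
            if c < best[j]:
                best[j] = c
                prev[j] = i
    # Backtrack the optimal segment boundaries.
    cps, j = [], n
    while j > 0:
        j = prev[j]
        cps.append(j)
    return [int(c) for c in sorted(cps)[1:]]  # drop the boundary at 0
```

On a noiseless step signal of 50 zeros followed by 50 fives, `detect_change_points(y, 1.0)` recovers the single change at index 50; raising the penalty trades detection sensitivity for fewer segments, which is exactly the bias the paper's oracle inequality quantifies.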

Place, publisher, year, edition, pages
International Machine Learning Society (IMLS), 2018. pp. 5497-5520
Keywords [en]
Artificial intelligence, Bayesian networks, Least squares approximations, Probability distributions, Bayesian model selection, Change point detection, Computationally efficient, Concentration inequality, Maximum A posteriori probabilities, Penalized least-squares, Piece-wise constants, Statistical guarantee, Learning systems
HSV category
Identifiers
URN: urn:nbn:se:kth:diva-247470
Scopus ID: 2-s2.0-85057285830
ISBN: 9781510867963 (print)
OAI: oai:DiVA.org:kth-247470
DiVA, id: diva2:1302686
Conference
35th International Conference on Machine Learning, ICML 2018, 10 July 2018 through 15 July 2018
Note

QC20190405

Available from: 2019-04-05. Created: 2019-04-05. Last updated: 2019-04-05. Bibliographically checked.

Open Access in DiVA

Full text not available in DiVA

Scopus

Authors

Mazhar, Othmane; Rojas, Cristian R.; Fischione, Carlo; Hesamzadeh, Mohammad Reza
