Inductive Inference and Partition Exchangeability in Classification
University of Helsinki.
University of Helsinki.
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics (Computational Biostatistics). ORCID iD: 0000-0003-1489-8512
2013 (English). In: Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence: Papers from the Ray Solomonoff 85th Memorial Conference / [ed] Dowe, David L., Springer Berlin/Heidelberg, 2013, 91-105 p. Conference paper, Published paper (Refereed)
Abstract [en]

Inductive inference has been a subject of intensive research efforts over several decades. In particular, substantial advances have been made for classification problems, and the field has matured into a wide range of powerful approaches to inductive inference. However, a considerable challenge arises when deriving principles for an inductive supervised classifier in the presence of unpredictable or unanticipated events corresponding to unknown alphabets of observable features. Bayesian inductive theories based on de Finetti-type exchangeability, which have become popular in supervised classification, do not apply to such problems. Here we derive an inductive supervised classifier based on partition exchangeability due to John Kingman. It is proven that, in contrast to classifiers based on de Finetti-type exchangeability, which can optimally handle test items independently of each other given an infinite amount of training data, a classifier based on partition exchangeability continues to benefit from a joint prediction of labels for the whole population of test items. Some remarks about the relation of this work to generic convergence results in predictive inference are also given.
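
The record does not reproduce the paper's classifier, so the sketch below is only a rough, illustrative example of what a partition-exchangeable predictive rule looks like: the Ewens-sampling-formula (Chinese-restaurant-style) predictive, the canonical example of Kingman's partition exchangeability. The function name and the dispersion parameter `psi` are hypothetical choices for illustration and are not taken from the paper; the point is simply that, unlike a model tied to a fixed known alphabet, this rule reserves positive predictive mass for previously unseen feature values.

```python
# Illustrative sketch only; not the paper's classifier.
# Canonical partition-exchangeable predictive (Ewens sampling formula style),
# with a hypothetical dispersion parameter `psi`.
from collections import Counter


def predictive_probability(observed, value, psi=1.0):
    """P(next observation = value | observed) under partition exchangeability.

    A previously seen value with count n_j gets probability n_j / (n + psi);
    all previously unseen values jointly get probability psi / (n + psi).
    """
    counts = Counter(observed)
    n = len(observed)
    if value in counts:
        return counts[value] / (n + psi)
    # Mass reserved collectively for new, unanticipated values.
    return psi / (n + psi)


# Example: symbols never seen in training still receive positive mass.
sample = ["a", "a", "b", "c", "a"]
print(predictive_probability(sample, "a"))           # 3 / (5 + psi)
print(predictive_probability(sample, "never_seen"))  # psi / (5 + psi)
```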

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2013. 91-105 p.
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 7070
Keyword [en]
Bayesian learning, classification, exchangeability, inductive inference
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-137054
Scopus ID: 2-s2.0-84893200464
ISBN: 978-3-642-44957-4 (print)
ISBN: 978-3-642-44958-1 (print)
OAI: oai:DiVA.org:kth-137054
DiVA: diva2:677782
Conference
Ray Solomonoff 85th Memorial Conference on Algorithmic Probability and Friends: Bayesian Prediction and Artificial Intelligence; Melbourne, VIC; Australia; 30 November 2011 through 2 December 2011
Funder
Swedish Research Council, 90583401
Note

QC 20140214

Available from: 2013-12-10 Created: 2013-12-10 Last updated: 2017-04-28. Bibliographically approved

Open Access in DiVA

No full text

Scopus

Authority records

Koski, Timo
