Regression conformal prediction with random forests
Högskolan i Borås, Institutionen Handels- och IT-högskolan, Sweden.
2014 (English). In: Machine Learning, ISSN 0885-6125, E-ISSN 1573-0565, Vol. 97, no. 1-2, p. 155-176. Article in journal (Refereed). Published.
Abstract [en]

Regression conformal prediction produces prediction intervals that are valid, i.e., the probability of excluding the correct target value is bounded by a predefined confidence level. The most important criterion when comparing conformal regressors is efficiency; the prediction intervals should be as tight (informative) as possible. In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors. In addition to their robust predictive performance, random forests allow for determining the size of the prediction intervals by using out-of-bag estimates instead of requiring a separate calibration set. An extensive empirical investigation, using 33 publicly available data sets, was undertaken to compare the use of random forests to existing state-of-the-art conformal predictors. The results show that the suggested approach, at almost all confidence levels and using both standard and normalized nonconformity functions, produced significantly more efficient conformal predictors than the existing alternatives.
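The idea summarized in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it fits a scikit-learn random forest, uses the absolute out-of-bag residuals as standard (non-normalized) nonconformity scores, and turns their empirical quantile into a constant-width prediction interval. The function name, hyperparameters, and confidence level are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_conformal_intervals(X_train, y_train, X_test, confidence=0.95):
    """Return prediction intervals for X_test, calibrated on out-of-bag residuals.

    A sketch of regression conformal prediction with a random forest as the
    underlying model; intervals are valid at the given confidence level under
    the usual exchangeability assumption.
    """
    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X_train, y_train)

    # Standard (non-normalized) nonconformity scores: absolute out-of-bag
    # residuals. Using OOB predictions avoids holding out a calibration set.
    oob_scores = np.abs(y_train - rf.oob_prediction_)

    # The interval half-width is the appropriate calibration-score quantile;
    # the (n + 1) correction preserves the finite-sample coverage guarantee.
    n = len(oob_scores)
    k = int(np.ceil(confidence * (n + 1)))
    half_width = np.sort(oob_scores)[min(k, n) - 1]

    preds = rf.predict(X_test)
    return np.column_stack([preds - half_width, preds + half_width])
```

The normalized nonconformity functions mentioned in the abstract additionally scale each residual by a per-example difficulty estimate, so intervals widen for hard examples and tighten for easy ones; the sketch above covers only the standard case.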

Place, publisher, year, edition, pages
Springer-Verlag New York, 2014. Vol. 97, no 1-2, p. 155-176
Keywords [en]
Conformal prediction, Random forests, Regression, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-221527
DOI: 10.1007/s10994-014-5453-0
Scopus ID: 2-s2.0-84906946396
OAI: oai:DiVA.org:kth-221527
DiVA id: diva2:1175274
Funder
Knowledge Foundation, 20120192
Swedish Foundation for Strategic Research, IIS11-0053
Note

Sponsorship: This work was supported by the Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection (IIS11-0053) and the Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning (20120192).

QC 20180119

Available from: 2018-01-17. Created: 2018-01-17. Last updated: 2018-01-19. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus
