kth.se Publications
Werner, Hugo
Publications (2 of 2)
Werner, H., Carlsson, L., Ahlberg, E. & Boström, H. (2021). Evaluation of Updating Strategies for Conformal Predictive Systems in the Presence of Extreme Events. In: Proceedings of the 10th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2021. Paper presented at the 10th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2021, Virtual/Online, Sep 8-10, 2021 (pp. 229-242). ML Research Press.
Evaluation of Updating Strategies for Conformal Predictive Systems in the Presence of Extreme Events
2021 (English). In: Proceedings of the 10th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2021, ML Research Press, 2021, pp. 229-242. Conference paper, published paper (refereed).
Abstract [en]

Six different strategies for updating split conformal predictive systems in an online (streaming) setting are evaluated. The updating strategies vary in the extent and frequency of retraining, as well as in how training data is split into proper training and calibration sets. An empirical evaluation is presented, considering passenger booking data from a ferry company spanning several years. The passenger volumes changed drastically during 2020 due to COVID-19, and part of the evaluation focuses on which updating strategies work best under such circumstances. Some strategies are observed to outperform others with respect to continuous ranked probability score and validity, highlighting the potential value of choosing a proper strategy.
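The paper's six updating strategies are not reproduced in the abstract. Purely as an illustrative sketch (not the authors' code; all names are hypothetical), a split conformal predictive system combined with one naive updating strategy, refitting on a sliding window of the stream and re-splitting it into proper training and calibration parts, might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class SplitCPS:
    """Split conformal predictive system: the predictive CDF at a test
    point is the empirical distribution of signed calibration residuals,
    shifted by the underlying model's point prediction."""

    def __init__(self, model):
        self.model = model

    def fit(self, X_proper, y_proper, X_cal, y_cal):
        # The proper training set fits the model; the calibration set
        # supplies the residuals defining the predictive distribution.
        self.model.fit(X_proper, y_proper)
        self.residuals = np.sort(y_cal - self.model.predict(X_cal))
        return self

    def cdf(self, x, y):
        """Estimated P(Y <= y | x)."""
        mu = self.model.predict(np.asarray(x).reshape(1, -1))[0]
        rank = np.searchsorted(self.residuals, y - mu, side="right")
        return rank / (len(self.residuals) + 1)

def sliding_window_update(X_stream, y_stream, t, window=100, cal_frac=0.25):
    """One simple updating strategy: at time t, refit everything on the
    most recent `window` observations, re-splitting them into proper
    training and calibration sets."""
    lo = max(0, t - window)
    Xw, yw = X_stream[lo:t], y_stream[lo:t]
    n_cal = int(len(Xw) * cal_frac)
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    return SplitCPS(model).fit(Xw[:-n_cal], yw[:-n_cal],
                               Xw[-n_cal:], yw[-n_cal:])
```

Strategies evaluated in work of this kind typically trade off how often such a refit is triggered against how the window and the split sizes are chosen; under drastic drift (as with the 2020 booking data), shorter windows recalibrate faster at the cost of coarser predictive distributions.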

Place, publisher, year, edition, pages
ML Research Press, 2021
Keywords
concept drift, Conformal predictive distributions, Split conformal predictive systems
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:kth:diva-338641 (URN)2-s2.0-85160471134 (Scopus ID)
Conference
10th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2021, Virtual/Online, Sep 8-10, 2021
Note

QC 20231024

Available from: 2023-11-15. Created: 2023-11-15. Last updated: 2023-11-15. Bibliographically approved.
Werner, H., Carlsson, L., Ahlberg, E. & Boström, H. (2020). Evaluating Different Approaches to Calibrating Conformal Predictive Systems. In: Proceedings of the 9th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2020. Paper presented at the 9th Symposium on Conformal and Probabilistic Predictions with Applications, COPA 2020, Virtual/Online, Italy, Sep 9-11, 2020 (pp. 134-150). ML Research Press.
Evaluating Different Approaches to Calibrating Conformal Predictive Systems
2020 (English). In: Proceedings of the 9th Symposium on Conformal and Probabilistic Prediction and Applications, COPA 2020, ML Research Press, 2020, pp. 134-150. Conference paper, published paper (refereed).
Abstract [en]

Conformal predictive systems (CPSs) provide probability distributions for real-valued labels of test examples, rather than point predictions (as output by regular regression models) or confidence intervals (as output by conformal regressors). The performance of a CPS is dependent on both the underlying model and the way in which the quality of its predictions is estimated; a stronger underlying model and a better quality estimation can significantly improve the performance. Recent studies have shown that conformal regressors that use random forests as the underlying model may benefit from using out-of-bag predictions for the calibration, rather than setting aside a separate calibration set, allowing for more data to be used for training and thereby improving the performance of the underlying model. These studies have furthermore shown that the quality of the individual predictions can be effectively estimated using the variance of the predictions or by k-nearest-neighbor models trained on the prediction errors. It is here investigated whether these methods are also effective in the context of split conformal predictive systems. Results from a large empirical study are presented, using 33 publicly available datasets. The results show that by using either variance or the k-nearest-neighbor method for estimating prediction quality, a significant increase in performance, as measured by the continuous ranked probability score, can be obtained compared to omitting the quality estimation. The results furthermore show that the use of out-of-bag examples for calibration is competitive with the most effective way of splitting training data into a proper training set and a calibration set, without requiring tuning of the calibration set size.
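As a rough, unofficial sketch of the two calibration ideas in the abstract — out-of-bag calibration, so that all training data is used for the underlying random forest, and a difficulty estimate used to normalize the conformity scores — the following uses the spread of the individual trees' predictions as the variance-style difficulty proxy (the kNN-on-errors variant is not shown; identifiers are illustrative, not from the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def oob_normalized_cps(X, y, beta=0.01, random_state=0):
    """Fit a random forest on ALL data and calibrate a split-CPS-style
    predictive distribution on its out-of-bag predictions, normalizing
    residuals by a per-example difficulty estimate."""
    rf = RandomForestRegressor(n_estimators=200, oob_score=True,
                               random_state=random_state)
    rf.fit(X, y)
    # Difficulty estimate: std of the individual trees' predictions
    # per example (beta guards against division by zero).
    tree_preds = np.stack([t.predict(X) for t in rf.estimators_])
    sigma = tree_preds.std(axis=0) + beta
    # Normalized signed residuals against the out-of-bag predictions,
    # so no separate calibration set needs to be held out.
    scores = np.sort((y - rf.oob_prediction_) / sigma)

    def cdf(x, y_val):
        """Estimated P(Y <= y_val | x)."""
        x = np.asarray(x).reshape(1, -1)
        mu = rf.predict(x)[0]
        s = np.stack([t.predict(x) for t in rf.estimators_]).std() + beta
        rank = np.searchsorted(scores, (y_val - mu) / s, side="right")
        return rank / (len(scores) + 1)

    return rf, cdf
```

The normalization makes the resulting predictive distributions wider for hard examples and tighter for easy ones, which is what drives the CRPS improvement over the unnormalized baseline that the abstract reports.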

Place, publisher, year, edition, pages
ML Research Press, 2020
Keywords
Conformal predictive distributions, Quality estimation, Random forest, Regression, Split conformal predictive systems
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-350335 (URN)2-s2.0-85126598294 (Scopus ID)
Conference
9th Symposium on Conformal and Probabilistic Predictions with Applications, COPA 2020, Virtual/Online, Italy, Sep 9-11, 2020
Note

QC 20240711

Available from: 2024-07-11. Created: 2024-07-11. Last updated: 2024-07-11. Bibliographically approved.