---
title: Coverage vs Acceptance-Error Curves for Conformal Classification Models
abstract: In this paper, we introduce coverage vs acceptance-error graphs as a visualization tool for comparing the performance of conformal predictors at a given significance level $\epsilon$ for any $k$-class classification task with $k \geq 2$. We show that by plotting the performance of each predictor for significance levels $\epsilon \in [0, 1]$, we obtain a coverage vs acceptance-error curve for that predictor. The area under this curve represents the probability that the p-value of the randomly chosen true class-label of any test instance is greater than the p-value of any false class-label for the same or any other test instance. This area can be used as a metric of the predictive efficiency of a conformal predictor once its validity has been established. The new metric is unique in that it is related to the empirical coverage rate, and extensive experiments confirmed its utility and its difference from existing predictive efficiency criteria.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: smirnov23a
month: 0
tex_title: Coverage vs Acceptance-Error Curves for Conformal Classification Models
firstpage: 534
lastpage: 545
page: 534-545
order: 534
cycles: false
bibtex_author: Smirnov, Evgueni
author:
- given: Evgueni
  family: Smirnov
date: 2023-08-17
address:
container-title: Proceedings of the Twelfth Symposium on Conformal and Probabilistic Prediction with Applications
volume: 204
genre: inproceedings
issued:
  date-parts:
  - 2023
  - 8
  - 17
pdf:
extras:
---
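A minimal sketch of how the area described in the abstract could be estimated from conformal p-values. It assumes a matrix of p-values with one column per class and counts ties as one half; the function name, array layout, and tie handling are illustrative assumptions, not the paper's own implementation.

```python
import numpy as np

def coverage_acceptance_error_auc(p_values, y_true):
    """Estimate the area under a coverage vs acceptance-error curve.

    Following the abstract's interpretation, this is the probability that
    the p-value of the true class-label of a test instance exceeds the
    p-value of a false class-label (from the same or any other instance).

    p_values : (n, k) array of conformal p-values, one column per class.
    y_true   : (n,) array of true class indices.
    """
    n, k = p_values.shape
    mask = np.zeros_like(p_values, dtype=bool)
    mask[np.arange(n), y_true] = True
    true_p = p_values[mask]       # n p-values of the true labels
    false_p = p_values[~mask]     # n*(k-1) p-values of the false labels
    # Pairwise comparisons; ties counted as 1/2 (an assumption, not from the paper).
    greater = (true_p[:, None] > false_p[None, :]).mean()
    ties = (true_p[:, None] == false_p[None, :]).mean()
    return greater + 0.5 * ties

# Example with random p-values for a 3-class task (illustrative only).
rng = np.random.default_rng(0)
p = rng.uniform(size=(100, 3))
y = rng.integers(0, 3, size=100)
print(coverage_acceptance_error_auc(p, y))  # close to 0.5 for uninformative p-values
```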