Abstract
Conformal Prediction (CP) controls the prediction uncertainty of classification systems by producing a small prediction set that contains the true class with a predetermined probability. This is commonly done by defining a score based on the model's predictions and setting a threshold on this score using a validation set. In this study, we address CP calibration when only a calibration set with noisy labels is available. We show how to estimate the noise-free conformal threshold from the noisy labeled data, and we derive a finite-sample coverage guarantee for uniform noise that remains effective even in tasks with a large number of classes. We dub our approach Noise-Aware Conformal Prediction (NACP) and demonstrate its performance on several standard image classification datasets with a large number of classes.
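The abstract builds on standard split conformal prediction: a nonconformity score is computed on a held-out calibration set and a finite-sample-corrected quantile of those scores becomes the threshold that defines the prediction set. A minimal sketch of that baseline procedure (not the paper's noise-aware method; all function names are illustrative) might look like:

```python
import numpy as np

def calibrate_threshold(cal_probs, cal_labels, alpha=0.1):
    """Split conformal calibration for classification.

    cal_probs: (n, K) array of predicted class probabilities.
    cal_labels: (n,) array of true class indices.
    alpha: target miscoverage rate (coverage >= 1 - alpha).
    """
    n = len(cal_labels)
    # Nonconformity score: one minus the probability of the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha))/n.
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(scores, min(q, 1.0), method="higher")

def prediction_set(probs, threshold):
    """All classes whose nonconformity score stays within the threshold."""
    return np.where(1.0 - probs <= threshold)[0]
```

The paper's setting replaces the clean calibration labels above with noisy ones and estimates the noise-free threshold from them.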
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 82-95 |
| Number of pages | 14 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 266 |
| State | Published - 2025 |
| Event | 14th Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2025 - London, United Kingdom, 10-12 Sep 2025 |
Bibliographical note
Publisher Copyright: © 2025 C. Penso, J. Goldberger & E. Fetaya.