Conformal Prediction of Classifiers with Many Classes based on Noisy Labels

Research output: Contribution to journal · Conference article · peer-review

Abstract

Conformal Prediction (CP) controls the prediction uncertainty of classification systems by producing a small prediction set, ensuring a predetermined probability that the true class lies within this set. This is commonly done by defining a score based on the model predictions and setting a threshold on this score using a calibration set. In this study, we address the problem of CP calibration when we only have access to a calibration set with noisy labels. We show how to estimate the noise-free conformal threshold from the noisy labeled data. We derive a finite-sample coverage guarantee for uniform noise that remains effective even in tasks with a large number of classes. We dub our approach Noise-Aware Conformal Prediction (NACP). We illustrate the performance of the proposed method on several standard image classification datasets with a large number of classes.
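The calibration step described in the abstract (score the calibration examples, then threshold at an empirical quantile) can be sketched as standard split conformal prediction. This is a minimal illustration with synthetic softmax outputs; it does not reproduce the paper's noise-aware (NACP) correction, only the clean-label baseline it builds on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic softmax outputs for a 5-class classifier (illustrative stand-in
# for real model predictions on a calibration set).
n_cal, n_classes = 1000, 5
logits = rng.normal(size=(n_cal, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = rng.integers(0, n_classes, size=n_cal)

alpha = 0.1  # target miscoverage: sets should contain the true class ~90% of the time

# Nonconformity score: 1 minus the probability assigned to the true class.
scores = 1.0 - probs[np.arange(n_cal), labels]

# Conformal threshold: the ceil((n+1)(1-alpha))/n empirical quantile of the scores.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
threshold = np.quantile(scores, q_level, method="higher")

def prediction_set(p):
    """Return all classes whose score 1 - p[y] does not exceed the threshold."""
    return np.flatnonzero(1.0 - p <= threshold)
```

By construction of the quantile, at least a 1 - alpha fraction of calibration examples have their true class inside the resulting prediction set; the paper's contribution is estimating this threshold when the calibration labels are themselves noisy.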

Original language: English
Pages (from-to): 82-95
Number of pages: 14
Journal: Proceedings of Machine Learning Research
Volume: 266
State: Published - 2025
Event: 14th Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2025 - London, United Kingdom
Duration: 10 Sep 2025 - 12 Sep 2025

Bibliographical note

Publisher Copyright:
© 2025 C. Penso, J. Goldberger & E. Fetaya.

