Abstract
Calibrating neural networks is crucial in applications where decision making depends on the predicted probabilities. Modern neural networks are often poorly calibrated: they tend to report confidences that overestimate the accuracy actually achieved, producing misleading reliability estimates that corrupt the downstream decision policy. We show that the magnitude of the calibration error depends on the confidence predicted for each sample. This prediction-confidence calibration paradigm is then applied to temperature scaling: we describe an optimization method that finds a suitable temperature for each bin of the discretized prediction confidence. We report extensive experiments on a variety of image datasets and network architectures. Our approach achieves state-of-the-art calibration while guaranteeing that the classification accuracy is not altered.
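To make the idea concrete, the sketch below shows one way per-confidence-bin temperature scaling and the Expected Calibration Error (ECE) could be implemented. It is a minimal illustration under stated assumptions, not the authors' exact optimization procedure: the number of bins, the negative log-likelihood objective, and helper names such as `fit_per_bin_temperatures` are choices made for this example. Because dividing logits by a positive temperature never changes the arg-max class, the predicted labels, and hence the classification accuracy, are left unchanged, consistent with the guarantee stated in the abstract.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def expected_calibration_error(confidences, correct, n_bins=15):
    """ECE: per-bin gap between mean confidence and accuracy, weighted by bin size."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(confidences[in_bin].mean() - correct[in_bin].mean())
    return ece


def fit_per_bin_temperatures(logits, labels, n_bins=15):
    """Fit one temperature per confidence bin on a held-out validation set
    by minimizing the negative log-likelihood of the samples in that bin."""
    confidences = softmax(logits).max(axis=-1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_ids = np.clip(np.digitize(confidences, edges) - 1, 0, n_bins - 1)
    temperatures = np.ones(n_bins)
    for b in range(n_bins):
        mask = bin_ids == b
        if not mask.any():
            continue  # an empty bin keeps the identity temperature T = 1
        lg, y = logits[mask], labels[mask]

        def nll(t):
            probs = softmax(lg / t)
            return -np.log(probs[np.arange(len(y)), y] + 1e-12).mean()

        temperatures[b] = minimize_scalar(nll, bounds=(0.05, 10.0), method="bounded").x
    return edges, temperatures


def calibrate(logits, edges, temperatures):
    """Rescale test logits with the temperature of their confidence bin."""
    confidences = softmax(logits).max(axis=-1)
    bin_ids = np.clip(np.digitize(confidences, edges) - 1, 0, len(temperatures) - 1)
    return softmax(logits / temperatures[bin_ids, None])
```

In this sketch, one would call `edges, temps = fit_per_bin_temperatures(val_logits, val_labels)` on a validation split and then `calibrate(test_logits, edges, temps)` before measuring ECE on the test predictions.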
Original language | English |
---|---|
Title of host publication | 30th European Signal Processing Conference, EUSIPCO 2022 - Proceedings |
Publisher | European Signal Processing Conference, EUSIPCO |
Pages | 1586-1590 |
Number of pages | 5 |
ISBN (Electronic) | 9789082797091 |
State | Published - 2022 |
Event | 30th European Signal Processing Conference, EUSIPCO 2022 - Belgrade, Serbia
Duration | 29 Aug 2022 → 2 Sep 2022
Publication series
Name | European Signal Processing Conference |
---|---|
Volume | 2022-August |
ISSN (Print) | 2219-5491 |
Conference
Conference | 30th European Signal Processing Conference, EUSIPCO 2022 |
---|---|
Country/Territory | Serbia |
City | Belgrade |
Period | 29/08/22 → 2/09/22 |
Bibliographical note
Publisher Copyright: © 2022 European Signal Processing Conference, EUSIPCO. All rights reserved.
Funding
The research was partially supported by the Israeli Ministry of Science & Technology.
Funders | Funder number |
---|---|
Ministry of Science and Technology, Israel |
Keywords
- Expected Calibration Error (ECE)
- network calibration
- neural networks
- temperature scaling