Abstract
It is well known that modern neural networks are poorly calibrated: the probabilities they output tend to overestimate or underestimate the accuracy actually achieved. This yields misleading reliability estimates and corrupts downstream decision policies. We show that the amount of calibration error differs across classes, and we therefore propose to calibrate each class separately. We apply this class-level calibration paradigm to temperature scaling and describe an optimization method that finds a suitable temperature for each class. We report extensive experiments on a variety of image datasets and a wide variety of network architectures, and show that our approach achieves state-of-the-art calibration without compromising accuracy in almost all cases.
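The abstract does not include an implementation, so the sketch below is only a rough illustration of what per-class temperature scaling could look like, assuming held-out validation logits and labels are available. The function names (`fit_class_temperatures`, `class_wise_ece`), the grid-search optimization, and the class-wise ECE binning are assumptions made for this sketch and are not taken from the paper's method.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def class_wise_ece(probs, labels, cls, n_bins=15):
    """Expected calibration error restricted to samples predicted as `cls`
    (illustrative equal-width binning)."""
    pred = probs.argmax(axis=1)
    mask = pred == cls
    if not mask.any():
        return 0.0
    conf = probs[mask, cls]
    correct = (labels[mask] == cls).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(conf[in_bin].mean() - correct[in_bin].mean())
    return ece

def fit_class_temperatures(logits, labels, grid=np.linspace(0.5, 5.0, 46)):
    """Grid-search a separate temperature per class on validation data.
    This simple search is a stand-in for the optimization method the paper describes."""
    n_classes = logits.shape[1]
    temps = np.ones(n_classes)
    for c in range(n_classes):
        best_t, best_err = 1.0, np.inf
        for t in grid:
            scaled = logits.copy()
            scaled[:, c] = scaled[:, c] / t   # rescale only class c's logit
            err = class_wise_ece(softmax(scaled), labels, c)
            if err < best_err:
                best_t, best_err = t, err
        temps[c] = best_t
    return temps

# Hypothetical usage on validation/test splits:
# temps = fit_class_temperatures(val_logits, val_labels)
# calibrated_probs = softmax(test_logits / temps)  # per-class temperatures broadcast column-wise
```

Standard temperature scaling fits a single scalar T for the whole network; the sketch differs only in keeping one temperature per class, mirroring the class-level calibration idea in the abstract.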
Original language | English |
---|---|
Title of host publication | 29th European Signal Processing Conference, EUSIPCO 2021 - Proceedings |
Publisher | European Signal Processing Conference, EUSIPCO |
Pages | 1486-1490 |
Number of pages | 5 |
ISBN (Electronic) | 9789082797060 |
DOIs | |
State | Published - 2021 |
Event | 29th European Signal Processing Conference, EUSIPCO 2021 - Dublin, Ireland. Duration: 23 Aug 2021 → 27 Aug 2021 |
Publication series
Name | European Signal Processing Conference |
---|---|
Volume | 2021-August |
ISSN (Print) | 2219-5491 |
Conference
Conference | 29th European Signal Processing Conference, EUSIPCO 2021 |
---|---|
Country/Territory | Ireland |
City | Dublin |
Period | 23/08/21 → 27/08/21 |
Bibliographical note
Publisher Copyright: © 2021 European Signal Processing Conference. All rights reserved.
Funding
The research was partially supported by the Israeli Ministry of Science & Technology.
Funders | Funder number |
---|---|
Ministry of Science and Technology, Israel | |
Keywords
- Expected calibration error (ECE)
- Network calibration
- Neural networks
- Temperature scaling