Evaluating and Calibrating Uncertainty Prediction in Regression Tasks

Dan Levi, Liran Gispan, Niv Giladi, Ethan Fetaya

Research output: Contribution to journal › Article › peer-review

42 Scopus citations

Abstract

Predicting not only the target but also an accurate measure of uncertainty is important for many machine learning applications, and in particular, safety-critical ones. In this work, we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition of calibration for regression uncertainty has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that avoids this caveat, together with an evaluation method based on a simple histogram approach. Our method clusters examples with similar uncertainty predictions and compares the predicted uncertainty with the empirical uncertainty on these examples. We also propose a simple, scaling-based calibration method that performs as well as much more complex ones. We show results on both a synthetic, controlled problem and on the object detection bounding-box regression task using the COCO and KITTI datasets.
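To make the histogram-based evaluation and the scaling-based calibration concrete, the sketch below illustrates one plausible reading of the abstract: examples are binned by predicted standard deviation, the mean predicted variance per bin is compared with the empirical squared error, and a single multiplicative scale factor is fitted to the predicted standard deviations. This is a hedged illustration, not the authors' reference implementation; the function names, binning scheme, and closed-form scale fit are assumptions made for the example.

```python
import numpy as np

def uncertainty_reliability_histogram(y_true, y_pred, sigma_pred, n_bins=10):
    """Group examples with similar predicted uncertainty and compare predicted
    vs. empirical error per bin (a sketch of the histogram-based evaluation)."""
    sq_err = (y_true - y_pred) ** 2
    var_pred = sigma_pred ** 2
    # Equal-frequency bin edges over the predicted standard deviation.
    edges = np.quantile(sigma_pred, np.linspace(0.0, 1.0, n_bins + 1))
    bin_idx = np.clip(np.searchsorted(edges, sigma_pred, side="right") - 1,
                      0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = bin_idx == b
        if mask.any():
            # (mean predicted variance, empirical MSE, bin size)
            rows.append((var_pred[mask].mean(), sq_err[mask].mean(), int(mask.sum())))
    return rows

def fit_scalar_scaling(y_true, y_pred, sigma_pred):
    """Fit a single factor s so that (s * sigma_pred) is calibrated on a
    held-out set; this is the closed-form minimizer of the Gaussian NLL
    when only the predicted std is rescaled (an assumed variant of the
    scaling-based calibration described in the abstract)."""
    s_squared = np.mean((y_true - y_pred) ** 2 / sigma_pred ** 2)
    return np.sqrt(s_squared)
```

A calibrated predictor should yield per-bin rows where the mean predicted variance roughly matches the empirical MSE; applying the fitted scale as `sigma_calibrated = s * sigma_pred` is the simplest way to close a systematic gap between the two.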

Original language: English
Article number: 5540
Journal: Sensors
Volume: 22
Issue number: 15
DOIs
State: Published - 25 Jul 2022

Bibliographical note

Publisher Copyright:
© 2022 by the authors.

Keywords

  • prediction uncertainty
  • regression
