An efficient similarity measure based on approximations of KL-divergence between two Gaussian mixtures

J. Goldberger, Shiri Gordon, Hayit Greenspan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We present two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians. The first method is based on matching between the Gaussian elements of the two Gaussian mixture densities. The second method is based on the unscented transform. The proposed methods are utilized for image retrieval tasks. Continuous probabilistic image modeling based on mixtures of Gaussians, together with the KL measure for image similarity, can be used for image retrieval tasks with remarkable performance. The efficiency and performance of the proposed KL approximation methods are demonstrated on both simulated data and real image data sets. The experimental results indicate that our proposed approximations outperform previously suggested methods.
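The first method pairs each component of one mixture with a single component of the other and sums closed-form Gaussian KL terms. A minimal sketch of this matching idea is given below; the function names and the exact matching cost are illustrative assumptions, not the paper's verbatim formulation:

```python
import numpy as np

def kl_gauss(m1, S1, m2, S2):
    """Closed-form KL divergence between two multivariate Gaussians
    N(m1, S1) and N(m2, S2)."""
    d = m1.shape[0]
    S2_inv = np.linalg.inv(S2)
    diff = m2 - m1
    return 0.5 * (np.log(np.linalg.det(S2) / np.linalg.det(S1))
                  + np.trace(S2_inv @ S1)
                  + diff @ S2_inv @ diff
                  - d)

def kl_mixture_matching(w_f, mu_f, cov_f, w_g, mu_g, cov_g):
    """Matching-based approximation of KL(f || g) for two Gaussian
    mixtures f and g, given as lists of weights, means, covariances.

    Each component f_i is matched to the component g_j minimizing
    KL(f_i || g_j) - log w_g[j]; the approximation is the weighted sum
    of the matched component KLs plus the log-ratio of the weights.
    """
    total = 0.0
    for wi, mi, Si in zip(w_f, mu_f, cov_f):
        costs = [kl_gauss(mi, Si, mj, Sj) - np.log(wj)
                 for wj, mj, Sj in zip(w_g, mu_g, cov_g)]
        j = int(np.argmin(costs))
        total += wi * (kl_gauss(mi, Si, mu_g[j], cov_g[j])
                       + np.log(wi / w_g[j]))
    return total
```

As a sanity check, the approximation is exactly zero when the two mixtures are identical, and grows as the component means of one mixture are shifted away from the other.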
Original language: American English
Title of host publication: Computer Vision, 2003. Proceedings. Ninth IEEE International Conference on
Publisher: IEEE
State: Published - 2003

Bibliographical note

Place of conference: France
