Abstract
This paper proposes a dissimilarity measure between two Gaussian mixture models (GMMs). Computing a distance between two GMMs learned from speech segments is a key element in speaker verification, speaker segmentation and many other related applications. A natural measure between two distributions is the Kullback-Leibler (KL) divergence; however, it cannot be computed analytically in the case of GMMs. We propose an accurate and efficiently computed approximation of the KL-divergence. The method is based on the unscented transform, which is usually used to obtain a better alternative to the extended Kalman filter. The suggested distance is evaluated in an experimental setup on a speaker data set. The experimental results indicate that our proposed approximations outperform previously suggested methods.
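The unscented-transform idea can be sketched briefly: each Gaussian component of the first mixture is replaced by 2d deterministic sigma points (d being the dimension), and the expectation of the log-density ratio is approximated by an equally weighted average over those points. The sketch below is a minimal illustration of this scheme, not the paper's implementation; the function names and the (weights, means, covariances) representation are illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_logpdf(x, weights, means, covs):
    """Log-density of a GMM at points x of shape (n, d)."""
    comps = np.stack([multivariate_normal.logpdf(x, m, c)
                      for m, c in zip(means, covs)], axis=0)   # (K, n)
    return np.logaddexp.reduce(np.log(weights)[:, None] + comps, axis=0)

def sigma_points(mean, cov):
    """2d sigma points of N(mean, cov): mean +/- columns of sqrt(d * cov)."""
    d = mean.shape[0]
    L = np.linalg.cholesky(d * cov)        # one valid matrix square root
    return np.concatenate([mean + L.T, mean - L.T], axis=0)  # (2d, d)

def unscented_kl(f, g):
    """Approximate KL(f || g) for GMMs f, g = (weights, means, covs).

    For each component of f, E[log f(x) - log g(x)] is approximated by
    an equally weighted average over that component's sigma points.
    """
    w, mus, covs = f
    total = 0.0
    for wk, mk, ck in zip(w, mus, covs):
        pts = sigma_points(mk, ck)
        total += wk * np.mean(gmm_logpdf(pts, *f) - gmm_logpdf(pts, *g))
    return total
```

Because the sigma points match the component's mean and covariance exactly, the approximation is exact whenever the log-density ratio is quadratic, e.g. for two single Gaussians with equal covariance, where the KL divergence has the closed form 0.5 (μ1 − μ2)ᵀ Σ⁻¹ (μ1 − μ2).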
| Original language | English |
|---|---|
| Pages | 1985-1988 |
| Number of pages | 4 |
| State | Published - 2005 |
| Event | 9th European Conference on Speech Communication and Technology - Lisbon, Portugal. Duration: 4 Sep 2005 → 8 Sep 2005 |
Conference
| Conference | 9th European Conference on Speech Communication and Technology |
|---|---|
| Country/Territory | Portugal |
| City | Lisbon |
| Period | 4/09/05 → 8/09/05 |