Abstract
The information-bottleneck (IB) principle is defined in terms of mutual information. This study defines mutual information between two random variables using the Jensen-Shannon (JS) divergence instead of the standard definition, which is based on the Kullback-Leibler (KL) divergence. We reformulate the information-bottleneck principle using the proposed mutual information and apply it to the problem of pairwise clustering. We show that applying IB to clustering tasks with JS divergence instead of KL yields improved results, indicating that JS-based mutual information has expressive power at least matching that of the standard KL-based mutual information.
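To make the substitution concrete: mutual information can be written as a divergence between the joint distribution p(x, y) and the product of its marginals p(x)p(y). With KL this recovers the standard definition; replacing KL with JS gives a JS-based variant in the spirit of the abstract. The sketch below illustrates that idea for discrete distributions; the function names and the toy joint distribution are illustrative assumptions, not taken from the paper, and the paper's exact formulation may differ.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D_KL(p || q) for discrete distributions,
    # using the convention 0 * log(0 / q) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js(p, q):
    # Jensen-Shannon divergence: a symmetrized, bounded smoothing of KL,
    # computed against the mixture m = (p + q) / 2.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def mutual_information(joint, divergence):
    # Mutual information as a divergence between the joint distribution
    # and the product of its marginals (flattened to vectors).
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    independent = np.outer(px, py)
    return divergence(joint.ravel(), independent.ravel())

# Toy joint distribution over two binary variables (hypothetical example).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print("KL-based MI:", mutual_information(joint, kl))  # standard definition
print("JS-based MI:", mutual_information(joint, js))  # JS substitution
```

Because JS is bounded and symmetric, the JS-based quantity stays finite even when the joint and the product of marginals do not share full support, which the KL-based definition does not guarantee.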
| Original language | English |
| --- | --- |
| Title of host publication | 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Proceedings |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 3507-3511 |
| Number of pages | 5 |
| ISBN (Electronic) | 9781479981311 |
| DOIs | |
| State | Published - May 2019 |
| Event | 44th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019, Brighton, United Kingdom. Duration: 12 May 2019 → 17 May 2019 |
Publication series
| Name | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
| --- | --- |
| Volume | 2019-May |
| ISSN (Print) | 1520-6149 |
Conference
| Conference | 44th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 |
| --- | --- |
| Country/Territory | United Kingdom |
| City | Brighton |
| Period | 12/05/19 → 17/05/19 |
Bibliographical note
Publisher Copyright: © 2019 IEEE.
Keywords
- Jensen-Shannon (JS) divergence
- information bottleneck
- pairwise clustering