Abstract
Shannon's capacity and rate-distortion function, combined with the separation principle, provide tight bounds for the minimum possible distortion in joint source-channel coding. These bounds, however, are usually achievable only in the limit of a large block length. In their 1973 paper, Ziv and Zakai introduced a family of alternative capacity and rate-distortion functions, based on functionals satisfying the data-processing inequality, which potentially give tighter bounds for systems with a small block length. There is considerable freedom in how to choose these functionals, and the ways of finding the best possible functionals yielding the best bounds for a given source-channel combination are not specified. We examine recently conjectured high-SNR asymptotic expressions for the Ziv-Zakai bounds, based on the Rényi-divergence functional. We derive nonasymptotic bounds on the Ziv-Zakai-Rényi rate-distortion function and capacity for a broad class of sources and additive noise channels, which hold for arbitrary SNR, and prove the conjectured asymptotic expressions in the limit of small distortion/high SNR. The results lead to new bounds on the best achievable distortion in finite-dimensional joint source-channel coding. Examples are presented where the new bounds achieve significant improvement upon Shannon's original bounds.
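The key requirement on the functionals in the Ziv-Zakai framework is the data-processing inequality, which the Rényi divergence satisfies. As a minimal illustrative sketch (the function names below are ours, not from the paper), the following computes the discrete Rényi divergence D_α(P‖Q) = (1/(α−1)) ln Σ_i p_i^α q_i^(1−α) and checks numerically that passing both distributions through a stochastic channel cannot increase it:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P || Q) of order alpha > 0, alpha != 1, in nats."""
    if alpha <= 0 or alpha == 1.0:
        raise ValueError("alpha must be positive and != 1")
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def push_through_channel(dist, W):
    """Output distribution after a discrete channel W, where W[x][y] = P(y|x)."""
    n_out = len(W[0])
    return [sum(dist[x] * W[x][y] for x in range(len(dist))) for y in range(n_out)]

p = [0.5, 0.5]
q = [0.25, 0.75]

# D_2(P||Q) = ln(sum_i p_i^2 / q_i) = ln(4/3)
d2 = renyi_divergence(p, q, 2.0)

# Data-processing inequality: an arbitrary binary channel cannot increase the divergence.
W = [[0.9, 0.1], [0.2, 0.8]]
d2_out = renyi_divergence(push_through_channel(p, W),
                          push_through_channel(q, W), 2.0)
assert d2_out <= d2
```

Any functional with this contraction property can in principle be plugged into the Ziv-Zakai machinery; the paper's contribution is the analysis of the Rényi-divergence choice in particular.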
Original language | English |
---|---|
Article number | 7124492 |
Pages (from-to) | 4293-4315 |
Number of pages | 23 |
Journal | IEEE Transactions on Information Theory |
Volume | 61 |
Issue number | 8 |
DOIs | |
State | Published - Aug 2015 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 1963-2012 IEEE.
Keywords
- Joint source-channel coding
- Rényi divergence
- Ziv-Zakai
- finite blocklength