Convolutional codes with unequal protection of the information bits are investigated. Lower bounds on the free distance of time-varying codes are derived and compared with previously known bounds. The asymptotic behavior of these bounds shows that significant gains for the more significant data are attainable by enlarging the corresponding constraint length, at the cost of reduced performance for the less significant data.
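The central trade-off above rests on the fact that the free distance of a convolutional code grows with its constraint length. As an illustration only (not taken from the paper), the sketch below computes the free distance of a rate-1/2 feedforward convolutional code by a uniform-cost search over its trellis: it follows the lowest-weight path that diverges from the all-zero state and returns to it. The function name `free_distance` and the octal generator convention are assumptions for this example.

```python
import heapq

def free_distance(g_octal, K):
    """Free distance of a rate-1/2 feedforward convolutional code,
    found by uniform-cost search over the trellis.
    g_octal: pair of generator polynomials in octal notation.
    K: constraint length (shift register spans K input bits)."""
    gens = [int(str(g), 8) for g in g_octal]

    def step(state, bit):
        # Register contents: current input bit followed by K-1 past bits.
        reg = (bit << (K - 1)) | state
        # Hamming weight of the two output bits (parity of tapped positions).
        out = sum(bin(reg & g).count("1") % 2 for g in gens)
        return reg >> 1, out  # next state drops the oldest bit

    # Diverge from the all-zero path with a single input 1.
    s0, w0 = step(0, 1)
    best = {s0: w0}
    pq = [(w0, s0)]
    while pq:
        w, s = heapq.heappop(pq)
        if s == 0:
            return w  # first remerge with the all-zero path: free distance
        if w > best.get(s, float("inf")):
            continue
        for bit in (0, 1):
            ns, dw = step(s, bit)
            if w + dw < best.get(ns, float("inf")):
                best[ns] = w + dw
                heapq.heappush(pq, (w + dw, ns))
```

For example, the standard K=3 code with generators (7, 5) has free distance 5, while the K=7 code with generators (171, 133) reaches free distance 10, illustrating the gain available from a longer constraint length.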
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
State: Published - 2000