Summary form only given. Several new results on binary (d,k) codes are presented, beginning with a new derivation of the capacity of these codes. The derivation treats a code sequence as a concatenation of phrases, each phrase consisting of a run of zeros followed by a one; phrases have minimum length d + 1 and maximum length k + 1. The information rate is derived using information-theoretic inequalities, and it is proven that the rate is maximized when the phrases are statistically independent and chosen according to a specific distribution. On the basis of this result, the spectrum of a (d,k) code is computed. Finally, the problem of computing the capacity of the binary symmetric channel under the constraint that the input sequences satisfy the (d,k) constraint is considered, and lower bounds on the capacity of such a channel are derived.
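As a minimal numerical sketch of the capacity described above: with phrase lengths ranging over d + 1, ..., k + 1, the noiseless capacity is C = log2(λ), where λ is the largest real root of Σ_{l=d+1}^{k+1} λ^{-l} = 1 (the standard Shannon capacity of a constrained sequence built from independent phrases; the function name below is illustrative, not from the paper).

```python
import math

def dk_capacity(d, k, tol=1e-12):
    """Noiseless capacity of a binary (d,k)-constrained code, in bits per symbol.

    C = log2(lambda), where lambda is the largest real root of
    sum_{l = d+1}^{k+1} lambda**(-l) = 1, the characteristic equation
    for phrases of lengths d+1 through k+1 (a run of zeros ending in a one).
    """
    def f(x):
        # Decreasing in x on (1, infinity); root gives the growth rate lambda.
        return sum(x ** -l for l in range(d + 1, k + 2)) - 1.0

    lo, hi = 1.0, 2.0  # f(1) >= 0 and f(2) < 0 for valid 0 <= d <= k
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2(lo)

# Examples: well-known RLL constraints used in magnetic recording.
print(round(dk_capacity(1, 3), 4))  # (1,3) constraint, approx 0.5515
print(round(dk_capacity(2, 7), 4))  # (2,7) constraint, approx 0.5174
```

The maxentropic phrase distribution the abstract refers to assigns probability λ^{-l} to a phrase of length l, which is what makes the rate achieve this capacity.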
Published - 1986