On the Entropy of a Noisy Function

Alex Samorodnitsky

Research output: Contribution to journal › Article › peer-review


Abstract

Let 0 < ϵ < 1/2 be a noise parameter, and let T_ϵ be the noise operator acting on functions on the Boolean cube {0,1}^n. Let f be a nonnegative function on {0,1}^n. We upper bound the entropy of T_ϵ f by the average entropy of conditional expectations of f, given sets of roughly (1−2ϵ)²·n variables. In information-theoretic terms, we prove the following strengthening of Mrs. Gerber's Lemma: let X be a random binary vector of length n, and let Z be a noise vector corresponding to a binary symmetric channel with crossover probability ϵ. Then, setting v = (1−2ϵ)²·n, we have, up to lower-order terms,

H(X ⊕ Z) ≥ n·H₂(ϵ + (1−2ϵ)·H₂⁻¹(E_{|B|=v} H(X_B) / v)),

where H₂ is the binary entropy function, H₂⁻¹ is its inverse on [0, 1/2], X_B = {X_i : i ∈ B}, and the expectation is over uniformly random subsets B of size v. Assuming ϵ ≥ 1/2 − δ for some absolute constant δ > 0, this inequality, combined with a strong version of a theorem of Friedgut et al. due to Jendrej et al., shows that if a Boolean function f is close to the characteristic function g of a subcube of dimension n−1, then the entropy of T_ϵ f is at most that of T_ϵ g. Taken together with a recent result of Ordentlich et al., this shows that the most informative Boolean function conjecture of Courtade and Kumar holds for high noise ϵ ≥ 1/2 − δ: if X is uniformly distributed in {0,1}^n and Y is obtained by flipping each coordinate of X independently with probability ϵ, then, provided ϵ ≥ 1/2 − δ, every Boolean function f satisfies I(f(X); Y) ≤ 1 − H₂(ϵ).
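To make the final bound concrete, here is a minimal Python sketch (not from the paper) that brute-forces I(f(X); Y) over all Boolean functions on a small cube and compares the maximum against the bound 1 − H₂(ϵ). The helper names H2 and mutual_information are ours; the only modeling assumption is the setup above, namely X uniform on {0,1}^n and Y obtained from X through a binary symmetric channel with crossover probability ϵ.

```python
import math

def H2(p):
    """Binary entropy in bits; returns 0 at the endpoints."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def mutual_information(f, eps, n):
    """I(f(X); Y) in bits, for X uniform on {0,1}^n and Y = X xor Z,
    where Z has i.i.d. Bernoulli(eps) coordinates.

    f is a list of length 2^n with f[x] in {0, 1}, x read as a bitmask.
    """
    N = 1 << n
    # H(f(X)): f(X) is Bernoulli(p1) with p1 = |f^{-1}(1)| / 2^n.
    p1 = sum(f) / N
    # H(f(X) | Y): Y is uniform by symmetry, so average H(f(X) | Y = y) over y.
    h_cond = 0.0
    for y in range(N):
        # P(X = x | Y = y) = eps^d (1 - eps)^(n - d), d = Hamming distance(x, y).
        q1 = 0.0
        for x in range(N):
            if f[x]:
                d = bin(x ^ y).count("1")
                q1 += (eps ** d) * ((1.0 - eps) ** (n - d))
        h_cond += H2(q1) / N
    return H2(p1) - h_cond

n, eps = 3, 0.45          # small cube, high-noise regime (eps close to 1/2)
N = 1 << n
best = max(
    mutual_information([(mask >> x) & 1 for x in range(N)], eps, n)
    for mask in range(1 << N)  # all 2^(2^n) Boolean functions on {0,1}^n
)
dictator = [x & 1 for x in range(N)]  # f(x) = x_1
print(f"max_f I(f(X);Y) = {best:.6f}")
print(f"dictator        = {mutual_information(dictator, eps, n):.6f}")
print(f"1 - H2(eps)     = {1 - H2(eps):.6f}")
```

For ϵ this close to 1/2, the printed maximum should coincide with the dictator value 1 − H₂(ϵ), illustrating on a toy instance the high-noise regime in which the paper settles the conjecture.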

Original language: English
Article number: 7498615
Pages (from-to): 5446-5464
Number of pages: 19
Journal: IEEE Transactions on Information Theory
Volume: 62
Issue number: 10
DOIs
State: Published - Oct 2016
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Boolean functions
  • extremal inequality
  • mutual information
