
Lower Bounds on Mutual Information for Linear Codes Transmitted over Binary Input Channels, and for Information Combining

Authors:
Uri Erez, Or Ordentlich, Shlomo Shamai
Keywords:
Computer Science, Information Theory (cs.IT)
journal:
--
date:
2024-01-26
Abstract
It has been known for a long time that the mutual information between the input sequence and output of a binary symmetric channel (BSC) is upper bounded by the mutual information between the same input sequence and the output of a binary erasure channel (BEC) with the same capacity. Recently, Samorodnitsky discovered that one may also lower bound the BSC mutual information in terms of the mutual information between the same input sequence and a more capable BEC. In this paper, we strengthen Samorodnitsky's bound for the special case where the input to the channel is distributed uniformly over a linear code. Furthermore, for a general (not necessarily binary) input distribution $P_X$ and channel $W_{Y|X}$, we derive a new lower bound on the mutual information $I(X;Y^n)$ for $n$ transmissions of $X\sim P_X$ through the channel $W_{Y|X}$.
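The classical upper bound mentioned in the abstract can be sanity-checked numerically in the simplest setting of a single input bit. The sketch below (an illustration, not from the paper; the helper names are my own) compares $I(X;Y)$ for a Bernoulli($q$) input through a BSC with crossover $p$ against a BEC whose erasure probability $\epsilon = h(p)$ is chosen so that both channels have capacity $1 - h(p)$:

```python
import math

def h2(x):
    # Binary entropy in bits; h2(0) = h2(1) = 0 by convention.
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def bsc_mi(q, p):
    # I(X;Y) for X ~ Bern(q) over BSC(p): h(q * p) - h(p),
    # where q * p = q(1-p) + (1-q)p is the binary convolution.
    qp = q * (1 - p) + (1 - q) * p
    return h2(qp) - h2(p)

def bec_mi(q, eps):
    # I(X;Y) for X ~ Bern(q) over BEC(eps): (1 - eps) * h(q).
    return (1 - eps) * h2(q)

p = 0.11
eps = h2(p)  # equal capacities: 1 - h(p) = 1 - eps
for q in [0.05, 0.1, 0.25, 0.5]:
    assert bsc_mi(q, p) <= bec_mi(q, eps) + 1e-12
```

For $q = 1/2$ the two mutual informations coincide with the common capacity; for biased inputs the BEC value strictly dominates, consistent with the BEC being the most capable channel of a given capacity.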