Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet $X = \{0, 1\}$ with probabilities $\left\{\dfrac{1}{2}, \dfrac{1}{2}\right\}$. The channel matrix is $\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$, where $p$ is the transition probability. The conditional entropy is then:
- $1$
- $-p \log(p)-(1-p) \log(1-p)$
- $1+p \log(p)+(1-p) \log(1-p)$
- $0$
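
A sketch of the reasoning, assuming the question asks for $H(Y \mid X)$, the conditional entropy of the output given the input: each row of the channel matrix is a distribution with entropy $H(p) = -p \log(p) - (1-p) \log(1-p)$, so averaging over the equiprobable inputs gives

$$
H(Y \mid X) = \sum_{x \in \{0,1\}} P(x)\, H(Y \mid X = x) = \tfrac{1}{2} H(p) + \tfrac{1}{2} H(p) = -p \log(p) - (1-p) \log(1-p),
$$

which corresponds to the second option. By symmetry the output is also equiprobable, so $H(X \mid Y) = H(X) - I(X; Y) = 1 - \bigl(1 - H(p)\bigr) = H(p)$ takes the same value.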