Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet $X=\{0,1\}$ with probabilities $\{\frac{1}{2}, \frac{1}{2}\}$. The channel matrix is $\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$, where $p$ is the transition probability. Then the conditional entropy $H(Y|X)$ is given by:

1. $1$
2. $-p\log(p)-(1-p)\log(1-p)$
3. $1+p\log(p)+(1-p)\log(1-p)$
4. $0$


The answer is option 2: $H(Y|X) = -p\log(p)-(1-p)\log(1-p)$.

Each row of the channel matrix has the same entropy $H(p) = -p\log(p)-(1-p)\log(1-p)$, so $H(Y|X) = \sum_x P(x)\,H(p) = H(p)$, regardless of the input distribution.
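A quick numerical check, sketched in Python (function names are my own, not from the question): compute $H(Y|X)$ directly from the channel matrix and compare it with the binary entropy $H(p)$.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention 0*log(0) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conditional_entropy_bsc(p, px0=0.5):
    """H(Y|X) = sum over x of P(x) * H(P(Y|X=x)) for the BSC."""
    rows = [(1 - p, p), (p, 1 - p)]   # channel matrix: rows are P(y|x)
    priors = [px0, 1 - px0]           # input distribution P(x)
    h = 0.0
    for prior, row in zip(priors, rows):
        h += prior * sum(-q * math.log2(q) for q in row if q > 0)
    return h

# For any p, H(Y|X) coincides with the binary entropy of p,
# and it does not depend on the input distribution.
p = 0.1
print(conditional_entropy_bsc(p), binary_entropy(p))
```

Because both rows of the matrix are permutations of $(1-p, p)$, the weighted sum collapses to $H(p)$ for any prior, which is why option 2 holds even for non-uniform inputs.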
