
Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet $X=\{0,1\}$ with associated probabilities $\left\{\dfrac{1}{2}, \dfrac{1}{2}\right\}$. The channel matrix is $\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$, where $p$ is the transition (crossover) probability. Then the conditional entropy is given by:

  1. $1$
  2. $-p \log(p)-(1-p) \log(1-p)$
  3. $1+p \log(p)+(1-p) \log(1-p)$
  4. $0$

1 Answer

Answer: Option 2.

For the binary symmetric channel, each row of the channel matrix is the distribution $(1-p,\ p)$ (in some order), so for either input symbol $H(Y \mid X = x) = -p \log(p) - (1-p) \log(1-p)$. Averaging over the equiprobable input gives
$$H(Y \mid X) = \sum_x P(x)\, H(Y \mid X = x) = -p \log(p) - (1-p) \log(1-p).$$
(With the uniform input the output is also uniform, so $H(Y) = 1$ and the mutual information is $I(X;Y) = H(Y) - H(Y \mid X) = 1 + p \log(p) + (1-p) \log(1-p)$, which is option 3.)
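As an illustrative sanity check (not part of the original question), here is a short Python sketch that evaluates $H(Y \mid X)$ directly from the channel matrix and confirms it equals the binary entropy $H(p)$; the helper names `binary_entropy` and `bsc_conditional_entropy` are just assumed for this example:

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H(p) in bits, using the convention 0*log(0) = 0."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_conditional_entropy(p, input_dist=(0.5, 0.5)):
    """H(Y|X) for a binary symmetric channel with crossover probability p."""
    channel = np.array([[1 - p, p],
                        [p, 1 - p]])
    # H(Y|X) = sum_x P(x) * H(Y | X = x); each row of the matrix has entropy H(p).
    row_entropies = [binary_entropy(row[1]) for row in channel]
    return float(np.dot(input_dist, row_entropies))

if __name__ == "__main__":
    p = 0.1
    print(bsc_conditional_entropy(p))  # ~0.4690 bits
    print(binary_entropy(p))           # same value: H(Y|X) = H(p)
```

Because every row of the channel matrix has the same entropy, $H(Y \mid X) = H(p)$ regardless of the input distribution; the uniform input only matters for computing $H(Y)$ and the mutual information.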
