Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet $X=\{0,1\}$ with associated probabilities $\left\{\frac{1}{2}, \frac{1}{2}\right\}$. The channel matrix is $\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$, where $p$ is the transition probability. Then the conditional entropy is given by:

  1. $1$
  2. $-p\log(p)-(1-p)\log(1-p)$
  3. $1+p\log(p)+(1-p)\log(1-p)$
  4. $0$

1 Answer

Ans is B: $-p\log(p)-(1-p)\log(1-p)$

For the BSC, $H(Y\mid X)=\sum_x P(x)\,H(Y\mid X=x)$. Each row of the channel matrix contains the probabilities $1-p$ and $p$, so $H(Y\mid X=x)=-p\log(p)-(1-p)\log(1-p)$ for both inputs, and therefore $H(Y\mid X)=-p\log(p)-(1-p)\log(1-p)$, regardless of the input probabilities.
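
As a quick numerical check, here is a minimal Python sketch (function names are illustrative, not from the question) that computes $H(Y\mid X)$ directly from the channel matrix and confirms it matches the binary entropy expression in option B:

```python
import numpy as np

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with 0*log(0) taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def conditional_entropy(channel, p_x):
    """H(Y|X) = sum over x of P(x) * H(Y | X=x) for a row-stochastic channel matrix."""
    h = 0.0
    for px, row in zip(p_x, channel):
        h += px * sum(-q * np.log2(q) for q in row if q > 0)
    return h

p = 0.1                                    # transition (crossover) probability
channel = np.array([[1 - p, p],
                    [p, 1 - p]])           # BSC channel matrix
p_x = [0.5, 0.5]                           # equiprobable input source

print(conditional_entropy(channel, p_x))   # ~0.4690
print(binary_entropy(p))                   # same value: option B
```

Because both rows of the BSC matrix have the same entropy, the result does not change if the input probabilities are altered.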
