Suppose that data are transmitted in blocks of sizes 1000 bits. What is the maximum
error rate under which error detection and retransmission mechanism (1 parity bit per
block) is better than using Hamming code? Assume that bit errors are independent of
one another and no bit error occurs during retransmission.


In a Hamming code we require 10 parity bits, since

$2^P\geq P+M+1$ with $M=1000$ gives $P=10$ ($2^{10}=1024\geq 1011$, while $2^9=512<1010$)

A Hamming code requires only one transmission (it is a forward error correcting code): on detecting a 1-bit error, the receiver corrects it itself.

So we're transmitting 1010 bits per block (1000 message bits + 10 parity bits).
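The parity-bit count above can be checked with a short sketch (the function name is my own):

```python
def hamming_parity_bits(m: int) -> int:
    """Smallest p satisfying the Hamming condition 2**p >= p + m + 1."""
    p = 0
    while 2 ** p < p + m + 1:
        p += 1
    return p

print(hamming_parity_bits(1000))  # 10, so 1000 + 10 = 1010 bits per block
```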

In the case of error detection and retransmission, let the error rate be $x$ per bit, i.e. the probability that a bit gets corrupted is $x$.

With error detection and retransmission, we transmit 1001 bits per block (1000 message bits + 1 parity bit).

For a block of 1000 bits there are, on average, $1000x$ bit errors.

For example, if $x=0$, i.e. the probability of a bit getting corrupted is nil, then no retransmission is required.

If $x=0.1$, then on average $0.1\times 1000=100$ bits per block are inverted, so there are 100 retransmissions plus the initial transmission, and we have to transmit $1001+100\times 1001$ bits.

If $x=1$, then all 1000 bits are inverted and there are 1000 retransmissions of 1001 bits each.

For every bit error in a block, 1001 bits are retransmitted, so there are $1000x$ retransmissions of 1001 bits each, plus the initial transmission of 1001 bits.

So we're transmitting $1001+1000x\times 1001$ bits.

If error detection and retransmission is better than the Hamming code, then

$1001+1000x\times 1001 < 1010$

$x < \frac{9}{1001000} \approx 9\times 10^{-6}$

Thus, a bit should have a probability of less than $9\times 10^{-6}$ of getting inverted for retransmission to be the better choice.
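The threshold can be computed directly from the cost model above (a sketch; the function names are my own):

```python
def retransmission_cost(x: float, m: int = 1000) -> float:
    """Expected bits per block: one initial transmission of m+1 bits,
    plus on average m*x retransmissions of m+1 bits each."""
    return (m + 1) + m * x * (m + 1)

HAMMING_COST = 1010  # 1000 message bits + 10 parity bits, sent once

# Solve 1001 + 1000*x*1001 < 1010  for x:
threshold = (HAMMING_COST - 1001) / (1000 * 1001)
print(threshold)  # ~8.99e-6, i.e. roughly 9 * 10^-6
```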
by Loyal (5.2k points)