
For detecting a single-bit error using CRC, we need $x^{i}$ to not be divisible by $g(x)$. So we make $g(x)$ have at least 2 terms, which makes a single-term $e(x)$ indivisible. But then what is the logic behind keeping the MSB as 1? Isn't keeping $g(x)$ with any 2 terms enough to make any single-bit error indivisible?

For example, $x^{3}+x^{2}$ is guaranteed to detect a single-bit error at any position. Is it not?

$x^{3}+x^{2}$ does not guarantee detection of a single-bit error at every position, because only one condition is satisfied, i.e. $g(x)$ having at least 2 terms; the condition that the MSB be 1 must also be satisfied.

### 1 comment

You are going wrong somewhere.

Yes, $x^{3} + x^{2}$ is very much able to detect any single-bit error, even without imposing the MSB-as-1 condition separately.

The thing is that, if we look at it, this also contains 1 as the MSB, as shown:

$x^{3} + x^{2} = x^{2}(x + 1)$

So, all in all, the doubt still stands na... if every two-term $g(x)$ is guaranteed to have 1 as the MSB after taking out the common factor, what's the point of specifically requiring 1 as the MSB?
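The claim that $x^{3}+x^{2}$ catches every single-bit error can be brute-force checked. A minimal sketch, using a hypothetical `gf2_mod` helper (not from the thread) that does GF(2) polynomial long division on bit masks, where bit $k$ is the coefficient of $x^{k}$:

```python
def gf2_mod(dividend: int, divisor: int) -> int:
    """Remainder of polynomial division over GF(2); bit k = coefficient of x^k."""
    while dividend.bit_length() >= divisor.bit_length():
        # XOR-subtract the divisor aligned to the dividend's leading term
        dividend ^= divisor << (dividend.bit_length() - divisor.bit_length())
    return dividend

g = 0b1100  # g(x) = x^3 + x^2

# Every single-bit error e(x) = x^i leaves a nonzero remainder, so it is detected.
assert all(gf2_mod(1 << i, g) != 0 for i in range(64))

# But the two-bit error pattern equal to g(x) itself divides evenly and is missed.
assert gf2_mod(0b1100, g) == 0
```

(The check only covers single-bit errors; the other conditions usually placed on $g(x)$ matter for multi-bit and burst error patterns, not for this case.)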