For a CRC to detect every single-bit error, $x^{i}$ must not be divisible by $g(x)$ for any $i$. So we give $g(x)$ at least 2 terms, which makes every single-term $e(x)$ indivisible. But then what is the logic behind requiring the MSB to be 1? Isn't keeping any 2 terms in $g(x)$ enough to make every single-bit error indivisible?
For example, $x^{3}+x^{2}$ is guaranteed to detect a single-bit error at any position. Is it not?
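To sanity-check my claim, here is a small sketch (my own helper, not from any CRC library) that does GF(2) polynomial long division with polynomials encoded as integers (bit $k$ = coefficient of $x^{k}$) and verifies that $x^{i} \bmod (x^{3}+x^{2}) \neq 0$ for many values of $i$:

```python
def gf2_mod(dividend: int, divisor: int) -> int:
    """Remainder of polynomial division over GF(2) (XOR-based long division)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        # Cancel the current leading term by XORing in a shifted copy of the divisor.
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

g = 0b1100  # g(x) = x^3 + x^2, a two-term generator without the x^0 term

# Every single-bit error polynomial e(x) = x^i should leave a nonzero remainder:
print(all(gf2_mod(1 << i, g) != 0 for i in range(64)))  # prints True
```

So divisibility alone does not seem to force the low-order term; this is what makes me wonder whether the usual MSB/LSB conventions serve some other purpose.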