Ethernet uses CSMA/CD.
In CSMA/CD, when a collision is detected, the involved stations must immediately stop transmitting, so as not to waste bandwidth and time — the collided frames are garbled irretrievably anyway.
Any station that wants to transmit a frame, or is currently transmitting one, must continuously sense the channel — to detect whether the channel is idle, and to detect collisions, respectively.
The stations keep sensing the channel while transmitting. When the "sensed" data does not match the data they are sending, they know a collision has occurred. But sometimes this comparison fails — for example, when the colliding signal is 0 volts, the superposition on the wire looks just like the station's own data, so the collision cannot be detected this way. Hence a jamming signal (a noise burst of up to 48 bits) is sent, which unambiguously announces the occurrence of the collision to every station.
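A minimal Python sketch of this send-while-sensing loop, assuming a toy channel where the wire carries the logical OR of everyone's bits (the real medium is analogue, and `sense`, `transmit`, and `JAM_BITS` are illustrative names, not real NIC APIs):

```python
JAM_BITS = 48  # jamming burst length, per classic Ethernet

def sense(sent_bit, other_bits):
    """Toy channel: the wire carries the OR of all bits driven onto it."""
    wire = sent_bit
    for b in other_bits:
        wire |= b
    return wire

def transmit(frame_bits, other_senders_bits):
    """Send a frame while listening; abort and jam on the first mismatch."""
    for i, bit in enumerate(frame_bits):
        others = [s[i] for s in other_senders_bits if i < len(s)]
        if sense(bit, others) != bit:   # sensed data != sent data: collision
            return False, [1] * JAM_BITS
    return True, None

# Two stations start at once: our 0-bit is overwritten by the other's 1-bit,
# so we detect the collision immediately and jam.
ok, jam = transmit([0, 1, 0, 0], other_senders_bits=[[1, 1, 1, 1]])
print(ok, len(jam) if jam else 0)   # -> False 48
```

Note that in this OR model a station sending all 1s never sees a mismatch — the same blind spot as the 0-volt case above, which is exactly why the explicit jamming signal is needed.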
Options A, B, and C are ruled out.
Any algorithm meant to minimise collisions obviously reduces the chances of collisions. :P
So, D is correct.
Well, the binary exponential back-off algorithm makes each colliding station choose a random number between $0$ and $2^i - 1$, where $i$ is the number of collisions experienced so far. This random number is the number of slots the station will skip before retransmitting.
Practically, since the randomisation interval doubles after every collision, the chance of a repeat collision falls off quickly — with two contenders, the probability of colliding again after the $i$-th collision is $2^{-i}$, so after $i = 2$ the chances are already small.
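A minimal Python sketch of the back-off choice; the 51.2 µs slot time, the cap of the exponent at 10, and giving up after 16 attempts follow classic 10 Mbps Ethernet (802.3) and go slightly beyond what the answer above states:

```python
import random

SLOT_TIME = 51.2e-6   # one contention slot in classic 10 Mbps Ethernet (s)
BACKOFF_CAP = 10      # the interval stops growing after 10 collisions
MAX_ATTEMPTS = 16     # classic Ethernet drops the frame after 16 collisions

def backoff_slots(i):
    """After the i-th collision, pick a random wait in [0, 2^i - 1] slots."""
    if i > MAX_ATTEMPTS:
        raise RuntimeError("16 collisions in a row: frame dropped")
    return random.randint(0, 2 ** min(i, BACKOFF_CAP) - 1)

# Example: how long one station might wait after its first few collisions.
for i in range(1, 5):
    k = backoff_slots(i)
    print(f"collision {i}: skip {k} slots ({k * SLOT_TIME * 1e6:.1f} us)")
```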