The sender must be able to detect a collision before it finishes transmitting a frame.
So the minimum frame length must be chosen such that any collision is detected before the frame completely leaves the sender.
Now, the worst case for collision detection is when the first bit of the frame is about to reach the receiver and the receiver itself starts transmitting. A collision occurs, a jam signal is produced, and this signal must travel back to the sender.
The time for this is: time for the start of the frame to reach the receiver $+$ time for the jam signal to travel back to the sender $+$ transmission time for the jam signal.
(We do not need to include the transmission time of the frame, because the receiver detects it as soon as its first bit arrives.) Time for the start of the frame to reach the receiver $+$ time for the jam signal to reach the sender $=$ round-trip propagation delay $= 46.4\ \mu s$. So,
$46.4\ \mu s + \dfrac{48\text{ bits}}{10\text{ Mbps}} = 46.4\ \mu s + 4.8\ \mu s = 51.2\ \mu s.$
Now, the frame length must be chosen so that its transmission time is at least $51.2\ \mu s$.
So, minimum frame length $= 51.2\times 10^{-6}\times 10\times 10^{6}=512\text{ bits}$.
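The arithmetic above can be sketched as a small helper (a minimal sketch; the function name `min_frame_bits` is my own, not part of any standard):

```python
def min_frame_bits(rtt_s, rate_bps, jam_bits=48):
    """Minimum frame length (bits) so the sender is still transmitting
    when the jam signal gets back to it."""
    # Sender must keep transmitting for: round-trip propagation delay
    # + transmission time of the jam signal.
    critical_time_s = rtt_s + jam_bits / rate_bps
    # Equivalently: rtt_s * rate_bps + jam_bits
    return critical_time_s * rate_bps

# Classic 10 Mbps Ethernet with a 46.4 microsecond round-trip delay:
print(round(min_frame_bits(46.4e-6, 10e6)))  # -> 512
```

This reproduces the $512$-bit minimum: $464$ bits covering the round-trip delay plus $48$ jam bits.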
A reference question from Peterson Davie:
43. Suppose the round-trip propagation delay for Ethernet is $46.4 \mu s$. This yields a minimum packet size of $512$ bits $(464$ bits corresponding to propagation delay $+ 48$ bits of jam signal$).$
- What happens to the minimum packet size if the delay time is held constant, and the signaling rate rises to $100$ Mbps?
- What are the drawbacks to so large a minimum packet size?
- If compatibility were not an issue, how might the specifications be written so as to permit a smaller minimum packet size?
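For the first bullet, the same formula from the derivation above can be applied with the rate raised tenfold (a sketch under the document's assumptions; `min_frame_bits` is my own name):

```python
def min_frame_bits(rtt_s, rate_bps, jam_bits=48):
    # Minimum bits = round-trip delay expressed in bits at this rate,
    # plus the fixed-size jam signal.
    return rtt_s * rate_bps + jam_bits

# Same 46.4 microsecond round-trip delay, signaling rate now 100 Mbps:
print(round(min_frame_bits(46.4e-6, 100e6)))  # -> 4688
```

The propagation-delay contribution scales with the rate (from $464$ to $4640$ bits) while the $48$ jam bits stay fixed, so the minimum packet size grows nearly tenfold.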
Another reference on why the jam-signal bits must be included in the minimum frame size:
Can the source detect the collision without receiving the full jam signal (e.g., by a change in current)?
Probably yes. But to be safe against signal loss, the source waits for the entire jam signal. See below link
Correct Answer: D.