Suppose the round trip propagation delay for a $10\text{ Mbps}$ Ethernet having $48\text{-bit}$ jamming signal is $46.4\ \mu s$. The minimum frame size is:

1. $94$
2. $416$
3. $464$
4. $512$

How are the jamming signal and the length of the packet related?

$T_t \ge 2 \times T_p$

so $L \ge 2 \times T_p \times \text{Bandwidth}$
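A quick sanity check of this relation in Python, plugging in the values from the question (RTT $= 46.4\ \mu s$, bandwidth $= 10\text{ Mbps}$):

```python
# CSMA/CD condition: Tt >= 2*Tp, i.e. L / B >= RTT, so L >= RTT * B
rtt = 46.4e-6        # round trip propagation delay, in seconds
bandwidth = 10e6     # 10 Mbps, in bits per second

min_frame_bits = rtt * bandwidth
print(round(min_frame_bits))  # 464
```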

The jam signal or jamming signal is a signal that carries a 32-bit binary pattern sent by a data station to inform the other stations of the collision and that they must not transmit.

This means the sender already knows a collision has occurred; only then does it send the jamming signal to the other stations.

Our goal here is to let the sender detect the collision, which it has already done before sending the jamming signal.

So don't count the jamming-signal time while calculating the frame size.

Frame size must be $= 464$ bits.

Given RTT $= 46.4\ \mu s$ and bandwidth $= 10\text{ Mbps}$.
The round trip propagation delay is RTT $= 2 \times T_p$.
The minimum frame size of Ethernet can be found using the formula $T_t = 2 \times T_p$.
Let $L$ be the minimum frame size. Then $L / 10\text{ Mbps} = 46.4\ \mu s$,
so $L = 464$ bits.
It has nothing to do with the jamming signal.
$T_t \ge 2 \times T_d + \text{time for sending the jamming signal}$

RTT $= 2 \times T_d = 46.4\ \mu s$

Time for sending the jamming signal $= 48\text{ bits} / 10\text{ Mbps} = 4.8\ \mu s$

Now,

$T_t \ge 46.4\ \mu s + 4.8\ \mu s$

$T_t \ge 51.2\ \mu s$

$L \ge 51.2\ \mu s \times 10\text{ Mbps}$

$L \ge 512$ bits
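The arithmetic in this answer can be checked with a short sketch (using the RTT of $46.4\ \mu s$ given in the question):

```python
rtt = 46.4e-6        # round trip propagation delay, in seconds
bandwidth = 10e6     # 10 Mbps, in bits per second
jam_bits = 48        # jamming signal length from the question

jam_time = jam_bits / bandwidth       # 4.8 microseconds
min_tt = rtt + jam_time               # 51.2 microseconds
min_frame_bits = min_tt * bandwidth   # minimum frame length in bits
print(round(min_frame_bits))  # 512
```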
Hey, first read the question. I don't know what you all are trying to do, but the simple solution is this:

To detect a collision on a $10\text{ Mbps}$ Ethernet (the general case), the minimum frame size must be $64$ bytes ($512$ bits), and the maximum is $1518$ bytes ($12144$ bits, theoretically).

So the answer is (D) $512$.
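The byte-to-bit conversions quoted in this answer are easy to verify:

```python
# Standard Ethernet frame-size limits, as stated in the answer above
MIN_FRAME_BYTES = 64
MAX_FRAME_BYTES = 1518

print(MIN_FRAME_BYTES * 8)  # 512 bits
print(MAX_FRAME_BYTES * 8)  # 12144 bits
```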

Please correct the option below; the answer is C.

$\text{Transmission time} \ge 2 \times \text{Propagation time}$

Length $= 46.4 \times 10 = 464$ bits.