77 votes

Suppose the round-trip propagation delay for a $10\text{ Mbps}$ Ethernet having a $48\text{-bit}$ jamming signal is $46.4\ \mu s$. The minimum frame size (in bits) is:

  1. $94$
  2. $416$
  3. $464$
  4. $512$

2 Comments

How are the jamming signal and the packet length related?

$T_t \geq 2 \times T_p$

so, $L \geq 2 \times T_p \times \text{Bandwidth}$
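
For example, with the numbers from this question: $T_p = \dfrac{46.4}{2} = 23.2\ \mu s$, so $L \geq 2 \times 23.2\ \mu s \times 10\text{ Mbps} = 464$ bits, before adding the $48$-bit jam signal.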

Source: Forouzan 

 


8 Answers

129 votes
Best answer

The sender must be able to detect a collision before completely sending a frame.
So, the minimum frame length must be such that any collision is detected before the frame completely leaves the sender.

Now, the worst case for collision detection is when the start of the frame is about to reach the receiver and the receiver starts sending: a collision happens, a jam signal is produced, and this jam signal must travel back to the sender.

The time for this will be the time for the start of the frame to reach near the receiver $+$ time for the jam signal to reach the sender $+$ transmission time for the jam signal.

(We do not need to include the transmission time of the frame, because as soon as its first bit arrives, the receiver will have detected it.) Time for the start of the frame to reach the receiver $+$ time for the jam signal to reach the sender $=$ round-trip propagation delay $= 46.4\ \mu s$. So,

$46.4\ \mu s + \dfrac{48\text{ bits}}{10\text{ Mbps}} = 46.4\ \mu s + 4.8\ \mu s = 51.2\ \mu s$

Now, the frame length must be such that its transmission time is at least $51.2\ \mu s$.

So, minimum frame length$= 51.2\times 10^{-6}\times 10\times 10^6=512\text{ bits}$.
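
A minimal Python sketch of the same arithmetic (the function and parameter names here are illustrative, not from any networking library):

```python
def min_frame_bits(rtt_s, bandwidth_bps, jam_bits):
    """Minimum frame size so that transmission time >= RTT + jam-signal time."""
    jam_time_s = jam_bits / bandwidth_bps     # 48 bits / 10 Mbps = 4.8 us
    detect_window_s = rtt_s + jam_time_s      # 46.4 us + 4.8 us = 51.2 us
    return detect_window_s * bandwidth_bps    # 51.2 us * 10 Mbps = 512 bits

print(round(min_frame_bits(rtt_s=46.4e-6, bandwidth_bps=10e6, jam_bits=48)))  # 512
```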

A reference question from Peterson & Davie (a numeric check of part 1 follows the list):

43.  Suppose the round-trip propagation delay for Ethernet is $46.4 \mu s$. This yields a minimum packet size of $512$ bits $(464$ bits corresponding to propagation delay $+ 48$ bits of jam signal$).$

  1. What happens to the minimum packet size if the delay time is held constant, and the signaling rate rises to $100$ Mbps?
  2. What are the drawbacks to so large a minimum packet size?
  3. If compatibility were not an issue, how might the specifications be written so as to permit a smaller minimum packet size?
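
Assuming the same formula holds, part 1 can be checked numerically by reusing the `min_frame_bits` sketch above with only the rate changed:

```python
# Part 1: delay held at 46.4 us, signaling rate raised to 100 Mbps.
# Propagation now costs 4640 bit times, plus the 48-bit jam signal.
print(round(min_frame_bits(rtt_s=46.4e-6, bandwidth_bps=100e6, jam_bits=48)))  # 4688
```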

Another reference that requires the jam-signal bits to be included in the minimum frame size.

Can a collision be detected by the source without receiving the full jam signal (by a change in current)?

Probably yes. But to be safe (against signal loss) the source waits for the entire jam signal. See the link below.

Correct Answer: D.


4 Comments


That's why the minimum frame size is $64$ bytes ($512$ bits).


@Sachin Mittal 1, that's a wrong way of thinking. The sender may not detect the collision on the very first bit; the jamming signal must be on the medium long enough for the collision to be detected. In practice the sender can detect the collision anywhere within the $48$ bit times: it may be at $2$ bit times, $3$ bit times, or $28$ bit times. You can't guarantee that the collision will be detected on the very first bit.

20 votes

We know,

$\text{Frame size} \geq 2 \times \text{Bandwidth} \times T_p = \text{Bandwidth} \times \text{RTT}$

Hence,

$\text{Frame size} \geq 10 \times 10^6 \times 46.4 \times 10^{-6} = 464\text{ bits}$

But the question mentions that a $48$-bit jamming signal is present.

Hence,

$\text{Minimum frame size} = 464 + 48 = 512\text{ bits}$

Hence, the correct option is D.

2 votes
Ans: (C)

4 Comments

Are the transmission times for the frame and the jamming signal the same? You have calculated the transmission time for the $48$ bits, whereas we needed to do it for the frame.
If we consider a scenario in which A and B start transmitting data packets at the same time, then the collision will happen in the middle, i.e., after half of $T_p$ (propagation delay). The collision signal will then move toward both A and B, and when they detect the collision after another half of $T_p$, both will send a jam signal to inform the other stations on the Ethernet about the collision. So how can A and B depend on the jamming signal to detect the collision?
@Arjun sir, the noise created by the collision will reach the sender in $T_p$ time, so why does it need to wait for the complete jamming signal when it can detect the collision from the noise itself?
When the collision happens mid-way, both the sender and receiver learn about the collision from the noise and then send the jamming signal. Why can't we apply the same logic here?
2 votes
$\text{Transmission delay} \geq 2 \times \text{Propagation delay}$

$\dfrac{L}{\text{Bandwidth}} = 46.4\ \mu s$

$L = 46.4\ \mu s \times 10\text{ Mbps} = 464\text{ bits}$

Note (use of the jamming signal): If a collision is heard, both senders send a jam signal over the Ethernet. This jam signal indicates to all other devices on the segment that there has been a collision and that they should not send data onto the wire. After sending the jam signal, each sender waits a random amount of time (decided by the backoff algorithm) before starting the entire process over. The random wait helps ensure that the two devices don't transmit simultaneously again.
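
As a rough illustration of that random wait, here is a minimal Python sketch of binary exponential backoff; the slot time ($51.2\ \mu s$, i.e. $512$ bit times at $10$ Mbps) and the exponent cap of $10$ are the classic Ethernet values, stated here as assumptions:

```python
import random

SLOT_TIME_S = 51.2e-6  # one contention slot = 512 bit times at 10 Mbps

def backoff_delay(collisions):
    """Random wait (seconds) after the given number of consecutive collisions."""
    k = min(collisions, 10)              # classic Ethernet caps the exponent at 10
    slots = random.randint(0, 2**k - 1)  # pick uniformly from 0 .. 2^k - 1 slots
    return slots * SLOT_TIME_S

# After one collision a station waits 0 or 1 slot; the range doubles with each
# further collision, making a repeat simultaneous retransmission ever less likely.
print(backoff_delay(1))
```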
Answer: D
