77 votes

Suppose the round-trip propagation delay for a $10\text{ Mbps}$ Ethernet with a $48\text{-bit}$ jamming signal is $46.4\ \mu s$. The minimum frame size (in bits) is:

  1. $94$
  2. $416$
  3. $464$
  4. $512$

2 Comments

How are the jamming signal and the packet length related?

$T_t \ge 2T_p$

so, $L \ge 2 \times T_p \times \text{Bandwidth}$
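Plugging in the given numbers as a quick check (note that this bound omits the jam signal; the accepted answer below adds it):

$L \ge 46.4 \times 10^{-6}\ s \times 10 \times 10^{6}\text{ bps} = 464 \text{ bits}$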

Source: Forouzan 

 


8 Answers

129 votes
Best answer

The sender must be able to detect a collision before it finishes transmitting a frame.
So, the minimum frame length must be such that any collision is detected before the frame completely leaves the sender.

Now, the worst case for collision detection is when the start of the frame is about to reach the receiver and the receiver starts sending. A collision happens, a jam signal is produced, and this signal must travel back to the sender.

The time for this is the time for the start of the frame to reach the receiver $+$ the time for the jam signal to reach the sender $+$ the transmission time of the jam signal.

(We do not need to include the transmission time of the frame, because as soon as its first bit arrives, the receiver will have detected it.) Time for the start of the frame to reach the receiver $+$ time for the jam signal to reach the sender $=$ round-trip propagation delay $= 46.4\ \mu s$. So,

$46.4\ \mu s + \dfrac{48\text{ bits}}{10\text{ Mbps}} = 46.4\ \mu s + 4.8\ \mu s = 51.2\ \mu s$

Now, the frame length must be such that its transmission time is at least $51.2\ \mu s$.

So, minimum frame length$= 51.2\times 10^{-6}\times 10\times 10^6=512\text{ bits}$.
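The same calculation as a minimal Python sketch (the variable names are mine; the numbers are from the question):

```python
# Minimum Ethernet frame size: the frame must still be leaving the sender
# when the jam signal returns, so its transmission time must cover the
# round-trip propagation delay plus the jam signal's transmission time.
BANDWIDTH = 10e6   # 10 Mbps
RTT = 46.4e-6      # round-trip propagation delay, in seconds
JAM_BITS = 48      # length of the jam signal

jam_time = JAM_BITS / BANDWIDTH          # 4.8 microseconds
min_frame_time = RTT + jam_time          # 51.2 microseconds
min_frame_bits = min_frame_time * BANDWIDTH

print(f"{min_frame_bits:.0f} bits")      # -> 512 bits
```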

A reference question from Peterson Davie:

43. Suppose the round-trip propagation delay for Ethernet is $46.4\ \mu s$. This yields a minimum packet size of $512$ bits ($464$ bits corresponding to propagation delay $+$ $48$ bits of jam signal).

  1. What happens to the minimum packet size if the delay time is held constant, and the signaling rate rises to $100$ Mbps?
  2. What are the drawbacks to so large a minimum packet size?
  3. If compatibility were not an issue, how might the specifications be written so as to permit a smaller minimum packet size?
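For part 1, applying the same reasoning as the accepted answer above (my working, not the book's official solution): the round trip now holds $46.4\ \mu s \times 100\text{ Mbps} = 4640$ bits, so with the $48$-bit jam signal the minimum packet size grows to $4640 + 48 = 4688$ bits.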

Another reference for requiring jam signal bits to be included for minimum frame size.

Can a collision be detected by the source without receiving the full jam signal (by a change in current)?

Probably yes. But to be safe (against signal loss) the source waits for the entire jam signal. See the link below.

Correct Answer: D.


33 Comments

Why is the jam signal divided by $10$?
$48$ bits are transmitted at $10$ Mbps, so they take $\dfrac{48}{10 \times 10^{6}} = 4.8\ \mu s$.
Why are you including the jamming signal in the computation? Your reference does not fully support your hypothesis.

Page 21 of the PPT you gave us says: "The longest time between starting to transmit a frame and receiving the first bit of a jam sequence is twice the propagation delay from one end of the cable to the other. This means that a frame must have enough bits to last twice the propagation delay."

So we only need the first bit to reach the other end to detect the collision. Why add the entire transmission delay of the jamming signal? I don't understand!

Yes, you are correct as per that PPT. But this is given on page 162 here; I'm not sure of the reason behind it.

http://www.pdfiles.com/pdf/files/English/Networking/Computer_Networks_A_Systems_Approach.pdf

How can a question be a reference for something, I wonder. In a question we might assume anything!
@Akash, so should we consider the jamming signal time or not? What do you say?
@Akash See the question:

"Suppose the round-trip propagation delay for Ethernet is 46.4 µs. This yields a minimum packet size of 512 bits (464 bits corresponding to propagation delay + 48 bits of jam signal)."

I guess it's straightforward here. If you say $1$ bit is enough to identify a collision, then why use $48$ bits? Whatever the reason, isn't it then necessary to receive the complete $48$ bits?
@Arjun Sir, is it right to assume that the transmission delay of the jamming signal needs to be included because the sender needs the whole frame to know whether it is an ACK frame from the receiver or the jamming signal? That way, the longest frame time is twice the propagation delay plus the transmission delay of the jamming signal. If the transmission time of the jamming signal is negligible, or we are told not to consider it, then we take the longest frame time to be twice the propagation delay only. Right?
@Arjun Sir
Yes, it is a fact that $1$ bit is enough to identify a collision. But that does not mean we do not add the jamming-signal time.
Why do we need to add the jamming time? When B detects the collision, B transmits the jamming signal, and the time we are adding is the transmission time of the jamming signal at B's side, not the time to identify the jamming signal at A's side.

One more good question about this concept:
(Problem 2. Suppose nodes A and B are on the same 10Mbps....)
http://web.eecs.umich.edu/~zmao/eecs489/MT2/mt2.html
Solution of this question : http://web.eecs.umich.edu/~zmao/eecs489/MT2/mt2reviewHints.pdf
Well explained, @Arjun sir :)

@Arjun sir, what happens if the collision happens midway between A and B? Then who will send the jam signal?

@Arjun sir, why are you including the jam bits? We will definitely pad $48$ bits to make it $512$ bits, but those $48$ bits shouldn't be jam bits. "Whenever there is a collision, all stations stop transmitting the frame and send a jamming signal to stop further transmissions": this means jam bits are not part of the Ethernet frame; they come into the picture only after a collision takes place. The answer is $512$, but I think we shouldn't say we are adding $48$ bits of jam bits here.
But round-trip time is the time the signal needs to come back to the sender, and once the signal is back, it is obvious that a collision has occurred. Then why do we need to consider the jamming time as well? I am not clear on this technique. Can anyone explain it?

But sir, the jamming signal doesn't affect the minimum packet size.

@Bad_Doctor

https://superuser.com/questions/264171/collisions-in-csma-cd-ethernet

Read this; it should clear up the concept.
But it is mentioned in Forouzan that the minimum frame transmission time is at least two times the propagation delay. So is that a partial statement?

Because looking at the discussion, it seems the minimum frame transmission time is at least two times the propagation delay $+$ the time to transmit the jamming signal.

@Arjun sir, please confirm.

@rahul

frame transmission time is at least two times the propagation delay $+$ time to transmit the jamming signal

I have seen this only in this question; in all the other questions,

frame transmission time is at least two times the propagation delay

But in those questions there is no explicit mention of a jamming signal.

@Arjun sir, here I am unable to understand why $96$ is added for A to start transmission...

Must read the link shared by @VS Loyal to clear doubts.

 

Try to read the highlighted portion.

@Shaik Masthan

What if the collision happens midway? Then who will send the jam signal?

Is the jam signal the only way for the sender and the receiver to come to know about a collision, even if the collision happens midway?

https://en.wikipedia.org/wiki/Carrier-sense_multiple_access_with_collision_detection#Jam_signal

http://www.thenetworkencyclopedia.com/entry/jam-signal/

From these links, I understood that a jam signal should be transmitted to indicate that a collision has happened.

So,

$T_t \ge 2T_p + T_{trans\ of\ jam}$

@Shaik Masthan, suppose the collision occurred at the end (the start of the other sender, B). Then we have to consider the propagation delay of sender A plus the transmission and propagation delay of the jamming signal ($T_p + T_j$), right?

Also, if the collision happens at the center, shouldn't this be half of $T_p + T_j$?

@Hemanth_13

@Shaik Masthan

If the collision happens in between, then who will send the jam signal? The one who detects the collision first, right?

But in that case both will detect the collision at the same time, so why does the jam signal transmission time need to be included in that case?

"If a collision is heard, both of the senders will send a jam signal over the Ethernet. This jam signal indicates to all other devices on the Ethernet segment that there has been a collision, and they should not send data onto the wire. (A second indication of a collision is the noise created by the collision itself.) "

Ref link: http://www.mcmcse.com/cisco/guides/csma.shtml

"If collision happens in between then who will send jam signal ??..the one who detect collision first right ?"

You are right both will transmit the jamming signal.

"But in that case both will detect collision at same time so why jam signal transmission time  need to included in that case ?"

This is because all the stations might not be in b/w the two senders if there were any stations which are not in b/w these sender they should also be known about the collision so for this reason the jamming should definitely sent.

0
0

"A second indication of a collision is the noise created by the collision itself."

This means if the collision happens in between (midway), it is detected by the NOISE on the channel, right?

"This is because all the stations might not be between the two senders; any stations that are not between them should also come to know about the collision, so for this reason the jamming signal should definitely be sent."

I agree with this, but we actually require $T_t >$ the time for the sender to detect the collision, and since that detection happens because of the NOISE, as you said, why include the jam signal transmission within $T_t$? The sender comes to know about the collision from the noise; we need $T_t$ to cover only that much. The others may not be informed yet.

In the worst case taken in the solution, the NOISE won't be on the channel for a long time: suppose B sends $1$ bit, detects the noise, and stops transmitting. Then A will not detect the noise, and hence the collision is detected by A only after it receives the JAM signal sent by B.

Please correct me if I am wrong.
Aren't "time for the jam signal to reach the sender" and "transmission time for the jam signal" the one and same thing ?
0
0

"time for the jam signal to reach the sender" is the propogation time for the jam signal

0
0
Your answer is wrong, because the jamming signal alerts all other stations not to send data. It is not used for acknowledgement.

“The purpose of this is to ensure that any other node which may currently be receiving a frame will receive the jam signal in place of the correct 32-bit MAC CRC, this causes the other receivers to discard the frame due to a CRC error.”

This quote is from Wikipedia; it says we should include the jamming signal completely, not partially, to detect that a collision has occurred: the complete jamming signal leads to a different CRC, which results in collision detection.

https://en.wikipedia.org/wiki/Carrier-sense_multiple_access_with_collision_detection#Jam_signal
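A minimal sketch of that CRC point in Python (the payload and jam pattern are made up for illustration; real Ethernet hardware does this check, not software):

```python
import zlib

# A receiver accepts a frame only if its trailing CRC-32 matches the payload.
payload = b"example ethernet payload"                      # hypothetical data
frame = payload + zlib.crc32(payload).to_bytes(4, "big")   # payload + CRC

# A jam signal arrives in place of the correct CRC, so the check fails
# and the receiver discards the frame.
jammed = frame[:-4] + b"\xaa\xaa\xaa\xaa"  # 32 bits of jam pattern

def crc_ok(f: bytes) -> bool:
    return zlib.crc32(f[:-4]) == int.from_bytes(f[-4:], "big")

print(crc_ok(frame))   # True  -> frame accepted
print(crc_ok(jammed))  # False -> frame dropped due to CRC error
```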


That's why the minimum frame size is $64$ bytes ($512$ bits).

@Sachin Mittal 1, that's a wrong way of thinking... The sender may not detect the collision in the very first bit; the jamming signal must be on the medium long enough for the collision to be detected. Practically, the sender can detect the collision within $48$ bit times: it may be at $2$ bit times, or $3$ bit times, or $28$ bit times. You can't guarantee that the collision will happen at the very first bit.
20 votes

We know,

$\text{Frame size} \ge \text{Bandwidth} \times 2 \times \text{Propagation delay}$

Since the given $46.4\ \mu s$ is already the round-trip (i.e. twice the one-way) propagation delay,

$\text{Frame size} \ge 10 \times 10^{6} \times 46.4 \times 10^{-6} = 464 \text{ bits}$

But the question mentions that a $48$-bit jamming signal is present.

Hence,

$\text{Frame size}_{\min} = 464 + 48 = 512 \text{ bits}$

Hence the correct option is D.

2 votes
Ans: (C)

4 Comments

Are the transmission time for the frame and for the jamming signal the same? You have calculated the transmission time for $48$ bits where we needed to do it for the frame.
If we consider a scenario in which A and B start transmitting data packets at the same time, then obviously the collision will happen at the middle, i.e., after half of the $T_p$ (propagation delay) time. Then the collision signal will move to both A and B, and when they detect the collision after another half of $T_p$, both will send a jamming signal to inform the other stations on the Ethernet about the collision. So how can A and B depend on the jamming signal to detect the collision?
@Arjun sir, the noise created by the collision will reach the sender in $T_p$ time, so why does it need to wait for the complete jamming signal when it can detect the collision from the noise itself?
When the collision happens midway, both the sender and the receiver come to know about the collision from the noise and then send the jamming signal. Why can't we apply the same logic here?
2 votes
$\text{Transmission delay} \ge 2 \times \text{Propagation delay}$

$\dfrac{L}{\text{Bandwidth}} = 46.4\ \mu s$

$L = 46.4\ \mu s \times 10\text{ Mbps} = 464 \text{ bits}$

Note: use of the jamming signal. If a collision is heard, both of the senders will send a jam signal over the Ethernet. This jam signal indicates to all other devices on the Ethernet segment that there has been a collision, and they should not send data onto the wire. After sending the jam signal, each of the senders will wait a random amount of time (decided by the backoff algorithm) before beginning the entire process over. The random time helps to ensure that the two devices don't transmit simultaneously again.
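A minimal sketch of that backoff rule in Python (the slot time equals the $512$-bit minimum frame time derived above; the function name is my own):

```python
import random

SLOT_TIME = 51.2e-6  # one contention slot on 10 Mbps Ethernet = 512 bit times

def backoff_delay(collision_count: int) -> float:
    """Binary exponential backoff: wait a random number of slot times
    in [0, 2^k - 1], where k is the collision count capped at 10."""
    k = min(collision_count, 10)
    return random.randint(0, 2**k - 1) * SLOT_TIME

# Example: after a 3rd consecutive collision, a station waits between
# 0 and 7 slot times before retrying.
print(backoff_delay(3))
```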