+1 vote
319 views

Hi Guys,

In Ethernet, whenever a collision happens, a station sends a jamming signal. But why do these special signals not collide?
In many questions I saw they also have Tt and Tp. So what makes them collision free?

PS: Refer Problem 2 on http://web.eecs.umich.edu/~zmao/eecs489/MT2/mt2reviewHints.pdf

+3
From the point of collision, after the collision, jamming signals are sent back toward their respective sources. Hence they travel in opposite directions to each other, so they can't collide.
+4
Though I don't know the answer to this question, you've asked a really good question :D
0
It does not matter whether jamming signals collide or not; jamming signals are just high-energy signals transmitted to ensure that everyone is alerted that a collision has happened somewhere and that no one should start transmitting.
0
@Manu Thakur haha :p
0

@Ashwin Kulkarni ji, thanks for your effort.

travel in opposite direction to each other

Does this statement always guarantee that a collision will not happen, in all cases?

+1

From the point of collision, after the collision, jamming signals are sent back toward their respective sources. Hence they travel in opposite directions to each other, so they can't collide.

@Ashwin Kulkarni, I think your statement is incorrect.

At the point of collision we can't send a jamming signal, because the collision has not yet been detected. The collision is detected only when the garbled signal reaches a transmitting node; once the node senses the abnormal energy level, it sends the jamming signal.

The transmitting node that detects the collision sends the jam signal. It is sent in all directions; a signal can't be aimed at one particular node.

The purpose of the jamming signal is to indicate to all other devices on the Ethernet segment that there has been a collision and that they should not send data onto the wire.

The node detects the collision during its transmission; once the collision is detected, it aborts transmitting the data.

Jamming signals can collide, but this collision does not affect the nodes, because nobody is transmitting data at that time.
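The timing argument above can be sketched numerically. This is an illustrative toy model (the function and its names are my own, not from the thread): two stations A and B sit at opposite ends of a bus with end-to-end propagation delay tp, and each one detects the collision only when the other station's signal first reaches it.

```python
# Toy model (hypothetical helper, assumed two-station bus): when does each
# end station detect a collision, given when each one starts transmitting?

def detection_times(tp, start_a=0.0, start_b=0.0):
    """Return (t_detect_a, t_detect_b) for stations A and B at opposite
    ends of a bus with one-way propagation delay tp. A station detects
    the collision when the other station's signal first reaches it."""
    # A's signal reaches B at start_a + tp, and B's reaches A at
    # start_b + tp; each station senses the foreign energy (and hence
    # the collision) only at that instant.
    return (start_b + tp, start_a + tp)

# Both stations start at t = 0: the signals meet midway at tp/2, but
# each station only *detects* the collision at t = tp.
print(detection_times(tp=1.0))                  # -> (1.0, 1.0)

# B starts late, just before A's first bit arrives: as start_b
# approaches tp, A's detection time approaches 2*tp (the worst case).
print(detection_times(tp=1.0, start_b=0.75))    # -> (1.75, 1.0)
```

This matches the later comments in the thread: detection can happen as early as Tp, but in the worst case it takes close to 2Tp.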


0

@Aakash_ so in this question, should I include the jamming signal when finding the frame size? Please check:

https://gateoverflow.in/1397/gate2005-74

0

Meghna, yes, we have to include the jamming-signal transmission delay.

$T_t = 2T_p + T_{jam}$, where $T_t$ is the frame transmission delay, $T_p$ the propagation delay, and $T_{jam}$ the jam-signal transmission delay.

0

@Aakash_

As per your example, the collision is detected and then the jam signal is transmitted.

We know the sender can stop transmitting when it detects a collision, so why do you add the transmission time of the jam signal in the condition above?

0

When the collision is detected, the station stops transmitting the message and then transmits a jamming signal; that's why we are including the transmission time of the jamming signal.

0
Tt should be such that a colliding station will detect the collision within that time. If the collision happens midway, how much time will the colliding stations need to detect it?
0

If the collision happens midway, it'll be detected in Tp time, as shown in the image uploaded with this answer.

0
So if the collision is detected in Tp time, our Tt should be greater than the time in which a station detects the collision, so Tt >= Tp.
0

But in the worst case, the collision will be detected only after 2 × (propagation delay).

So Tt >= 2Tp + transmission time of the jamming signal.

0

But in the worst case, the collision will be detected only after 2 × (propagation delay).

So Tt >= 2Tp, right?

0

Yes, and if we are also considering the jamming signal, then

$T_t = 2T_p + T_{jam}$, i.e. frame transmission delay = 2 × propagation delay + jam-signal transmission delay.
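As a worked example of this final formula, here is a small sketch that converts the time condition Tt >= 2Tp + Tjam into a minimum frame size in bits. The numbers are assumed round values for illustration (10 Mbps link, 25.6 µs one-way delay, 48-bit jam signal), not the figures from the linked GATE question.

```python
# Illustrative sketch (assumed numbers, hypothetical helper): minimum
# frame size so a station is still transmitting when the worst-case
# collision news arrives, i.e. Tt >= 2*Tp + Tjam.

def min_frame_size_bits(bandwidth_bps, tp_seconds, jam_bits=0):
    """Minimum frame size in bits for Tt >= 2*Tp + Tjam, rounded to
    the nearest whole bit."""
    tjam = jam_bits / bandwidth_bps      # jam-signal transmission time
    tt = 2 * tp_seconds + tjam           # required frame transmission time
    return round(tt * bandwidth_bps)     # convert time back into bits

# 10 Mbps link, 25.6 us one-way propagation delay:
print(min_frame_size_bits(10_000_000, 25.6e-6))       # without jam -> 512
print(min_frame_size_bits(10_000_000, 25.6e-6, 48))   # with 48-bit jam -> 560
```

Without the jam signal this reproduces the classic 512-bit (64-byte) minimum for 10 Mbps Ethernet; including the jam-signal transmission time simply adds those extra bits on top.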