
Suppose the round trip propagation delay for a $10\text{ Mbps}$ Ethernet having
$48\text{-bit}$ jamming signal is $46.4\ \mu s$. The minimum frame size is:

1. $94$
2. $416$
3. $464$
4. $512$

The sender must be able to detect a collision before it finishes sending a frame.
So, the minimum frame length must be such that any collision is detected before
the frame completely leaves the sender.

Now, the worst case for collision detection is when the start of the frame is about
to reach the receiver and the receiver starts sending. A collision happens,
a jam signal is produced, and this signal must travel back to the sender.
So, the total time is the time for the start of the frame to reach the
receiver $+$ the time for the jam signal to reach the sender $+$
the transmission time for the jam signal.

(We don't need to include the transmission time for the frame, because the
receiver detects it as soon as the frame's first bit arrives.) Time for the
start of the frame to reach the receiver $+$ time for the jam signal to
reach the sender $=$ round-trip propagation delay $= 46.4\ \mu s$. So,

$46.4 + \dfrac{48}{10} = 51.2\ \mu s$ ($48$ bits at $10\text{ Mbps}$ takes $4.8\ \mu s$).

Now, the frame length must be such that its transmission time is
at least $51.2\ \mu s$.

So, minimum frame length$= 51.2\times 10^{-6}\times 10\times 10^6=512\text{ bits}$.
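The arithmetic above can be checked with a quick sketch (values are from the question; the variable names are my own):

```python
# Minimum frame size when the 48-bit jam signal is counted.
bandwidth = 10e6      # 10 Mbps, in bits per second
rtt = 46.4e-6         # round-trip propagation delay, in seconds
jam_bits = 48         # jamming-signal length

jam_time = jam_bits / bandwidth      # 4.8 us to transmit the jam signal
total_time = rtt + jam_time          # 51.2 us the sender must keep transmitting

min_frame_bits = total_time * bandwidth
print(round(min_frame_bits))         # 512
```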

http://gatecse.in/w/images/3/32/3-MACSublayer.ppt

A reference question from Peterson and Davie:

43.  Suppose the round-trip propagation delay for Ethernet is $46.4\ \mu s$. This
yields a minimum packet size of 512 bits (464 bits corresponding to
propagation delay + 48 bits of jam signal).

(a)   What happens to the minimum packet size if the delay time is held
constant, and the signaling rate rises to 100 Mbps?

(b)   What are the drawbacks to so large a minimum packet size?

(c)   If compatibility were not an issue, how might the specifications be written
so as to permit a smaller minimum packet size?
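For part (a), a rough check using the same formula as the answer above (delay bits plus jam bits, holding the $46.4\ \mu s$ delay constant; this is my own sketch, not the textbook's worked solution):

```python
rtt = 46.4e-6   # round-trip delay, held constant
jam_bits = 48

for rate in (10e6, 100e6):
    # Minimum size = bits "in flight" during the RTT + the jam signal.
    min_bits = round(rtt * rate) + jam_bits
    print(f"{rate / 1e6:.0f} Mbps -> {min_bits} bits")
# 10 Mbps  -> 512 bits
# 100 Mbps -> 4688 bits
```

So raising the rate tenfold (at constant delay) multiplies the propagation-delay component tenfold, which is why a large minimum packet size becomes a drawback (part b).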

Another reference that requires the jam-signal bits to be included in the minimum frame size:

http://intronetworks.cs.luc.edu/current/html/ethernet.html

Can the collision be detected by the source without receiving the full jam signal (by a change in current)?

Probably yes. But to be safe (against signal loss) the source waits for the entire jam signal. See the link below:

http://superuser.com/questions/264171/collisions-in-csma-cd-ethernet

Why is the jam signal divided by 10?
48 bits are transmitted at 10 Mbps, so it takes 4.8 microseconds.
Why are you including the jamming signal in the computation? Your reference does not completely support your hypothesis.

Page 21 of the PPT you gave us says: "The longest time between starting to transmit a frame and receiving the first bit of a jam sequence is twice the propagation delay from one end of the cable to the other. This means that a frame must have enough bits to last twice the propagation delay."

So we only need the first bit to arrive at the other end to detect the collision. Why add the entire transmission delay of the jamming signal? I don't understand!

Yes, you are correct as per that PPT. But this is given on page 162 here. I'm not sure of the reason behind it.

http://www.pdfiles.com/pdf/files/English/Networking/Computer_Networks_A_Systems_Approach.pdf
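The two readings differ only by the jam-signal transmission time (a quick numeric comparison of my own, using the question's values):

```python
rate = 10e6     # 10 Mbps
rtt = 46.4e-6   # round-trip propagation delay, seconds
jam_bits = 48

print(round(rtt * rate))             # 464: RTT only (the PPT's rule)
print(round(rtt * rate) + jam_bits)  # 512: RTT plus the 48-bit jam signal
```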

I wonder how a question can be a reference for something. In a question we might assume anything!
@Akash, so should we consider the jamming-signal time or not? What do you say?
@Akash See the question:

"Suppose the round-trip propagation delay for Ethernet is 46.4 µs. This yields a minimum packet size of 512 bits (464 bits corresponding to propagation delay + 48 bits of jam signal)."

I guess it's straightforward here. If you say 1 bit is enough to identify a collision, then why use 48 bits? Whatever the reason may be, isn't it necessary to receive the complete 48 bits then?
@Arjun Sir, is it right to assume that the transmission delay of the jamming signal needs to be included, since the sender needs to get the whole frame to know whether it is an ACK frame from the receiver, or the jamming signal. That way, we have the longest frame time to be twice the propagation delay plus the transmission delay of the jamming signal frame. If the transmission time of the jamming signal is negligible, or given to not be considered, then we take the longest frame time to be twice the propagation delay only. Right?
@Arjun Sir
Yes, it is a fact that 1 bit is enough to identify a collision. But that does not mean we should not add the jamming-signal time.
Why do we need to add the jamming time? When B detects the collision, B transmits the jamming signal, and the time we are adding is the transmission time of the jamming signal at side B, not the time to identify the jamming signal at side A.

(Problem 2. Suppose nodes A and B are on the same 10Mbps....)
http://web.eecs.umich.edu/~zmao/eecs489/MT2/mt2.html
Solution of this question : http://web.eecs.umich.edu/~zmao/eecs489/MT2/mt2reviewHints.pdf
Well explained, @Arjun sir :)
@Arjun sir, what happens if the collision happens midway between A and B? Then who will send the jam signal?

@Arjun sir, why are you including the jam bits? We will definitely pad 48 bits to make it 512 bits, but these 48 bits shouldn't be jam bits. "Whenever there is a collision, all stations stop transmitting the frame and send a jamming signal to stop further transmissions." This means the jam bits are not part of the Ethernet frame; they come into the picture only after a collision takes place. The answer is 512, but I think we shouldn't say we are adding 48 bits of jam bits here.

For God's sake, leave Arjun sir alone on this question :P
But the round-trip time is the time the signal needs to come back to the sender, and when the signal comes back to the sender it is obvious that the collision has occurred. Then why do we need to consider the jamming time as well? I am not clear on this technique. Can anyone explain it?
But sir, the jamming signal doesn't affect the minimum packet size.

https://superuser.com/questions/264171/collisions-in-csma-cd-ethernet

Read this; it should clear up the concept.

But it is mentioned in Forouzan that the minimum frame transmission time is at least two times the propagation delay. So is that a partial statement?

Because looking at the discussion, it seems the minimum frame transmission time is at least two times the propagation delay plus the time to transmit the jamming signal.

@rahul

frame transmission time is at least two times the propagation delay + time to transmit the jamming signal

I have seen this only in this question; all other questions use

frame transmission time is at least two times the propagation delay

But in those questions there is no explicit mention of a jamming signal.

Ans: $512$

See here

We have to count the time for jamming signal also. So, it should be 512.

@Arjun sir, but jamming signals and data propagate together in the cable, no? Why take them separately? Only if the jamming-signal time is greater than the collision-detection time do we have to consider them separately, no?
First the frame goes from A -> B. The collision happens at B, and the jamming signal is sent by B.

So, total time = 2 × propagation delay: one for the frame and one for the jam signal.

This is the time for the first bit of the jamming signal to reach A. For A to identify the collision, all 48 bits must reach it, and this time is the transmission time.
Arjun sir, why is the sender looking at all 48 bits of the jam signal? To identify the collision, the jamming signal only needs to reach the sender (which means it is enough to transmit a packet for a minimum of 2 × propagation delay).
How can it know that it is the jamming signal unless the whole 48 bits are received? To identify something as 'x', the whole 'x' must be received, unless there is some identification mechanism, which isn't mentioned in the question.
Arjun sir, there is a confusion in my mind: when the frame goes from sender A to receiver B and the collision happens at B, the energy level in the channel at B is high and B immediately senses that a collision has occurred. After that, B sends a jam signal (i.e., 48 bits), and the jam signal also has a different energy level. So whenever A senses this energy level, it should immediately know that it is a jam signal.

Sir, why does A need to sense all 48 bits? Why doesn't it know immediately, just by sensing the energy level, like receiver B does?
How can the energy level be sensed by A?

We are given that the jamming signal is 48 bits. So we can have a case where 47 bits are the same as the jamming signal but are part of valid data.
@Arjun, the issue here is that valid data cannot arrive on the channel while we are transmitting; while sending data, the channel should sound idle to the sender, I think.

Because nobody else is sending data other than the sender at that moment, even 1 bit coming from the other direction means that a collision has happened!
Are the transmission times for the frame and the jamming signal the same? You have calculated the transmission time for the 48 bits where we needed to do it for the frame.
If we consider a scenario in which A and B start transmitting data packets at the same time, then obviously the collision will happen in the middle, after half of Tp (propagation delay). The collision signal will then move to both A and B, and when they detect the collision after another half of Tp, both will send a jamming signal to inform the other stations on the Ethernet about the collision. So how can A and B depend on the jamming signal to detect the collision?
Hey, first read the question. I don't know what you are all trying to do, but the simple solution is this:

To detect a collision on a 10 Mbps Ethernet (the general case), the minimum frame size must be 64 bytes (512 bits) and the maximum is 1518 bytes (12144 bits), theoretically.

For successful detection of the collision, it should be the case that:

\begin{align*} \text{Transmission time} &\geq \text{RTT}\\ \frac{f+48}{10\times 10^6} &\geq 46.4\times 10^{-6}\\ f+48 &\geq 464\\ f &\geq 416 \end{align*}

Minimum packet size = 416 bits.
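Numerically, this answer's interpretation (frame plus jam signal together spanning the RTT) works out as follows (a sketch of that inequality, not of the accepted answer above):

```python
bandwidth = 10e6   # 10 Mbps
rtt = 46.4e-6      # round-trip propagation delay, seconds
jam_bits = 48

# (f + jam_bits) / bandwidth >= rtt  =>  f >= rtt * bandwidth - jam_bits
f_min = rtt * bandwidth - jam_bits
print(round(f_min))   # 416
```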