57 votes
Consider a simple communication system where multiple nodes are connected by a shared broadcast medium (like Ethernet or wireless). The nodes in the system use the following carrier-sense based medium access protocol. A node that receives a packet to transmit will carrier-sense the medium for $5$ units of time. If the node does not detect any other transmission, it starts transmitting its packet in the next time unit. If the node detects another transmission, it waits until this other transmission finishes, and then begins to carrier-sense for $5$ time units again. Once they start to transmit, nodes do not perform any collision detection and continue transmission even if a collision occurs. All transmissions last for $20$ units of time. Assume that the transmission signal travels at the speed of $10$ meters per unit time in the medium.

Assume that the system has two nodes $P$ and $Q$, located at a distance $d$ meters from each other. $P$ starts transmitting a packet at time $t=0$ after successfully completing its carrier-sense phase. Node $Q$ has a packet to transmit at time $t=0$ and begins to carrier-sense the medium.

The maximum distance $d$ (in meters, rounded to the closest integer) that allows $Q$ to successfully avoid a collision between its proposed transmission and $P$’s ongoing transmission is _______.

5 Answers

Best answer
90 votes
P starts transmission at $t=0$. If P's first bit reaches Q within Q's sensing window, then Q won't transmit and there shall be no collision.

Q senses the carrier till $t=5$; if it detects nothing, it starts its transmission at $t=6$.

If the first bit of P reaches Q by $t=5$, the collision can be averted. Since the signal speed is $10$ m per unit time (given), the maximum distance between P and Q is $5 \times 10 = 50$ meters.
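
A quick numeric check of this boundary (a sketch in Python; the $5$-unit sensing window and $10$ m per unit time speed are from the question, the test distances are arbitrary):

```python
# Sketch: does Q still hear P's first bit inside its 5-unit sensing window?
SPEED = 10          # meters per unit time (given)
SENSE_WINDOW = 5    # units of time Q spends carrier-sensing, starting at t = 0

for d in (40, 50, 60):                  # arbitrary test distances in meters
    arrival = d / SPEED                 # time at which P's first bit reaches Q
    q_defers = arrival <= SENSE_WINDOW  # Q backs off only if it senses P in time
    outcome = "Q defers (no collision)" if q_defers else "Q transmits (collision)"
    print(f"d = {d} m: first bit arrives at t = {arrival}, {outcome}")
```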
71 votes
Here the vulnerable time is $t_p$, where $t_p$ is the propagation delay.

P and Q are separated by a distance of $d$ meters. When P starts transmitting, its first bit takes $d/10$ time units to reach Q. If Q detects this first bit during its sensing phase, it can avoid a collision.

Hence the first bit should reach Q within 5 time units so that Q detects it.

Therefore, $\dfrac{d}{10} \leq 5$ units

$\Rightarrow d \leq 50$ metres

Hence the maximum distance is $50$ metres.
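
The same inequality can be checked mechanically; a minimal sketch (the sweep bound of $1000$ m is an arbitrary choice, not part of the question):

```python
# Sketch: largest integer d (in meters) satisfying d / speed <= sensing time
SPEED = 10       # meters per unit time
SENSE_TIME = 5   # units of time spent carrier-sensing

d_max = max(d for d in range(1001) if d / SPEED <= SENSE_TIME)
print(d_max)     # -> 50
```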
25 votes

I can model the given scenario as shown in the diagram below.

So, P is done with carrier sensing and begins transmitting at $t=0$. Now, Q will sense the medium for $5$ units of time, till $t=4$, and begin transmitting at $t=5$.

If the first bit of P's data reaches Q within this window, Q will refrain from sending its own data. If it doesn't reach Q by $t=4$, then at $t=5$ Q will start sending its own data.

This means we have only $5$ units of time during which we can wait for the first bit of P's signal to reach Q, and this waiting time is nothing but the one-way propagation delay ($T_p$).

Given the distance is $d$ meters and the velocity of the signal is $10$ meters per time unit, $T_p=\frac{d}{10}$ units of time.

Now, as you can see in the image, in the worst case we can wait for P's signal for at most $5$ units of time.

So, the maximum $T_p$ is $5$ units of time:

$5=\frac{d}{10}$

$d=50\,\text{m}$ (Answer)
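
The timeline above can also be written as a small function (a sketch; the inclusive boundary of $5$ time units follows the accepted answer of $50$ m, and the test value $50.1$ is arbitrary):

```python
SPEED = 10  # meters per unit time

def collision_occurs(d: float) -> bool:
    """P transmits from t = 0; Q carrier-senses for the first 5 time units.
    A collision happens only if P's first bit arrives after Q's sensing window."""
    t_p = d / SPEED   # one-way propagation delay P -> Q
    return t_p > 5    # Q misses P's signal and transmits into it

print(collision_occurs(50))    # False -> 50 m is still collision-free
print(collision_occurs(50.1))  # True  -> anything farther collides
```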

 

4 votes

Consider the following diagram for reference:

Therefore, the distance $d$ must be such that the packet from node $P$ reaches node $Q$ within $5 \text{ units of time}\ (\because 25 - 20 = 5)$.

Given, the speed of the transmission signal is $10 \text{ meters per unit time}$, i.e.

$1 \text{ unit time} \rightarrow 10 \text{ meters}$

$5 \text{ units of time} \rightarrow 5 \times 10 \text{ meters} = 50 \text{ meters}$

$\therefore$ The maximum distance is given by $d_{max} = 50 \text { meters }$
