Consider a $100 \; \text{Mbps}$ link between an earth station (sender) and a satellite (receiver) at an altitude of $2100 \; \text{km}.$ The signal propagates at a speed of $3 \times 10^{8} \; \text{m/s.}$ The time taken $\text{(in milliseconds,} \; \textit{rounded off to two decimal places})$ for the receiver to completely receive a packet of $1000 \; \text{bytes}$ transmitted by the sender is _______________.

### 1 comment

The answer mentioned in the rank predictor application for this question needs to be updated; the rank predictor currently lists the answer as 7.0-7.1.

Given,

Bandwidth (Bw) $= 100$ Mbps

distance (d) $= 2100$ Km $= 2100 \times 10^3$ m

velocity (v) $= 3 \times 10^8$ m/s

packet size (L) = $1000$ bytes $= 8000$ bits

Here, the sender is the earth station and the receiver is the satellite itself, so the signal makes only one one-way trip.

So, Total time $=$ Transmission time $+$ Propagation time

Transmission time $= \frac{L}{Bw} = \frac {8000\ b}{100 \times 10^6\ b/sec} = 0.08$ ms

Propagation time $= \frac{d}{v} = \frac {2100 \times 10^3\ m}{3 \times 10^8 \ m/sec} = 7$ ms

$\therefore$ Total time $= 0.08\ + \ 7 = 7.08$ ms
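The calculation above can be checked with a short script (the variable names are mine, not from the question):

```python
# Time for the receiver to completely receive the packet:
# transmission time (pushing all bits onto the link) + one-way propagation delay.

BANDWIDTH_BPS = 100e6    # link rate: 100 Mbps
DISTANCE_M = 2100e3      # earth station to satellite: 2100 km
SPEED_MPS = 3e8          # signal propagation speed
PACKET_BITS = 1000 * 8   # 1000-byte packet

transmission_ms = PACKET_BITS / BANDWIDTH_BPS * 1000   # 0.08 ms
propagation_ms = DISTANCE_M / SPEED_MPS * 1000         # 7 ms
total_ms = transmission_ms + propagation_ms

print(round(total_ms, 2))  # 7.08
```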

Why are we not adding the time from sender to satellite plus satellite to receiver, i.e. $0.08 + 7 + 7 = 14.08$? Because the satellite itself is the receiver here; there is no second ground station.

I made the same mistake... got 14.08 by assuming the receiver was another station on earth. Always read the question carefully, twice. This will cost me GATE score due to negative marking.

Made the same mistake :/
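A quick sketch of the two readings discussed above (using the same link parameters as the question), showing why the misreading yields 14.08 ms while the actual question, with the satellite as the receiver, yields 7.08 ms:

```python
transmission_ms = (1000 * 8) / 100e6 * 1000   # 0.08 ms to push 8000 bits at 100 Mbps
propagation_ms = 2100e3 / 3e8 * 1000          # 7 ms one-way over 2100 km at 3e8 m/s

# This question: the satellite is the receiver -> one propagation delay.
one_hop = transmission_ms + propagation_ms
print(round(one_hop, 2))   # 7.08

# Misreading: satellite acts as a pure repeater to a second ground
# station -> two propagation delays get added.
two_hops = transmission_ms + 2 * propagation_ms
print(round(two_hops, 2))  # 14.08
```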
Size of the packet $= 1000$ bytes $= 1000 \times 8 = 8000$ bits

Distance between sender and receiver $= 2100$ km $= 2100 \times 10^3$ m

Signal propagation speed $= 3 \times 10^{8}$ m/s

Bandwidth $= 100$ Mbps

Transmission time $=$ packet size $/$ bandwidth

$= \frac{8000}{100 \times 10^{6}} = 0.08$ ms

Propagation time $=$ distance between sender and receiver $/$ signal propagation speed

$= \frac{2100 \times 10^{3}}{3 \times 10^{8}}$

$= 7$ ms

The total time taken by the receiver to receive the packet completely is transmission time $+$ propagation time

$= 7 + 0.08 = 7.08$ ms

So the answer is $7.08$ ms.