Two hosts are connected via a packet switch with $2\times 10^7$ bits per second links. Each link has a propagation delay of $30$ microseconds. The switch begins forwarding a packet $40$ microseconds after it receives it. If $13000$ bits of data are to be transmitted between the two hosts using a packet size of $6000$ bits, the time elapsed between the transmission of the first bit of data and the reception of the last bit of the data, in microseconds, is ___________

1 Answer

For $13000$ bits of data using a packet size of $6000$ bits we need $\left\lceil \dfrac{13000}{6000} \right\rceil = 3$ packets, where the last packet is of size $1000$ bits and the other two are of size $6000$ bits each.

The switch starts transmitting the last packet as soon as it finishes transmitting the earlier packets (without waiting for them to reach the destination). By that time the last packet has already arrived at the switch and been processed, so the switch's output link is the bottleneck.

So, total time = transmission time of the first packet (at the host) + propagation delay of the first link + switch delay + sum of the transmission times of all but the last packet (out of the switch) + transmission time of the last packet (out of the switch) + propagation delay of the second link.
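As a quick check, this formula can be evaluated directly. The following is a minimal Python sketch; the constant names, the `tx_time_us` helper, and the packet list are my own choices for illustration, not part of the original question:

```python
# Direct evaluation of the total-time formula above.
# All names here are my own; values come from the question.

LINK_RATE = 2 * 10**7      # bits per second, both links
PROP_DELAY = 30            # microseconds per link
SWITCH_DELAY = 40          # microseconds before the switch forwards

def tx_time_us(bits):
    """Transmission time of a packet in microseconds."""
    return bits / LINK_RATE * 10**6

packets = [6000, 6000, 1000]   # 13000 bits split into 6000-bit packets

total = (tx_time_us(packets[0])                 # first packet at the host
         + PROP_DELAY                           # first link
         + SWITCH_DELAY                         # switch delay
         + sum(tx_time_us(p) for p in packets)  # out of the switch
                                                # (all but last + last)
         + PROP_DELAY)                          # second link
print(total)   # 1050.0
```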

Transmission time for $6000$ bits packet $ = \dfrac{6000}{2\times 10^7} = 300\; \mu s.$

Transmission time for $1000$ bits packet $ = 300/6 = 50 \;\mu s.$

So, total time $ = 300+30+40+ 2\times  300 +50+ 30  = 1050\; \mu s. $
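The pipelining argument can also be cross-checked with a small per-packet timeline simulation. This is a sketch under my own assumptions (store-and-forward switch, FIFO output queue, packets sent back-to-back at the host); none of these names come from the original answer:

```python
# Cross-check: simulate each packet's timeline through the switch.

LINK_RATE = 2 * 10**7              # bits per second
PROP, PROC = 30, 40                # per-link propagation and switch delay, in us

def tx(bits):
    return bits / LINK_RATE * 1e6  # transmission time in microseconds

host_done = switch_free = last_bit = 0.0
for bits in [6000, 6000, 1000]:
    host_done += tx(bits)                  # host sends packets back-to-back
    ready = host_done + PROP + PROC        # packet fully received and processed
    start = max(ready, switch_free)        # wait if the output link is busy
    switch_free = start + tx(bits)         # switch finishes forwarding
    last_bit = switch_free + PROP          # last bit reaches the destination

print(last_bit)   # 1050.0
```

In this timeline the last packet is ready at the switch at $720\;\mu s$ but has to wait until $970\;\mu s$ for the output link to free up, which is exactly why the transmission times out of the switch dominate the total.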