5 votes

A TCP machine is sending windows of 65,535 bytes over a 1-Gbps channel that has a 10-msec one-way delay. What is the maximum throughput achievable? What is the line efficiency?

 

Answer:

One window can be sent every 20 msec. This gives 50 windows/sec, for a maximum data rate of 65,535 × 50 ≈ 3.28 million bytes/sec. The line efficiency is then 26.2 Mbps/1000 Mbps, or about 2.6%.
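A quick Python check of the arithmetic above (the variable names are mine, not from the book): one 65,535-byte window goes out per 20-msec round trip.

```python
WINDOW_BYTES = 65_535
RTT_SEC = 2 * 0.010                      # 10-msec one-way delay, each direction

windows_per_sec = 1 / RTT_SEC            # 50 windows/sec
throughput_bytes = WINDOW_BYTES * windows_per_sec   # ~3.28 million bytes/sec
throughput_bits = throughput_bytes * 8              # ~26.2 Mbps
efficiency = throughput_bits / 1e9                  # ~2.6% of the 1-Gbps line

print(f"{throughput_bytes:,.0f} bytes/sec, "
      f"{throughput_bits / 1e6:.1f} Mbps, "
      f"efficiency {efficiency:.1%}")
```

Running it prints 3,276,750 bytes/sec, which rounds to the book's "about 3.3 million bytes/sec".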

 

But they have taken 20 msec as the RTT, while the question gives a 10-msec one-way delay. Why did they use the RTT? Why can't we use the one-way delay itself?

2 Answers

Best answer
9 votes
Given: L = 65,535 × 8 bits (one window)

BW = 1 × 10⁹ bps, RTT = 2 × 10 msec = 20 msec

The RTT is used because the sender transmits one full window and then must wait for the acknowledgment before it can send the next: 10 msec for the data to arrive plus 10 msec for the ACK to come back. That is why the one-way delay alone is not enough.

Bandwidth-delay product = 1 × 10⁹ × 20 × 10⁻³ = 20 × 10⁶ bits

But only 65,535 × 8 bits can actually be sent per RTT, so

efficiency = (65,535 × 8) / (20 × 10⁶) ≈ 2.6%

throughput = efficiency × BW
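The same bandwidth-delay-product calculation as a short Python sketch (variable names are my own):

```python
L = 65_535 * 8        # bits in one window
BW = 1e9              # line rate, bits/sec
RTT = 20e-3           # round-trip time, sec

bdp = BW * RTT        # bits the link could carry in one RTT: 2.0e7
efficiency = L / bdp  # fraction of the pipe actually filled, ~2.6%
throughput = efficiency * BW            # ~26.2 Mbps

print(f"BDP = {bdp:.0f} bits, efficiency = {efficiency:.1%}, "
      f"throughput = {throughput / 1e6:.1f} Mbps")
```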
2 votes

Find the throughput, then divide it by the bandwidth to get the efficiency.