A TCP machine is sending windows of 65,535 bytes over a 1-Gbps channel that has a 10-msec one-way delay. What is the maximum throughput achievable? What is the line efficiency?
Answer:
One window can be sent every 20 msec: the data takes 10 msec to reach the receiver, and the sender must then wait another 10 msec for the acknowledgement to travel back before it can send the next window. This gives 50 windows/sec, for a maximum throughput of 65,535 × 50 = 3,276,750 bytes/sec (about 26.2 Mbps). The line efficiency is then 26.2 Mbps / 1000 Mbps ≈ 2.6%.
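The arithmetic above can be checked with a short calculation. This is only a sketch of the answer's reasoning; the constant names are illustrative, not from the original problem:

```python
# Window-limited TCP throughput over a long fat pipe:
# the sender emits one full window, then idles until the ACK returns.
WINDOW_BYTES = 65_535
LINK_BPS = 1_000_000_000        # 1 Gbps channel
ONE_WAY_DELAY_S = 0.010         # 10 msec in each direction

rtt_s = 2 * ONE_WAY_DELAY_S     # data out + ACK back = 20 msec
windows_per_sec = 1 / rtt_s     # 50 windows/sec
throughput_bytes = WINDOW_BYTES * windows_per_sec   # ~3.28 million bytes/sec
throughput_bps = throughput_bytes * 8               # ~26.2 Mbps
efficiency = throughput_bps / LINK_BPS              # ~2.6%

print(f"throughput: {throughput_bytes:,.0f} bytes/sec "
      f"({throughput_bps / 1e6:.1f} Mbps), efficiency {efficiency:.1%}")
```

Note that the transmission time of one window (65,535 × 8 / 10⁹ ≈ 0.5 msec) is small compared with the 20-msec round trip, which is why the answer ignores it.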
But the answer takes 20 msec as the RTT, while the question gives a 10-msec one-way delay. Why is the RTT used here? Why can't we use the one-way delay itself?