A TCP machine is sending full windows of 65,535 bytes over a 1-Gbps channel that
has a 10-msec one-way delay. What is the maximum throughput achievable? What is
the line efficiency?

1 Answer

One-way delay = 10 ms, so RTT = 20 ms.

In 1 s the channel can carry 1 Gb of data = $10^9$ bits.

In 20 ms (one RTT) it can carry $\frac{20 \times 10^9}{10^3} = 20 \times 10^6$ bits = 20 Mb of data.

But we send only 65,535 bytes of data per RTT [1 full window transmitted in 1 RTT].

Maximum throughput $= \frac{\text{window size}}{\text{RTT}} = \frac{65535 \times 8 \text{ bits}}{20 \text{ ms}} \approx 26.2$ Mbps

Efficiency $= \frac{65535 \times 8}{20 \times 10^6} \times 100\% \approx 2.62\%$
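For anyone who wants to plug in different numbers, here is a minimal Python sketch of the same calculation, assuming the figures from the question (the variable names are just illustrative):

```python
window_bytes = 65_535          # one full TCP window sent per RTT
bandwidth_bps = 1e9            # 1 Gbps channel
one_way_delay_s = 10e-3        # 10 ms one-way delay
rtt_s = 2 * one_way_delay_s    # 20 ms round-trip time

window_bits = window_bytes * 8

# Maximum throughput: one window delivered per RTT
throughput_bps = window_bits / rtt_s

# Line efficiency: bits actually sent per RTT vs. what the link could carry
capacity_per_rtt_bits = bandwidth_bps * rtt_s
efficiency = window_bits / capacity_per_rtt_bits

print(f"Maximum throughput: {throughput_bps / 1e6:.2f} Mbps")  # ~26.21 Mbps
print(f"Line efficiency:    {efficiency * 100:.2f} %")         # ~2.62 %
```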
