
Suppose 1000 clients are trying to download a 10 MB file from a server that has a 100 Mbps access link. The clients have a 2 Mbps downstream access rate. What is the time taken by the server to send the file to all clients? (Ignore TCP connection setup time and assume ideal conditions.) (Marks: -0.33)

  1.   5 sec
  2.   40 sec
  3.   800 sec
  4.   0.8 sec

--------------------------------------------------------------------------------------------------------------

Total number of bits sent by the server = 10 M × 8 × 1000 = 80 gigabits,
so transmission time = 80 Gb / (100 - 2) Mbps.

or

if we send over all the links at the same time, then the transmission time will be 10 MB / bandwidth.

Should the bandwidth used be 2 Mbps or (100 - 2) Mbps?
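
A quick numeric check of the aggregate view, as a Python sketch (the variable names are mine; M is taken as $10^{6}$, and the server's 100 Mbps access link is treated as the shared bottleneck, matching the accepted answer below):

```python
# Aggregate view: the server must push 1000 copies of the 10 MB file
# through its own 100 Mbps access link.
file_bits = 10 * 10**6 * 8        # 10 MB in bits (M assumed to be 10**6)
clients = 1000
server_link_bps = 100 * 10**6     # server access link rate, bits per second

total_bits = file_bits * clients          # 80 gigabits, as computed above
print(total_bits / server_link_bps)       # 800.0 seconds
```

Dividing the 80 gigabits by the full 100 Mbps link rate (rather than by (100 - 2) Mbps) reproduces the 800 sec derived in the answer below.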

1 Answer

Best answer

The total capacity of the server's access link is 100 Mbps.

The total number of clients is 1000.

If the entire bandwidth is divided equally among the clients, each client's share is 100 Mbps / 1000 = 0.1 Mbps.

Each client's downstream rate is 2 Mbps, but the server can offer only 0.1 Mbps to each client.

If the server sent to a client at 2 Mbps, packets would be lost, so it must send at the minimum of the two rates; only then do all packets safely reach the client.

So the effective bandwidth from the server to each client is 0.1 Mbps.

The file size to be downloaded is 10 MB.

Total time = 10 MB / 0.1 Mbps

$= \dfrac{10 \times 10^{6} \times 8}{0.1 \times 10^{6}}$

$= 800$ sec.

So the correct option is (3) 800 sec.
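
For completeness, a minimal sketch of the same per-client calculation (Python; M is taken as $10^{6}$ and the variable names are illustrative):

```python
# Per-client view: each client's rate is the minimum of its own downstream
# rate and its equal share of the server's access link.
file_bits = 10 * 10**6 * 8            # 10 MB in bits (M assumed to be 10**6)
server_link_bps = 100 * 10**6         # 100 Mbps server access link
client_down_bps = 2 * 10**6           # 2 Mbps client downstream rate
clients = 1000

per_client_bps = min(server_link_bps / clients, client_down_bps)  # 0.1 Mbps
print(per_client_bps / 10**6)         # 0.1
print(file_bits / per_client_bps)     # 800.0 seconds, option (3)
```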

