For a host machine that uses the token bucket algorithm for congestion control, the token bucket has a capacity of 10 megabytes and the maximum output rate is 20 megabytes per second. Tokens arrive at a rate that sustains an output of 10 megabytes per second. The token bucket is currently full and the machine needs to send 120 megabytes of data. The minimum time required to transmit the data is ______________ seconds.
The above question is from the Made Easy test series, and according to them the answer is 12 seconds.
But I am getting 11 seconds.
My solution:
Capacity = 10 megabytes
Maximum output rate = 20 megabytes per second
Rate at which tokens are arriving = 10 megabytes per second.
Formula used: C + R·t = M·t (C = bucket capacity, R = token arrival rate, t = burst time, M = maximum output rate)
Burst time = 10 / (20 − 10) = 1 second
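Written out, this is just the balance equation solved for t:

$$C + Rt = Mt \;\Longrightarrow\; t = \frac{C}{M - R} = \frac{10}{20 - 10} = 1 \text{ second}$$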
So for 1 second the system can send at the full speed of 20 megabytes per second, and after that it drops to 10 megabytes per second (the token arrival rate).
In that 1 second, at a speed of 20 MBps, 20 megabytes will be sent.
Remaining: 120 − 20 = 100 megabytes
That will take 100 megabytes / 10 MBps = 10 seconds
So the total time = 1 + 10 = 11 seconds
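To double-check the arithmetic, here is a minimal discrete-time simulation of this token bucket in Python. The function name, parameter names, and step size are my own choices for illustration; this is only a sanity check of the calculation, not a reference implementation:

```python
# Minimal discrete-time simulation of the token bucket described above.
# All names here are hypothetical -- this just sanity-checks the arithmetic.

def time_to_send(data_mb, capacity_mb, token_rate_mbps, max_rate_mbps, dt=0.001):
    tokens = capacity_mb          # bucket starts full
    sent = 0.0                    # megabytes transmitted so far
    t = 0.0                       # elapsed time in seconds
    while sent < data_mb:
        # tokens accumulate at the arrival rate, capped at bucket capacity
        tokens = min(capacity_mb, tokens + token_rate_mbps * dt)
        # output is limited by the line rate, available tokens, and remaining data
        out = min(max_rate_mbps * dt, tokens, data_mb - sent)
        tokens -= out
        sent += out
        t += dt
    return t

print(time_to_send(120, 10, 10, 20))  # prints ~11.0
```

With a small enough time step the simulation converges to 11 seconds, matching the calculation above.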
Please verify my solution.