
For a host machine that uses the token bucket algorithm for congestion control, the token bucket has a capacity of 10 megabytes and the maximum output rate is 20 megabytes per second. Tokens arrive at a rate that sustains output at 10 megabytes per second. The token bucket is currently full and the machine needs to send 120 megabytes of data. The minimum time required to transmit the data is ______________ seconds.

 

The above question is from the Made Easy test series, and their answer is 12 seconds.

But I am getting 11 seconds.

My solution:

Capacity C = 10 megabytes

Maximum output rate M = 20 megabytes per second

Token arrival rate R = 10 megabytes per second

Formula used: C + R*t = M*t  (C = bucket capacity, R = token arrival rate, t = burst time, M = maximum output rate)

Burst time t = 10 / (20 - 10) = 1 second

So for 1 second the machine can transmit at the full output rate of 20 megabytes per second, and after that it drops back to the token arrival rate of 10 megabytes per second.

In that 1 second at 20 MBps, 20 megabytes are sent.

Remaining: 120 - 20 = 100 megabytes

This takes 100 megabytes / 10 MBps = 10 seconds.

So total time = 1 + 10 = 11 seconds
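The arithmetic above can be sketched as a small function (the function and variable names are mine, chosen for illustration; it assumes the bucket starts full and tokens keep arriving while data is being transmitted):

```python
def min_transmit_time(C, R, M, D):
    """Minimum time (seconds) to send D MB through a token bucket.

    C = bucket capacity (MB), R = token arrival rate (MBps),
    M = maximum output rate (MBps), D = data to send (MB).
    Assumes the bucket starts full and tokens arrive during transmission.
    """
    burst_time = C / (M - R)      # how long the full output rate can be sustained
    burst_data = M * burst_time   # data sent during the burst
    if D <= burst_data:
        return D / M              # everything fits inside the burst
    # After the burst the bucket is empty, so output is limited to R.
    return burst_time + (D - burst_data) / R

print(min_transmit_time(C=10, R=10, M=20, D=120))  # 11.0
```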

 

Please verify my solution

1 Answer

I don't know the formulas, but in my understanding: the bucket is full, so in the 1st second the 10 MB of tokens already in the bucket plus 10 MB of the 120 MB will be sent.

The remaining 110 MB will then be sent in 11 seconds.

So the total is 12 seconds.
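For what it's worth, the disagreement between 11 and 12 seconds comes down to how much data the first second's burst is credited with. A simple per-second simulation is one way to check; under the assumption that tokens continue to arrive while data is being sent (variable names below are mine, and the bucket cap is not re-applied within a second, matching the continuous C + R*t model):

```python
# Per-second token-bucket simulation (illustrative sketch).
rate = 10          # token arrival rate in MBps
max_out = 20       # maximum output rate in MBps
remaining = 120    # data left to send in MB
tokens = 10        # bucket starts full (capacity = 10 MB)

seconds = 0
while remaining > 0:
    tokens += rate                              # tokens arriving this second
    sendable = min(tokens, max_out, remaining)  # limited by tokens and line rate
    tokens -= sendable
    remaining -= sendable
    seconds += 1

print(seconds)  # 11
```

Under these assumptions the first second carries 20 MB (10 MB from the full bucket plus 10 MB of arriving tokens), so the simulation reproduces the 11-second figure from the question.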
