100 votes
For a host machine that uses the token bucket algorithm for congestion control, the token bucket has a capacity of $1$ $\text{megabyte}$ and the maximum output rate is $20$ $\text{megabytes}$ per $\text{second}$. Tokens arrive at a rate to sustain output at a rate of $10$ $\text{megabytes}$ per $\text{second}$. The token bucket is currently full and the machine needs to send $12$ $\text{megabytes}$ of data. The minimum time required to transmit the data is _____________ $\text{seconds}$.

21 Answers

5 votes
The token bucket is currently full, so it already holds 1 MB of tokens. In the first second, 10 MB of new tokens arrive, so at most 1 + 10 = 11 MB can be transmitted in that second (the maximum output rate of 20 MB/s is high enough to allow this). That leaves 1 MB of the 12 MB still to send. Tokens now arrive at only 10 MB/s, so the last 1 MB takes 1/10 = 0.1 second. Total: 1 second for the first 11 MB + 0.1 second for the last 1 MB = 1.1 seconds. So, the minimum time required to transmit the data is 1.1 seconds.
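The arithmetic above can be double-checked with a small discrete-time simulation of the token bucket (a sketch; the 0.1 ms step size and variable names are my own choices, not part of the question):

```python
# Discrete-time simulation of the token bucket in the question:
# capacity 1 MB, token arrival rate 10 MB/s, maximum output rate
# 20 MB/s, bucket initially full, 12 MB of data to send.
capacity = 1.0        # MB
token_rate = 10.0     # MB/s
max_output = 20.0     # MB/s
tokens = capacity     # bucket starts full
remaining = 12.0      # MB left to send
dt = 1e-4             # time step, seconds

t = 0.0
while remaining > 1e-9:
    tokens = min(capacity, tokens + token_rate * dt)   # tokens arrive, capped at capacity
    sent = min(max_output * dt, tokens, remaining)     # send as much as tokens/line allow
    tokens -= sent
    remaining -= sent
    t += dt

print(round(t, 3))  # ≈ 1.1 seconds
```

The simulation drains the full bucket at a net 10 MB/s for the first 0.1 s (sending 2 MB), then settles at the token arrival rate for the remaining 10 MB, agreeing with the 1.1 s answer.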
2 votes

First of all, we need to understand why the token bucket was needed when the leaky bucket was (supposedly) working fine:

(1) When the bucket becomes full, the leaky bucket drops packets, whereas in the token bucket only tokens are lost and the packets remain safe.

(2) The leaky bucket does not allow hosts to send bursts of data. In simpler terms, with a leaky bucket an idle host cannot save up credit and then send at full throughput when data appears, whereas with a token bucket an idle host can save up tokens and send large bursts of data for short periods of time.

Let us understand with the help of a diagram. (Reference: Tanenbaum)

Fig 5.25(a) shows a host that wants to send data at 25 MB/sec for 40 msec (1 MB of bursty data).

Consider a leaky bucket with an output rate of 2 MB/sec: as you can see in Fig 5.25(b), it smooths the data rate out to 2 MB/sec for 500 msec.

Now consider a token bucket of capacity 250 KB (Fig 5.25(c)) with tokens arriving at a rate allowing output at 2 MB/sec; this means the token arrival rate is 2 MB/sec.

Assume that the token bucket is full when the 1 MB burst arrives.

Now the token bucket gives the host the advantage that it can transmit at the full 25 MB/sec for a burst period, say S seconds.

This burst period is given by the expression

S = C/(M-R)

Where C = capacity of the bucket,

M = maximum allowable data rate (here 25 MB/sec),

R = token arrival rate (here 2 MB/sec).

Now, for these S seconds, our host can transmit at the full 25 MB/sec.

Here S = 250 KB / (23 MB/sec) ≈ 10.869 msec, which is shown as approximately 11 msec in Fig 5.25(c).

In these 10.869 msec at 25 MB/sec, the total data sent ≈ 271.739 KB.

Data remaining to be sent = 1 MB − 271.739 KB = 728.261 KB.

This data will now be sent evenly at the token arrival rate, i.e. 2 MB/sec.

It will take 728.261 KB / (2 MB/sec) ≈ 364.1 msec, which is shown as 364 msec in Fig 5.25(c).

So the main difference is that the token bucket has allowed our host to send part of the data at the full 25 MB/sec, which was not the case with the leaky bucket; and this was possible because the token bucket was full before our 1 MB burst arrived.
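The Fig 5.25 numbers worked out above can be reproduced in a few lines (a sketch; the unit convention 1 MB = 1000 KB matches the figures, and the variable names are mine):

```python
# Tanenbaum's Fig 5.25 example: 250 KB bucket, 25 MB/s maximum
# output rate, 2 MB/s token arrival rate, 1 MB burst, bucket full.
C = 0.25     # bucket capacity in MB (250 KB)
M = 25.0     # maximum output rate, MB/s
R = 2.0      # token arrival rate, MB/s
burst = 1.0  # MB of bursty data to send

S = C / (M - R)                 # burst period at full speed
full_speed_data = M * S         # data sent during the burst period
rest = burst - full_speed_data  # data left after the burst period
tail = rest / R                 # remainder drains at the token rate

print(S * 1000)                # ≈ 10.87 msec (shown as ~11 msec)
print(full_speed_data * 1000)  # ≈ 271.74 KB
print(tail * 1000)             # ≈ 364.13 msec (shown as 364 msec)
```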

Now coming to this question : 

Capacity C=1MB

"Tokens are arriving at a rate to sustain output at rate of 10 mega bytes per second"

This simply means that the token arrival rate R is 10 MB/sec.

Maximum Output rate of the host is 20MB/sec.

It is said that the bucket is full.

So, when burst of 12MB data arrives,

For how long, given that the bucket was full before the bursty data arrived, is our host allowed to transmit at the full 20 MB/sec?

S = 1 MB / (20 − 10) MB/sec = 0.1 sec.

In 0.1 sec, data sent on network at 20MB/sec = 20*0.1=2MB.

Data remaining to be sent=12-2=10MB.

Now, this remaining 10MB data will be sent at the rate of token arrival rate i.e. 10MB/sec and this will take 1sec.

So, total time taken to send 12 MB data=1+0.1=1.1sec. (ANS)

Also, if the question had asked only for how long the host is allowed to send at full capacity, then the answer would have been 0.1 sec.
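The two-phase calculation above fits in a few lines (a sketch using the question's numbers; variable names are my own):

```python
# Token bucket from the question: 1 MB capacity, 20 MB/s maximum
# output rate, 10 MB/s token arrival rate, bucket initially full.
C, M, R = 1.0, 20.0, 10.0  # MB, MB/s, MB/s
data = 12.0                # MB to send

S = C / (M - R)                 # burst period at full speed: 0.1 s
burst_data = M * S              # data sent during the burst: 2 MB
tail = (data - burst_data) / R  # remaining 10 MB at the token rate: 1 s
total = S + tail

print(total)  # 1.1 seconds
```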

1 votes
We need to calculate the time it takes to send 12 MB, and there is already 1 MB of tokens in the bucket.

The time it takes to receive 11 MB more of tokens at 10 MB/s is 1.1 sec.

After 1.1 sec the 12th MB has entered the bucket, but sending it takes additional time at 20 MB/s.

So the additional time = 1/20 = 0.05 sec.

Therefore, total time taken to transmit 12 MB = 1.1 + 0.05 = 1.15 sec.
1 votes
C = 1 MB

R = 10 MB/s (token generation rate)

M = 12 MB of data

M = C + RT

12 = 1 + 10T

T = 1.1 sec

(M = total data sent, C = bucket capacity, R = token generation rate, T = time)
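Rearranging M = C + RT gives the answer directly (a one-line check; this form holds because the 20 MB/s line rate never limits the token-bounded output here):

```python
M, C, R = 12.0, 1.0, 10.0  # data (MB), bucket capacity (MB), token rate (MB/s)
T = (M - C) / R            # solve M = C + R*T for T
print(T)  # 1.1
```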
Answer: 1.1 seconds
