Can anybody explain in brief how to solve such numericals? | 132 views
0
In normal mode, 1 sec = 20 KB.

In interrupt mode, the overhead is 10 microsec for each byte.

Hence total time for 20 KB = $10 \times 10^{-6} \times 20 \times 10^3 = 200\ \text{msec}$

Hence performance achieved = $\frac{1\ \text{sec}}{200\ \text{msec}} = 5$
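A quick sanity check of the arithmetic above in Python (variable names are mine), assuming the rate is read as 20 KBps, i.e. $20 \times 10^3$ bytes per second:

```python
# Assumption: 20 KBps means 20 * 10^3 bytes transferred per second,
# with 10 microsec of interrupt overhead per byte.
bytes_per_sec = 20 * 10**3           # 20 KB moved in 1 sec (normal mode)
overhead_per_byte = 10 * 10**-6      # 10 microsec per byte

interrupt_time = bytes_per_sec * overhead_per_byte   # total overhead, in seconds
gain = 1 / interrupt_time            # normal-mode time / interrupt overhead time

print(interrupt_time)  # 0.2 sec, i.e. 200 msec
print(gain)            # 5.0
```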
0
@Ashwin, it's 20 Kbps, not 20 KBps, so the answer will be 40: $20\ \text{Kb} = 2500$ bytes, the overhead is $2500 \times 10\ \mu\text{sec} = 25\ \text{msec}$, and $\frac{1\ \text{sec}}{25\ \text{msec}} = 40$.
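The same check under the 20 Kbps (bits, not bytes) reading, where the byte count drops by a factor of 8 (a sketch, variable names are mine):

```python
# Assumption: 20 Kbps means 20 * 10^3 bits per second,
# so the device moves 20 * 10^3 / 8 = 2500 bytes per second.
bits_per_sec = 20 * 10**3
bytes_per_sec = bits_per_sec / 8     # 2500 bytes
overhead_per_byte = 10 * 10**-6      # 10 microsec per byte

interrupt_time = bytes_per_sec * overhead_per_byte   # 0.025 sec = 25 msec
gain = 1 / interrupt_time

print(gain)  # 40.0
```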
0

My main issue is: shouldn't the term "interrupt overhead" mean that it is additional time, apart from the time the device takes to transfer the data? How will the device be able to transfer 20 Kb of data within the "interrupt overhead" time? Is this the same as what is explained in one of the GATE Overflow posts about data preparation time and data transfer time?

0
Ohh yes, @Anu sir. Thank you :)
0
@Ashwin, @Anu
Why do we need to calculate in bytes?
Can you help me with this point?
Why do we first need to convert to bytes and then calculate?
+1
Why are we not counting the 1 sec that is required for transmitting the 20 Kb of data?

Shouldn't the calculation be like this?

Interrupt time for 20 Kb $= \frac{20 \times 10^3}{8} \times 10 \times 10^{-6} = \frac{1}{40}\ \text{sec}$

Transmission time for 20 Kb of data $= 1\ \text{sec}$

So total time $= 1 + \frac{1}{40} = \frac{41}{40}\ \text{sec}$

Performance $= \frac{40}{41} \approx 0.976$
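This alternative reading (counting the 1 sec of transmission time plus the interrupt overhead) can be checked the same way; this is a sketch of the comment's arithmetic, not a claim about which interpretation the original question intends:

```python
# Assumption: 20 Kb = 20 * 10^3 bits = 2500 bytes, 10 microsec
# of interrupt overhead per byte, plus 1 sec of transmission time.
transfer_time = 1.0                                  # 1 sec to move the data
interrupt_time = (20 * 10**3 / 8) * (10 * 10**-6)    # 1/40 sec of overhead
total_time = transfer_time + interrupt_time          # 41/40 sec
performance = transfer_time / total_time             # useful fraction, 40/41

print(total_time)    # 1.025
print(performance)   # ~0.9756
```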