+2 votes

Can anybody briefly explain how to solve such numericals?

in CO and Architecture by Active (1.7k points) | 138 views
0
In normal mode, 1 sec = 20 KB.

In interrupt mode, the overhead is 10 microsec for each byte.

Hence total interrupt overhead for 20 KB = $10 \times 10^{-6} \times 20 \times 10^{3} = 200\ \text{msec}$

Hence performance gained = $\frac{1\ \text{sec}}{200\ \text{msec}} = 5$
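A minimal sketch of the same calculation in Python, assuming the 20 KBps reading used in this comment (the next comment corrects the unit to Kbps); the variable names are mine:

```python
# Assumed inputs (from the comment above): 20 KB moved per second in normal
# mode, and 10 microseconds of interrupt-handling overhead per byte.
data_bytes        = 20 * 10**3    # 20 KB
overhead_per_byte = 10 * 10**-6   # 10 microsec per byte
transfer_time     = 1.0           # sec taken to move 20 KB in normal mode

interrupt_overhead = data_bytes * overhead_per_byte      # 0.2 sec = 200 msec
performance_gain   = transfer_time / interrupt_overhead  # 1 sec / 200 msec

print(interrupt_overhead, performance_gain)  # 0.2 5.0
```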
0
@Ashwin, it's 20 Kbps, not 20 KBps; the answer will be 40.
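For reference, this is presumably the arithmetic behind 40 under the 20 Kbps reading (the rate is converted to bytes because the overhead is quoted per byte):

$20\ \text{Kbps} = \frac{20 \times 10^{3}}{8}\ \text{B/s} = 2500\ \text{B/s}$

$\text{Interrupt overhead for 20 Kb} = 2500 \times 10 \times 10^{-6}\ \text{sec} = 25\ \text{msec}$

$\text{Performance gained} = \frac{1\ \text{sec}}{25\ \text{msec}} = 40$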
0

My main issue is: shouldn't the term "Interrupt Overhead" mean additional time, apart from the time the device takes to transfer the data? How will the device be able to transfer 20 Kb of data within the "Interrupt Overhead" time alone? Is this the same as what is explained in one of the GATE Overflow posts about Data Preparation time and Data Transfer time?

0
Oh yes, @Anu sir. Thank you :)
0
@Ashwin, @Anu
Why do we need to calculate in bytes?
Can you help me with this point?
Why do we first need to convert to bytes and then calculate?
+1
Why are we not counting the 1 sec that is required for transmitting the 20 Kb of data?

Shouldn't the calculation be like this?

Interrupt time for 20 Kb $= \frac{20 \times 10^{3}}{8} \times 10 \times 10^{-6} = \frac{1}{40}\ \text{sec}$

Transmission time for 20 Kb of data $= 1\ \text{sec}$

So total time $= 1 + \frac{1}{40} = \frac{41}{40}\ \text{sec}$

Performance $= \frac{40}{41} = 0.97$
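A small Python sketch contrasting the two readings (names are mine; it assumes 20 Kbps and 10 microsec of overhead per byte): performance_gain compares the normal-mode transfer time against the interrupt-handling time alone, while efficiency charges the overhead on top of the 1 sec transfer, as in the calculation above.

```python
data_bits         = 20 * 10**3     # 20 Kb
data_bytes        = data_bits / 8  # overhead is charged per byte -> 2500 bytes
overhead_per_byte = 10 * 10**-6    # 10 microsec

interrupt_time = data_bytes * overhead_per_byte  # 1/40 sec = 25 msec
transfer_time  = 1.0                             # 1 sec to send 20 Kb at 20 Kbps

performance_gain = transfer_time / interrupt_time                    # 40
efficiency       = transfer_time / (transfer_time + interrupt_time)  # 40/41 ≈ 0.976

print(performance_gain, efficiency)
```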
