A hard drive with a maximum transfer rate of 1 MB per second is connected to a 32-bit, 10 MIPS CPU operating at a clock frequency of 100 MHz. Assume that the I/O interface is DMA-based and that it takes 500 clock cycles for the CPU to set up the DMA controller; also assume that the interrupt handling at the end of the DMA transfer takes an additional 300 CPU clock cycles. If the data transfer is done using 2 KB blocks, calculate the percentage of CPU time consumed in handling the hard drive.

Please explain the answer.

1 Answer

Best answer

The hard drive has a maximum transfer rate of 1 MB per second, which means:

1 MB transfers in 1 s

1 B transfers in 1/10^6 s

2 KB transfers in (2 × 10^3 / 10^6) s = 0.002 s = 2000 microsec; this is our transfer time ... (i)

Now, the total number of CPU overhead cycles is (500 + 300) = 800.

The CPU operates at a clock frequency of 100 MHz, so one clock cycle takes 1/(100 × 10^6) s = (1/10^8) × 10^6 microsec = 1/100 microsec.

So the total overhead time is 800 × (1/100) microsec = 8 microsec ... (ii)

So the percentage of CPU time consumed in handling the hard drive (assuming burst mode is used) is:

= 8/(8 + 2000) × 100

= (8/2008) × 100

= 0.003984 × 100

= 0.3984%
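The steps above can be sketched as a short calculation. This is just a check of the arithmetic, taking 2 KB = 2 × 10^3 B as the answer does (using 2048 B would give a transfer time of 2048 microsec instead):

```python
# Burst-mode DMA: CPU is only involved in setup and interrupt handling.
block_size = 2 * 10**3          # 2 KB block (decimal KB, as in the answer above)
disk_rate = 10**6               # 1 MB/s transfer rate, in bytes per second
clock_hz = 100 * 10**6          # 100 MHz CPU clock
overhead_cycles = 500 + 300     # DMA setup + interrupt handling cycles

transfer_time = block_size / disk_rate   # seconds to move one block -> 0.002 s
cpu_time = overhead_cycles / clock_hz    # seconds of CPU work per block -> 8e-6 s

pct = cpu_time / (cpu_time + transfer_time) * 100
print(f"transfer time  = {transfer_time * 1e6:.0f} microsec")  # 2000
print(f"CPU overhead   = {cpu_time * 1e6:.0f} microsec")       # 8
print(f"CPU time used  = {pct:.4f}%")                          # 0.3984
```

The denominator is (8 + 2000) because the 8 microsec of setup/interrupt work happens in addition to the 2000 microsec transfer during which the DMA controller, not the CPU, moves the data.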
