A device with a data transfer rate of 20 KB/sec is connected to a CPU. Data is transferred byte-wise, and the interrupt overhead is 10 microseconds.

1. What is the total time required for a 10-byte data transfer using programmed I/O?

2. What is the total time required for a 10-byte data transfer using interrupt-driven I/O?

3. What is the minimum performance gain of operating the device under interrupt mode over operating it under program-controlled mode?
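
A worked sketch of the arithmetic, assuming 1 KB = 1000 bytes, one interrupt per byte, and that in interrupt mode the CPU is occupied only during the 10-microsecond overhead (conventions for the gain ratio vary across textbooks):

```latex
% Sketch under the assumptions above: 1 KB = 1000 B, one interrupt per byte.
% Time to transfer one byte at 20 KB/s:
t_{byte} = \frac{1}{20\,000\ \text{B/s}} = 50\ \mu\text{s}

% 1. Programmed I/O: the CPU busy-waits for every byte.
T_{prog} = 10 \times 50\ \mu\text{s} = 500\ \mu\text{s}

% 2. Interrupt I/O: each byte takes its transfer time plus the interrupt
%    overhead, but the CPU is busy only during the 10 us overhead.
T_{int} = 10 \times (50 + 10)\ \mu\text{s} = 600\ \mu\text{s},
\qquad \text{CPU time} = 10 \times 10\ \mu\text{s} = 100\ \mu\text{s}

% 3. Minimum gain, taken as the ratio of CPU time consumed:
\text{gain} = \frac{500\ \mu\text{s}}{100\ \mu\text{s}} = 5
```

Under the 1 KB = 1024 B convention the per-byte time is about 48.8 microseconds, which lowers the ratio slightly, to roughly 4.9.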
