It has been observed that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program. If the available memory is doubled, the mean interval between page faults is also doubled. Suppose that a normal instruction takes $1\:\mu sec$, but if a page fault occurs, it takes $2001\:\mu sec$ (i.e., $2$ msec) to handle the fault. If a program takes $60$ sec to run, during which time it gets $15{,}000$ page faults, how long would it take to run if twice as much memory were available?

1 Answer


Each page fault adds about $2$ msec of overhead ($2001\:\mu sec$ for a faulting instruction versus $1\:\mu sec$ for a normal one, which the question rounds to $2$ msec).

For $15{,}000$ page faults, the paging overhead is $15000 \times 2 \times 10^{-3} = 30$ sec.

Time spent actually executing instructions $= 60 - 30 = 30$ sec.

If the process had twice as much memory, it would incur half as many page faults, i.e., $7500$, giving $15$ sec of paging overhead.

The total run time would therefore be $30 + 15 = 45$ sec.
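As a quick sanity check, here is a minimal Python sketch of the same arithmetic; the numbers are taken directly from the question, and the variable names are just illustrative:

```python
# Quick numeric check of the reasoning above (values from the question).
total_time = 60.0            # observed total run time, in seconds
faults = 15_000              # page faults during that run
overhead_per_fault = 2e-3    # ~2 msec of extra time per fault

paging_overhead = faults * overhead_per_fault        # 30 s
useful_time = total_time - paging_overhead           # 30 s of real execution

# Doubling memory halves the fault count, per the question's assumption.
new_overhead = (faults / 2) * overhead_per_fault     # 15 s
print(useful_time + new_overhead)                    # 45.0 seconds
```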

