It has been observed that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program. If the available memory is doubled, the mean interval between page faults is also doubled. Suppose that a normal instruction takes $1\: \mu sec$, but if a page fault occurs, it takes $2001\: \mu sec$ (i.e., $2\: msec$) to handle the fault. If a program takes $60\: sec$ to run, during which time it gets $15,000$ page faults, how long would it take to run if twice as much memory were available?
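A sketch of the standard reasoning (not an accepted answer on the site, just the usual back-of-the-envelope calculation): each of the $15,000$ faults adds about $2000\: \mu sec$ of overhead on top of the normal $1\: \mu sec$ instruction, so fault handling accounts for $15,000 \times 2000\: \mu sec = 30\: sec$ of the $60\: sec$ run, and the remaining $30\: sec$ is ordinary instruction execution. Doubling the memory doubles the mean interval between faults, so the same instruction stream incurs only $7,500$ faults, costing $7,500 \times 2000\: \mu sec = 15\: sec$. The compute time is unchanged, giving a total of about $30\: sec + 15\: sec = 45\: sec$.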

