
Suppose that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program, so if the available memory is doubled, the mean interval between page faults is also doubled. A normal instruction takes one microsecond, but if a page fault occurs, it takes 2001 microseconds. A program takes 60 sec to run, during which time it gets 15,000 page faults. How long would it take to run if twice as much memory were available?

  1. 60 sec
  2. 30 sec
  3. 45 sec
  4. 10 sec

2 Answers

+2 votes
The answer should be (C) 45 sec.

A normal instruction takes 1 microsecond (10^-6 sec).

An instruction that causes a page fault takes 2001 microseconds, so the page-fault service itself takes 2000 microseconds.

The program takes 60 sec in total and incurs 15,000 page faults.

Time spent servicing page faults = 15,000 × 2000 microseconds = 30 sec.

The remaining 30 sec is spent on normal instruction execution.

If memory is doubled, the mean interval between page faults is also doubled, so the number of page faults is halved. Earlier the program needed 30 sec for instruction execution and 30 sec for page faults.

Now instruction execution still takes 30 sec, and page faults take (15,000 × 2000 microseconds) / 2 = 15 sec.

So the total time = 30 + 15 = 45 sec.
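
For a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are my own, purely illustrative):

    # All times in microseconds.
    FAULT_EXTRA_US = 2000            # extra cost per page fault (2001 us - 1 us)

    total_us = 60 * 10**6            # 60 sec total running time
    faults = 15_000

    fault_time_us = faults * FAULT_EXTRA_US    # 30,000,000 us = 30 sec
    exec_time_us = total_us - fault_time_us    # 30 sec of normal execution

    # Doubling memory doubles the mean fault interval, so the fault count halves.
    new_total_us = exec_time_us + (faults // 2) * FAULT_EXTRA_US
    print(new_total_us / 10**6, "sec")         # prints 45.0 sec
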
answered by Veteran (50.7k points)
+1 vote
60 sec = n × 1 μs + 15,000 × 2000 μs, where n is the total number of instructions executed.

So n = (60 − 30) sec / 1 μs = 30 × 10^6 instructions.

Now memory is doubled, so the number of page faults is halved: 15,000 / 2 = 7500.

New running time = 30 × 10^6 × 1 μs + 7500 × 2000 μs

= 30 sec + 15 sec

= 45 sec
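
The same result, solving for n first and then recomputing, as a short Python sketch (again with illustrative names):

    total_us = 60 * 10**6                  # 60 sec
    n = total_us - 15_000 * 2000           # instructions at 1 us each -> 30,000,000
    new_total_us = n * 1 + 7_500 * 2000    # halve the faults, keep n unchanged
    print(n, new_total_us / 10**6)         # 30000000 45.0
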
answered by Boss (25.9k points)