
Well, I am not sure whether the Second Chance algorithm suffers from Belady's anomaly. But after giving it some thought, I think it may, because it can sometimes behave exactly like FIFO if the reference bit is zero for some set of pages. Please clarify :)

1 Answer

Best answer

Yes. The Second Chance page replacement algorithm suffers from Belady's Anomaly, because second-chance replacement degenerates to FIFO replacement whenever all the reference bits are set or all the reference bits are clear.


Quick Revision:

Few computer systems provide sufficient hardware support for true $LRU$ page replacement. In fact, some systems provide no hardware support, and other page-replacement algorithms (such as a FIFO algorithm) must be used. Many systems provide some help, however, in the form of a reference bit.

1. Reference Bit:

The reference bit for a page is set by the hardware whenever that page is referenced (either a read or a write to any byte in the page). A reference bit is associated with each entry in the page table. Initially, all bits are cleared (to 0) by the operating system. As a user process executes, the bit associated with each page referenced is set (to 1) by the hardware. After some time, we can determine which pages have been used and which have not by examining the reference bits, although we do not know the order of use. This information is the basis for many page-replacement algorithms that approximate LRU replacement. One of them is the Second-Chance Page Replacement Algorithm.
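As a rough illustration of this hardware/OS division of labour, here is a minimal Python sketch. It does not model any real kernel's data structures; `PageTableEntry`, `access()`, and `clear_all()` are illustrative names only.

```python
# A toy page table keyed by page number; names are illustrative only.
class PageTableEntry:
    def __init__(self):
        self.reference_bit = 0   # cleared (0) by the OS initially

def access(entry: PageTableEntry) -> None:
    # Done by the MMU hardware on any read or write to the page.
    entry.reference_bit = 1

def clear_all(page_table: dict) -> None:
    # Done by the OS when it wants a fresh sample of recent usage.
    for entry in page_table.values():
        entry.reference_bit = 0
```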

2. Second-Chance Page Replacement Algorithm:

A modified form of the FIFO page replacement algorithm, known as the second-chance page replacement algorithm, fares relatively better than FIFO at little cost for the improvement. It works by looking at the front of the queue as FIFO does, but instead of immediately paging out that page, it checks whether its reference bit is set. If it is not set, the page is swapped out. Otherwise, the reference bit is cleared, the page is inserted at the back of the queue (as if it were a new page), and the process is repeated. This can also be thought of as a circular queue. If all the pages have their reference bit set, then on the second encounter of the first page in the list, that page will be swapped out, as it now has its reference bit cleared. If all the pages have their reference bit cleared, then the second-chance algorithm degenerates into pure FIFO.
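To make those mechanics concrete, here is a minimal Python sketch of the queue-plus-reference-bit scheme just described. The function name `second_chance_faults` and the `clear_bits_each_step` knob are illustrative choices for this sketch, not any standard API; the knob models the degenerate all-bits-clear case by clearing every reference bit before each reference.

```python
from collections import deque

def second_chance_faults(reference_string, num_frames, clear_bits_each_step=False):
    """Count page faults under second-chance replacement.

    clear_bits_each_step is a modeling knob for this sketch only: when True,
    every reference bit is cleared before each reference, which forces the
    degenerate all-bits-clear case, i.e. pure FIFO.
    """
    queue = deque()   # resident pages in FIFO (arrival) order
    ref_bit = {}      # page -> reference bit, for resident pages only
    faults = 0
    for page in reference_string:
        if clear_bits_each_step:
            for p in ref_bit:
                ref_bit[p] = 0
        if page in ref_bit:              # hit: the hardware sets the bit
            ref_bit[page] = 1
            continue
        faults += 1                      # fault: the page must be brought in
        if len(queue) == num_frames:
            while ref_bit[queue[0]]:     # bit set -> give a second chance
                ref_bit[queue[0]] = 0
                queue.rotate(-1)         # move the head page to the tail
            victim = queue.popleft()     # bit clear -> replace this page
            del ref_bit[victim]
        queue.append(page)
        ref_bit[page] = 0                # clear until referenced again
    return faults
```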

As its name suggests, second chance gives every page a "second chance": an old page that has been referenced is probably still in use and should not be swapped out in favor of a new page that has not been referenced.

The basic algorithm of second-chance replacement is a FIFO replacement algorithm. When a page has been selected, however, we inspect its reference bit. If the value is 0, we proceed to replace this page; but if the reference bit is set to 1, we give the page a second chance and move on to select the next FIFO page. When a page gets a second chance, its reference bit is cleared, and its arrival time is reset to the current time. Thus, a page that is given a second chance will not be replaced until all other pages have been replaced (or given second chances). In addition, if a page is used often enough to keep its reference bit set, it will never be replaced.
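Using the sketch above, the degenerate FIFO case reproduces Belady's classic anomaly string: with all reference bits forced clear, 4 frames incur more faults than 3. Interestingly, with the reference bits working normally, this particular string happens not to show the anomaly under this sketch's conventions; the degenerate case is what establishes that second chance can inherit FIFO's anomaly.

```python
# Belady's classic reference string for the FIFO anomaly.
s = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]

# Degenerate case: all reference bits clear, so second chance is pure FIFO.
print(second_chance_faults(s, 3, clear_bits_each_step=True))  # 9 faults
print(second_chance_faults(s, 4, clear_bits_each_step=True))  # 10 faults: more frames, more faults

# With the bits working normally, this particular string happens not to
# show the anomaly under this sketch's conventions (10 vs. 8 faults).
print(second_chance_faults(s, 3))  # 10
print(second_chance_faults(s, 4))  # 8
```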

