The Gateway to Computer Science Excellence

+1 vote

1) Randomly picking the pivot, to make the worst case less likely to occur.
2) Calling insertion sort for small arrays, to reduce the number of recursive calls.
3) QuickSort is tail recursive, so tail-call optimization can be done.
4) A linear-time median-finding algorithm is used to pick the median as pivot, so that the worst-case time reduces to O(n log n).
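Not part of the original question, but points 1–3 can be combined in one routine. A minimal sketch, assuming Python, a Lomuto partition, and a cutoff of 16 (the threshold is a tunable choice, not specified anywhere in the thread):

```python
import random

CUTOFF = 16  # assumed small-array threshold; real libraries tune this empirically

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place by insertion (fast for tiny ranges)."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def partition(a, lo, hi):
    """Lomuto partition; returns the pivot's final index."""
    r = random.randint(lo, hi)      # point 1: random pivot makes the worst case unlikely
    a[r], a[hi] = a[hi], a[r]
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo + 1 <= CUTOFF:   # point 2: hand small ranges to insertion sort
            insertion_sort(a, lo, hi)
            return
        p = partition(a, lo, hi)
        # point 3: recurse only on the SMALLER side, loop on the larger one;
        # the second "call" becomes iteration, so stack depth stays O(log n)
        if p - lo < hi - p:
            quicksort(a, lo, p - 1)
            lo = p + 1
        else:
            quicksort(a, p + 1, hi)
            hi = p - 1
```

Note how the second recursive call disappears: only the smaller partition is handled by recursion, and the larger one is handled by updating `lo`/`hi` and continuing the loop, which is exactly the tail-call elimination the question refers to.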

I am not getting point 2, and I have confusion about point 3: how can quicksort be tail-recursive when we have 2 recursive calls at the end? And why is option 4 wrong? If we pick the median as the pivot, it will surely divide the array into two equal halves, so the worst-case time complexity must be O(n log n). Please correct me where I am wrong.

+2 votes

Best answer

these links may clear your doubt

http://www.geeksforgeeks.org/iterative-quick-sort/

http://stackoverflow.com/questions/12454866/how-to-optimize-quicksort

https://en.wikipedia.org/wiki/Quicksort#Optimizations

For the 4th option: it works in theory but not in practice, because finding the exact median of a large array as the pivot carries too much overhead. The deterministic linear-time selection algorithm has large constant factors, so running it at every level of the recursion makes quicksort slower on average than simply picking a random pivot.


