What is the proper order in which we can arrange the sorting algorithms in terms of the number of swap operations they perform?

1 vote
389 views
I am not sure which algorithm has the minimum number of swap operations. According to me, both selection and insertion sort should be at the lowest level, and merge sort as well, since after dividing the arrays we merely merge them, so where do we swap in that? Please clarify this; I am a bit confused about counting the number of swap operations.

In the worst case the answer should be selection sort. We use a temp variable for finding the minimum value in each iteration and do at most one swap per pass of the outer loop. In this way we swap at most $n-1$ times for the whole algorithm in any case.

Think of it like sorting a structure containing multiple fields, so each swap operation is heavy. Here, selection sort makes sense.
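To make the $n-1$ bound concrete, here is a small sketch (the function name and the example input are mine) of selection sort with a swap counter: the inner loop only tracks an index, and a swap happens at most once per outer iteration.

```python
def selection_sort_swaps(a):
    """Selection sort; returns (sorted list, number of swaps).
    At most n - 1 swaps are done, regardless of input order."""
    a = list(a)
    swaps = 0
    n = len(a)
    for i in range(n - 1):
        m = i
        # Inner loop only moves an index, never data.
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        if m != i:          # at most one swap per outer pass
            a[i], a[m] = a[m], a[i]
            swaps += 1
    return a, swaps

# Even a fully reversed array needs few swaps:
print(selection_sort_swaps([5, 4, 3, 2, 1]))  # → ([1, 2, 3, 4, 5], 2)
```

Note that the swap count can even be less than $n-1$: each swap places two elements, so a reversed array of 5 elements needs only 2 swaps.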

In insertion sort, for each element, after finding its position we shift all larger elements to the right. These shifts can also be considered swaps, as they change the positions of elements.
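A sketch of what those shifts look like (function name and inputs are my own): on a reversed array every element is shifted past all the elements before it, giving $n(n-1)/2$ data moves in the worst case.

```python
def insertion_sort_shifts(a):
    """Insertion sort; returns (sorted list, number of shifts).
    Each shift moves one element one position right."""
    a = list(a)
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift right: one data move
            shifts += 1
            j -= 1
        a[j + 1] = key
    return a, shifts

# Worst case (reversed input): n(n-1)/2 = 10 shifts for n = 5.
print(insertion_sort_shifts([5, 4, 3, 2, 1]))  # → ([1, 2, 3, 4, 5], 10)
```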

For merge sort we use an auxiliary array, and we can assume all elements are swapped (copied).
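To see why merge sort moves the most data, here is a sketch (names are mine) that counts every copy into the auxiliary output: each element is copied once per level, and with $\log_2 n$ levels the total is $n \log_2 n$ copies.

```python
def merge_sort_copies(a):
    """Merge sort; returns (sorted list, number of element copies)."""
    copies = 0

    def sort(a):
        nonlocal copies
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = sort(a[:mid]), sort(a[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
            copies += 1            # one copy into the auxiliary array
        merged.extend(left[i:]);  copies += len(left) - i
        merged.extend(right[j:]); copies += len(right) - j
        return merged

    return sort(list(a)), copies

# n = 8, log2(8) = 3 levels, so 8 * 3 = 24 copies:
print(merge_sort_copies([8, 7, 6, 5, 4, 3, 2, 1]))  # → ([1, ..., 8], 24)
```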

http://gatecse.in/wiki/Complexities_of_Basic_Sorting_Algorithms

selected
0

Sir, for merge sort we are using an auxiliary array and we can assume all elements are swapped.

Are copy and swap both the same?

0
Sir, I agree with your explanation, but I don't understand how merge sort has swap operations, since in it we are only copying elements into an auxiliary array. I am a bit confused about this.
4
"copy and swap are the same"

No. That is why I said "assume". Our aim is to minimize the movement of data, and that includes minimizing copying. So, in this context, copying is the same as a swap in terms of its cost.

The same goes for the shifting of elements in insertion sort: technically it is not a swap.
0
Sir, if we have $\log n$ sorted lists, each of size $n/\log n$, what is the time complexity if we apply merge sort? I am a bit confused; can you please explain?
1
If you have $\log n$ sorted lists, then at the leaf (last) level you have $\log n$ lists, each of size $n/\log n$. When you merge two lists at the last level into one, you get a list of size $2n/\log n$ at the second-last level. Likewise, at the third-last level you get sorted lists of size $4n/\log n$, and so on; at each level the number of lists halves while their sizes double. The work done at each level is only $O(n)$, since there are $n$ elements in total and the merge procedure touches all of them at every level. The total number of levels is $\log(\log n) = \log\log n$, so the total work is (number of levels) $\times$ (work per level) $= O(n \log\log n)$.

This is because the number of lists halves at each level, and since we start with $\log n$ sorted lists, the number of levels is $\log\log n$. The work done at each level means applying the merge procedure to the sorted lists, which touches all $n$ elements, so the total time complexity is $O(n \log\log n)$.
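The level-by-level argument above can be checked with a small sketch (names and the concrete sizes are mine): start with $k = \log_2 n$ sorted lists, merge them pairwise round by round, and count the elements touched per round. For $n = 65536$ and $k = 16$, there are $\log_2 16 = 4$ rounds of $n$ work each, i.e. $n \log\log n$ total moves.

```python
def merge_two(a, b):
    """Standard merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:]); out.extend(b[j:])
    return out

def merge_rounds(lists):
    """Pairwise-merge the lists round by round.
    Returns (final sorted list, total element moves)."""
    work = 0
    while len(lists) > 1:
        nxt = []
        for i in range(0, len(lists) - 1, 2):
            merged = merge_two(lists[i], lists[i + 1])
            work += len(merged)    # every merge touches all its elements
            nxt.append(merged)
        if len(lists) % 2:         # odd list carries over to the next round
            nxt.append(lists[-1])
        lists = nxt
    return lists[0], work

# n = 65536 split into 16 sorted lists of 4096 elements each:
n, k = 65536, 16
lists = [sorted(range(i, n, k)) for i in range(k)]
out, work = merge_rounds(lists)
print(work == n * 4)  # 4 rounds of n work each → True
```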

Related questions

1 vote
1
5k views
If the number of records to be sorted is small, then ...... sorting can be efficient. A. Merge B. Heap C. Insertion D. Bubble
Assume that a CPU can process $10^8$ operations per second. Suppose you have to sort an array with $10^6$ elements. Which of the following is true? 1. Insertion sort will always take more than 2.5 hours while merge sort will always take less than 1 second. ... will always take less than 1 second. 4. Insertion sort could take more than 2.5 hours while quicksort will always take less than 1 second.