Let's tie some pieces together. If we take the textbooks at face value, then Quicksort is often the best practical choice.
But insertion sort beats Quicksort for nearly sorted input.
Insertion sort is practically "optimal" only when the array is nearly sorted within a small range, i.e. each element is at most a small distance from its sorted position (the $k$ in this question).
Quoting from Stack Overflow:

> As Bob Sedgewick showed in his dissertation work (and follow-ons), insertion sort absolutely crushes the "almost-sorted array". In this case your asymptotics look good but if k < 12 I bet insertion sort wins every time. I don't know that there's a good explanation for why insertion sort does so well, but the place to look would be in one of Sedgewick's textbooks entitled *Algorithms* (he has done many editions for different languages).
On a k-sorted array, insertion sort takes $O(kn)$ time, since each element is shifted at most $k$ positions.
When $k$ is a small constant, we can ignore it and say that the time complexity is $O(n)$.
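To make the $O(kn)$ bound concrete, here is a standard insertion sort sketch (my own illustration, not from the question): on a k-sorted input the inner `while` loop shifts each element at most $k$ slots, giving $O(kn)$ total work.

```python
def insertion_sort(a):
    """Standard insertion sort. On a k-sorted input, each element is at
    most k positions from its final spot, so the inner loop runs at most
    k times per element: O(k*n) overall, O(n) for constant k."""
    for i in range(1, len(a)):
        x = a[i]
        j = i - 1
        while j >= 0 and a[j] > x:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x                 # drop x into its slot
    return a

# A 2-sorted array: every element is at most 2 positions out of place.
print(insertion_sort([2, 1, 4, 3, 6, 5]))  # [1, 2, 3, 4, 5, 6]
```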
For a large $k$, we use this asymptotically optimal $O(n\log k)$ algorithm instead:
1) Build a min heap of $k+1$ elements.
2) Extract min. /* This is guaranteed to place the first element correctly: the global minimum is at most $k$ positions from index 0, so it must be among the first $k+1$ elements. */
3) Add the next element from the array to the heap.
Repeat 2) and 3).
Step 1) takes $O(k)$ time.
Step 2) takes $O(\log k)$ time (extract-min, then heapify).
Step 3) takes $O(\log k)$ time (insert into the heap).
We repeat steps 2) and 3) for all $n$ elements, so the total running time is $O(k + n\log k + n\log k) = O(n\log k)$.
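The three steps above can be sketched with Python's standard `heapq` module (a sketch of my own, assuming a 0-indexed list and the question's definition of k-sorted):

```python
import heapq

def sort_k_sorted(a, k):
    """Sort a k-sorted list (each element at most k positions from its
    sorted position) in O(n log k) time using a min-heap of k+1 elements."""
    heap = a[:k + 1]                 # step 1: build a heap of k+1 elements, O(k)
    heapq.heapify(heap)
    out = []
    for x in a[k + 1:]:
        out.append(heapq.heappop(heap))  # step 2: extract min, O(log k)
        heapq.heappush(heap, x)          # step 3: add the next element, O(log k)
    while heap:                      # drain the remaining k+1 elements
        out.append(heapq.heappop(heap))
    return out

# A 2-sorted array: every element is at most 2 positions out of place.
print(sort_k_sorted([3, 1, 2, 6, 4, 5, 8, 7], 2))  # [1, 2, 3, 4, 5, 6, 7, 8]
```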
Final verdict
For a k-sorted array with small $k$, insertion sort takes linear time: $O(kn)$, which is $O(n)$ for constant $k$.
When $k$ is large, we can no longer afford to multiply $n$ by $k$ in the overall time complexity, as that factor becomes intolerable. We instead use the heap-based algorithm, which multiplies $n$ by $\log k$, i.e. $O(n\log k)$.
Option D