I have to find the maximum of a[j] - a[i] in an array, subject to j >= i + l, in linear time. How can I do it? My approach is to iterate j from l+1 to n and track the minimum element of the allowed prefix each time: when j = l+1 we check only the 1st element, when j = l+2 the first 2 elements, and so on.

Is this approach correct?
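Your approach sounds right: for each j, the valid i's are exactly 1..j-l, so you only need the running minimum of that prefix, which gives O(n) overall. A minimal sketch of that idea (0-based indexing, with a hypothetical function name `max_gap_diff`):

```python
def max_gap_diff(a, l):
    """Maximum of a[j] - a[i] over pairs with j >= i + l, in O(n).

    `a` is the array, `l` the required index gap (assumed names,
    matching the question). Uses 0-based indices, so the loop runs
    j = l .. n-1 instead of l+1 .. n.
    """
    n = len(a)
    if n <= l:
        raise ValueError("no pair satisfies j >= i + l")
    best = float("-inf")
    min_so_far = float("inf")
    for j in range(l, n):
        # Candidates i run up to j - l, so extend the running
        # minimum by exactly one new element per step.
        min_so_far = min(min_so_far, a[j - l])
        best = max(best, a[j] - min_so_far)
    return best
```

For example, `max_gap_diff([3, 1, 4, 1, 5, 9], 2)` returns 8 (i = 1 with value 1, j = 5 with value 9). The key point is that the prefix minimum is updated incrementally, once per j, rather than rescanned from the start, which is what keeps the algorithm linear.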