in Algorithms edited by
3 votes
$T(n)=4 T(n/2) + n^2 \sqrt{2}$

I have solved this by back substitution, and it forms equations of the form

$4^k T(n/2^k) + k\, n^2 \sqrt{2}$

It gives the time complexity as $n^2 + n^2 \log_2 n$.

The given answer is $\Theta(n^{2.5})$.

I have two questions: (a) How can we get $\Theta(n^{2.5})$?

(b) Is $n^2 \log_2 n$ asymptotically faster than $n^2$?
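Regarding (a), here is a quick numerical check of the back substitution (my own sketch, assuming a base case of $T(1) = 1$, which the post does not state): with the recurrence exactly as written, the unrolled form $4^k T(n/2^k) + k\sqrt{2}\,n^2$ gives $\Theta(n^2 \log n)$, not $\Theta(n^{2.5})$; the latter would arise if the additive term were $n^2\sqrt{n} = n^{2.5}$ instead of $n^2\sqrt{2}$.

```python
import math

# Unroll T(n) = 4*T(n/2) + sqrt(2)*n^2 for n a power of two,
# with the (assumed) base case T(1) = 1.
def T(n):
    if n == 1:
        return 1.0
    return 4 * T(n // 2) + math.sqrt(2) * n * n

# Back substitution predicts T(2^k) = 4^k + sqrt(2) * k * 4^k,
# i.e. T(n) = n^2 + sqrt(2) * n^2 * log2(n) = Theta(n^2 log n).
for k in range(1, 15):
    n = 2 ** k
    predicted = 4 ** k + math.sqrt(2) * k * 4 ** k
    assert abs(T(n) - predicted) / predicted < 1e-9
```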


The real question is this: what are we actually talking about? 

1. The speed of growth of the function's graph: in that sense, $O(n^2)$ indeed grows faster than $O(n)$.

2. The speed of the algorithm: in that sense, an $O(n)$ algorithm is faster than an $O(n^2)$ one.

These two senses are opposites. The faster the graph of the upper-bound function grows, the slower the algorithm will be, as it takes more computational steps. The slower the graph grows, as with $O(\log n)$ or $O(1)$, the faster the algorithm will be.

If a function $f(n)$ is a sum of functions, one of which grows faster than the others, then the fastest-growing one determines the order of $f(n)$. Example: if $f(n) = 10\log n + 5(\log n)^3 + 7n + 3n^2 + 6n^3$, then $f(n) = O(n^3)$.
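The dominant-term rule above can be checked numerically (a small sketch using the example function from the comment): the ratio $f(n)/n^3$ approaches the leading coefficient 6, confirming that the $6n^3$ term determines the order.

```python
import math

# f(n) = 10*log(n) + 5*(log(n))^3 + 7n + 3n^2 + 6n^3, the example above.
def f(n):
    return 10 * math.log(n) + 5 * math.log(n) ** 3 + 7 * n + 3 * n ** 2 + 6 * n ** 3

# As n grows, the fastest-growing summand 6n^3 dominates,
# so f(n)/n^3 tends to 6 and f(n) = O(n^3).
for n in (10, 1000, 10 ** 6):
    ratio = f(n) / n ** 3
    assert ratio > 6  # the lower-order terms only add to 6n^3

assert abs(f(10 ** 6) / 10 ** 18 - 6) < 1e-4
```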



One way to say one algorithm is asymptotically more efficient than another is if there is some (problem-specific) input size such that for any larger input size the more efficient algorithm will take fewer "computational steps", usually by some abstract measure, e.g. number of comparisons.
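To make that definition concrete, here is a toy sketch with made-up step counts (the constants 100n and $n^2$ are hypothetical, purely for illustration): past a problem-specific input size, the asymptotically more efficient algorithm always takes fewer steps, even though it loses on small inputs.

```python
# Hypothetical step counts for two algorithms (illustration only).
def steps_a(n):
    return 100 * n   # asymptotically more efficient

def steps_b(n):
    return n ** 2

# On small inputs B takes fewer steps...
assert steps_b(10) < steps_a(10)

# ...but beyond the crossover point n = 100, A always takes fewer steps.
for n in range(101, 1000):
    assert steps_a(n) < steps_b(n)
```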


This is what I was looking for, thanks a lot :)

The word "faster" in "asymptotically faster" is what creates the ambiguity here.

A better term is "asymptotically efficient": an algorithm of order $n$ is asymptotically more efficient than an algorithm with complexity $n^2$.

An algorithm whose running time grows more slowly with input size is better than one whose running time grows faster.

Narsimha, that was a nice explanation.


1 Answer

1 vote

$O(n^2)$ is a subset of $O(n^2 \log n)$, and thus the first is "better". It is easy to see this: since $\log n$ is an increasing function, multiplying a function by it gives a "higher" function than the original ($f(n) \le f(n)\log n$ for every increasing non-negative $f$ and $n > 2$).
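A quick numerical illustration of that containment (my own sketch): $f(n) \le f(n)\log n$ holds once $\log n \ge 1$, and the ratio $(n^2 \log n)/n^2 = \log n$ is unbounded, so $n^2 \log n$ is not $O(n^2)$ while $n^2$ is $O(n^2 \log n)$.

```python
import math

# Any increasing non-negative function works; n^2 is the one from the answer.
def f(n):
    return n * n

# f(n) <= f(n) * log(n) once log(n) >= 1, i.e. n >= 3 for the natural log.
for n in range(3, 10000):
    assert f(n) <= f(n) * math.log(n)

# The ratio (n^2 * log n) / n^2 = log n is unbounded, so no constant c
# satisfies n^2 * log n <= c * n^2 for all large n.
assert math.log(10 ** 9) > 20
```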

Source: StackOverflow
