The real question is this: what are we actually talking about?

1. The growth rate of the upper-bound function itself: in that sense, O(n^{2}) does indeed grow faster than O(n).

2. The speed of the algorithm: in that sense, an O(n) algorithm is faster than an O(n^{2}) one.

These two readings are opposites of each other. The faster the upper-bound function grows, the slower the algorithm will be, because it takes more computational steps. The slower the function grows, as with O(log n) or O(1), the faster the algorithm will be.
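To make the inverse relationship concrete, here is a minimal sketch that tabulates the step counts implied by each bound (assuming, purely for illustration, a constant factor of 1 for every bound):

```python
import math

def steps(n):
    """Illustrative step counts for common growth rates at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": math.log2(n),
        "O(n)": n,
        "O(n^2)": n ** 2,
    }

# The faster the function grows, the more steps the algorithm takes:
for n in (10, 1000, 100000):
    print(n, steps(n))
```

At n = 100000 the O(n^2) bound already implies ten billion steps while O(log n) implies about seventeen, which is exactly why a slower-growing bound means a faster algorithm.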

If a function f(n) is a sum of functions, one of which grows faster than the others, then the faster-growing one determines the order of f(n). Example: if f(n) = 10 log(n) + 5 (log(n))^{3} + 7n + 3n^{2} + 6n^{3}, then f(n) = O(n^{3}).

[Source: http://web.mit.edu/16.070/www/lecture/big_o.pdf]
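You can check the dominant-term rule numerically: as n grows, the ratio f(n)/n^{3} settles toward the constant 6, the coefficient of the fastest-growing summand. A quick sketch:

```python
import math

def f(n):
    """The example sum: 10 log(n) + 5 (log n)^3 + 7n + 3n^2 + 6n^3."""
    return (10 * math.log(n) + 5 * math.log(n) ** 3
            + 7 * n + 3 * n ** 2 + 6 * n ** 3)

# The lower-order terms become negligible, so f(n)/n^3 -> 6,
# which is what f(n) = O(n^3) captures.
for n in (10, 1000, 100000):
    print(n, f(n) / n ** 3)
```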

But,

One way to say that one algorithm is asymptotically more efficient than another is this: there is some (problem-specific) input size such that, for any larger input, the more efficient algorithm takes fewer "computational steps", usually by some abstract measure, e.g. the number of comparisons.
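That crossover input size is easy to exhibit with hypothetical step-count functions (the functions 100n and n^{2} below are illustrative, not taken from the source): an O(n) algorithm with a large constant factor can lose to an O(n^{2}) algorithm on small inputs, yet past the crossover it wins on every larger input.

```python
def steps_linear(n):
    """Hypothetical O(n) algorithm with a large constant factor."""
    return 100 * n

def steps_quadratic(n):
    """Hypothetical O(n^2) algorithm with constant factor 1."""
    return n ** 2

# Smallest input size beyond which the O(n) algorithm takes fewer steps.
crossover = next(n for n in range(1, 10_000)
                 if steps_linear(n) < steps_quadratic(n))
print(crossover)  # 101: for every n > 100 the linear algorithm wins
```

For n below the crossover the quadratic algorithm is actually cheaper, which is why the definition is phrased in terms of "any larger input size" rather than all inputs.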