
1 Answer

0 votes

Big-Oh notation is a mathematical abstraction used to characterize the runtime of algorithms. O(1) describes an algorithm or program whose running time is bounded by a constant, regardless of the input it is given. That is, whether we provide 50 items or 50,000 items as input, the algorithm performs (at most) the same fixed amount of work. From this abstract point of view, there cannot be an algorithm that takes less than O(1) time, since any algorithm must perform at least some constant amount of work. However, when you measure the runtime empirically, the wall-clock times for different inputs may still differ.
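A minimal sketch of the distinction (the function names here are illustrative, not from the answer): an O(1) operation does the same fixed amount of work no matter how large the input is, while an O(n) operation scales with the input size.

```python
def first_item(items):
    # One index access: constant time whether len(items) is 50 or 50000,
    # so this is O(1).
    return items[0]

def total(items):
    # For contrast, summing visits every element, so this is O(n):
    # 50 steps for 50 items, 50000 steps for 50000 items.
    s = 0
    for x in items:
        s += x
    return s

small = list(range(50))
large = list(range(50000))

# first_item does the same single step in both cases; total does not.
print(first_item(small), first_item(large))
print(total(small), total(large))
```

Note that O(1) only bounds the *number of steps*; as the answer says, measured wall-clock times can still vary between runs due to caching, hardware, and other empirical factors.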
