Big O notation
Big-O notation represents an upper bound on the running time of an algorithm; it gives the worst-case complexity of an algorithm. [1]
O(g(n)) = { f(n): there exist positive constants c and n0
such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }
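For example, with the illustrative choice f(n) = 3n + 2 and g(n) = n (values picked here to demonstrate the definition, not taken from the source):

3n + 2 ≤ 4n for all n ≥ 2, so 3n + 2 = O(n), witnessed by c = 4 and n0 = 2.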
It can also be said that f is "at most" g: f grows no faster than g, up to the constant factor c.
[1] https://www.programiz.com/dsa/asymptotic-notations