Best case: The best case time complexity describes the running time of an algorithm on the input for which it finishes fastest.
Worst case: The worst case time complexity describes the running time of an algorithm on the input for which it takes the longest.
Average case: Average case analysis gives the expected running time of an algorithm over all possible inputs of a given size.
Complexity: Complexity refers to the rate at which the required storage or running time grows as a function of the problem size.
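As an illustration of these cases, consider a simple linear search. This is a hypothetical example (not taken from the text): the best case occurs when the target is the first element, the worst case when it is the last element or absent, and the average case when the target is equally likely to be at any position.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i          # found: stop early
    return -1                 # not found: every element was examined

data = [7, 3, 9, 1, 5]
linear_search(data, 7)   # best case:  target is first  -> 1 comparison, O(1)
linear_search(data, 5)   # worst case: target is last   -> n comparisons, O(n)
linear_search(data, 4)   # worst case: target is absent -> n comparisons, O(n)
# average case: if the target is equally likely to be at any position,
# about n/2 comparisons are made on average, which is still O(n).
```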
O(expression) is the set of functions that grow slower than or at the same rate as expression. It indicates the maximum time required by an algorithm over all inputs of a given size, and so it represents the worst case of an algorithm's time complexity.
The function f(n) = O(g(n)) iff (if and only if) there exist positive constants c and n₀ such that f(n) ≤ c*g(n) for all n ≥ n₀.
Examples:
1. 3n+1 = O(n)
   as 3n+1 ≤ 4n for all n ≥ 1.
   c = 4 & n₀ = 1
2. 3n+4 = O(n)
   as 3n+4 ≤ 5n for all n ≥ 2.
   c = 5 & n₀ = 2
3. 10n²+4n+2 = O(n²)
   as 10n²+4n+2 ≤ 11n² for all n ≥ 5.
   c = 11 & n₀ = 5
As illustrated by the previous examples, the statement f(n) = O(g(n)) states only that g(n) is an upper bound on the value of f(n), up to the constant factor c, for all n ≥ n₀. It does not say anything about how good this bound is.
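The constants in the examples above can be checked numerically. The following is a minimal sketch (the helper name `holds` is my own, not part of the text) that verifies f(n) ≤ c*g(n) for every n from n₀ up to some sample limit:

```python
def holds(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example 1: 3n + 1 = O(n) with c = 4, n0 = 1
print(holds(lambda n: 3*n + 1, lambda n: n, c=4, n0=1))                 # True

# Example 2: 3n + 4 = O(n) with c = 5, n0 = 2
print(holds(lambda n: 3*n + 4, lambda n: n, c=5, n0=2))                 # True

# Example 3: 10n^2 + 4n + 2 = O(n^2) with c = 11, n0 = 5
print(holds(lambda n: 10*n**2 + 4*n + 2, lambda n: n**2, c=11, n0=5))   # True
```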
| Order | Name | Examples |
|---|---|---|
| O(1) | Constant | Few algorithms without any loops |
| O(log n) | Logarithmic | Searching algorithms |
| O(n) | Linear | Algorithms that scan a list of size n |
| O(n log n) | n-log-n | Divide-and-conquer algorithms |
| O(n²) | Quadratic | Operations on n-by-n matrices |
| O(n³) | Cubic | Nontrivial algorithms from linear algebra |
| O(2ⁿ) | Exponential | Algorithms that generate all subsets of a set |
| O(n!) | Factorial | Algorithms that generate all permutations of a set |
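The sketch below gives small, self-contained Python functions whose running times fall into some of these classes. These are my own illustrative examples, not ones given in the text:

```python
def get_first(items):                      # O(1): constant, no loop
    return items[0]

def binary_search(sorted_items, target):   # O(log n): halves the search range each step
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def total(items):                          # O(n): scans the whole list once
    s = 0
    for x in items:
        s += x
    return s

def matrix_add(a, b):                      # O(n^2): touches every entry of an n-by-n matrix
    n = len(a)
    return [[a[i][j] + b[i][j] for j in range(n)] for i in range(n)]

def all_subsets(items):                    # O(2^n): generates every subset of a set
    if not items:
        return [[]]
    rest = all_subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]
```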
If the same type of data is processed but the magnitude of n increases, the running time grows according to the algorithm's order of complexity; for sufficiently large n, an algorithm of lower order takes less time than one of higher order. Thus
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ)
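This ordering can be seen by tabulating the growth of each function as n increases. A small illustrative script (assuming base-2 logarithms, which only changes values by a constant factor):

```python
import math

growth = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n**2),
    ("O(2^n)",     lambda n: 2**n),
]

for n in (2, 8, 16, 32):
    row = ", ".join(f"{name}={func(n):.0f}" for name, func in growth)
    print(f"n={n:>2}: {row}")
# Each row grows from left to right, and the gaps widen as n increases,
# illustrating O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n).
```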