SHEN’s CLASS NOTES
We wish to design algorithms whose complexity has as low an order of growth as possible.
The Growth of Functions (O, Ω, Θ notations)
We introduce some notations used to compare the growth rates of two complexity functions. We assume all complexity functions have positive values.
Definition 1 Let f(n) and g(n) be two functions from the set of non-negative integers to the set of real numbers. We say that f(n) is O(g(n)) (often written as f(n) = O(g(n))) if there are positive constants c and n0 such that f(n) ≤ cg(n) whenever n > n0.
Example 3.
n^3 + 2n + 5 = O(n^3) (c = 10, n0 = 1),  n lg n = O(n^2) (c = 1, n0 = 1).
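The witnesses in Example 3 can be spot-checked numerically. The sketch below (Python, testing a finite range of n, which of course only illustrates rather than proves the asymptotic claim) uses the constants c = 10, n0 = 1 from the example:

```python
# Spot-check the big-O witnesses from Example 3:
# f(n) = n^3 + 2n + 5 <= 10 * n^3 for all n > n0 = 1.
def f(n):
    return n**3 + 2*n + 5

c, n0 = 10, 1
assert all(f(n) <= c * n**3 for n in range(n0 + 1, 1001))
```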
Definition 2 Let f(n) and g(n) be two functions from the set of non-negative integers to the set of real numbers. We say that f(n) is Ω(g(n)) (often written as f(n) = Ω(g(n))) if there are positive constants c and n0 such that f(n) ≥ cg(n) whenever n > n0.

Example 4.
n^3 + 2n + 5 = Ω(n^3) (c = 1, n0 = 1),  n^2 = Ω(n lg n) (c = 1, n0 = 1).
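The Ω witnesses can be spot-checked the same way as the O witnesses; the sketch below (Python, finite range of n, illustrative only) uses c = 1, n0 = 1 from Example 4:

```python
# Spot-check the Omega witnesses from Example 4:
# f(n) = n^3 + 2n + 5 >= 1 * n^3 for all n > n0 = 1.
def f(n):
    return n**3 + 2*n + 5

c, n0 = 1, 1
assert all(f(n) >= c * n**3 for n in range(n0 + 1, 1001))
```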
Definition 3 Let f(n) and g(n) be two functions from the set of non-negative integers to the set of real numbers. We say that f(n) is Θ(g(n)) (often written as f(n) = Θ(g(n))) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Example 5.
n^3 + 2n + 5 = Θ(n^3) (by Examples 3 and 4).
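Since Θ combines the two previous bounds, the Θ claim in Example 5 amounts to a two-sided sandwich. The sketch below reuses the witnesses from Examples 3 and 4 (again a finite spot check, not a proof):

```python
# Theta(g) means O(g) and Omega(g) simultaneously: sandwich f between
# c1*g and c2*g. Witnesses c1 = 1, c2 = 10, n0 = 1 (Examples 3 and 4).
def f(n):
    return n**3 + 2*n + 5

c1, c2, n0 = 1, 10, 1
assert all(c1 * n**3 <= f(n) <= c2 * n**3 for n in range(n0 + 1, 1001))
```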
Definition 4 Let f(n) and g(n) be two functions from the set of non-negative integers to the set of real numbers. We say that f(n) is o(g(n)) (often written as f(n) = o(g(n))) if for any positive constant c > 0, there exists a constant n0 such that f(n) ≤ cg(n) whenever n > n0.

Obviously, f(n) = o(g(n)) means

    lim_{n→∞} f(n)/g(n) = 0.

Example 6.
200n = o(n lg n).
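For Example 6 the ratio in the limit can be written in closed form: (200n)/(n lg n) = 200/lg n, which falls below any fixed c > 0 once lg n > 200/c. A small sketch (Python) evaluates the ratio at two large inputs to illustrate:

```python
import math

# Little-o: 200n = o(n lg n). The ratio (200n)/(n lg n) simplifies to
# 200 / lg n, which tends to 0 as n grows.
def ratio(n):
    return 200 / math.log2(n)   # equals (200*n) / (n * lg n)

assert ratio(2**300) < 1        # for c = 1, any n0 >= 2**200 works
assert ratio(2**3000) < 0.1     # a smaller c just needs a larger n0
```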
Definition 5 Let f(n) and g(n) be two functions from the set of non-negative integers to the set of real numbers. We say that f(n) is ω(g(n)) (often written as f(n) = ω(g(n))) if for any positive constant c > 0, there exists a constant n0 such that f(n) ≥ cg(n) whenever n > n0.
Obviously, f(n) = ω(g(n)) means

    lim_{n→∞} f(n)/g(n) = ∞.
Note. ω-notation is very seldom used.
The following is a list of commonly used representative functions in the order of their growth rates:

    lg lg n, lg n, n, n lg lg n, n lg n, n(lg n)^2, n^2, n^3, 2^n, n!, n^n.
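As an illustration, one can tabulate these functions at a moderate value such as n = 64 and observe that the values already appear in the listed order (the asymptotic ranking is only guaranteed for sufficiently large n, but n = 64 happens to suffice for every pair in this list):

```python
import math

# Evaluate the representative functions at n = 64; their values should
# appear in the same (ascending) order as the growth-rate list.
n = 64
lg = math.log2
values = [
    ("lg lg n", lg(lg(n))),
    ("lg n", lg(n)),
    ("n", n),
    ("n lg lg n", n * lg(lg(n))),
    ("n lg n", n * lg(n)),
    ("n (lg n)^2", n * lg(n)**2),
    ("n^2", n**2),
    ("n^3", n**3),
    ("2^n", 2**n),
    ("n!", math.factorial(n)),
    ("n^n", n**n),
]
assert [v for _, v in values] == sorted(v for _, v in values)
```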
How to analyze the complexity of an algorithm
As we have discussed earlier, we use the total number of operations performed by an algorithm as its time complexity. If we consider the asymptotic complexity, we can further simplify our analysis: we only need to count the number of times the major operation is executed, where the major operation is the one that is executed most often.
Example 7.

Procedure Linear-Search(x, A[1..n])
Input: number x and array A of n numbers
Output: index i if A[i] = x, 0 otherwise
1  i ← 1
2  while (i ≤ n and x ≠ A[i])
3      do i ← i + 1
4  if i ≤ n
5      then return (i)
6      else return (0)
7  End

The major operations in this procedure are the comparisons between x and elements of A. As we can see, the number of such comparisons is at most n. Therefore, the time complexity is T(n) = O(n).
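The procedure translates directly into Python. The sketch below also counts the comparisons between x and elements of A to confirm the at-most-n bound (the counter is an addition for illustration, not part of the original pseudocode):

```python
def linear_search(x, A):
    """Return the 1-based index i with A[i] = x, or 0 if x is absent,
    together with the number of x-vs-A[i] comparisons performed."""
    comparisons = 0
    i = 1
    while i <= len(A):
        comparisons += 1          # one comparison between x and A[i]
        if x == A[i - 1]:         # Python lists are 0-based
            return i, comparisons
        i += 1
    return 0, comparisons

# The comparison count never exceeds n, so T(n) = O(n).
idx, cnt = linear_search(7, [3, 1, 7, 9])
assert idx == 3 and cnt == 3
idx, cnt = linear_search(5, [3, 1, 7, 9])
assert idx == 0 and cnt == 4
```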