Topic 1 - Introduction To Algorithm and Analysis
Big-O Notation
• In general, this is the notion that we use to characterize the complexity of algorithms
• We perform upper-bound analysis on algorithms
Comparing Algorithms w.r.t. Complexity
• Consider the scenario where algorithms A1 and A2 have complexities:
CA1(n) = 0.5n², CA2(n) = 5n
• Which one has the larger complexity?
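A quick worked comparison (my own illustration, not from the slides) shows where the two cost functions cross:
\[ C_{A1}(n) \le C_{A2}(n) \iff 0.5\,n^{2} \le 5n \iff n \le 10 \]
So A1 is cheaper only for inputs of size at most 10; for every larger n the quadratic term dominates and A2 has the smaller cost, which is why we compare algorithms by their growth rates rather than at a single input size.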
• Definition: f(x) is O(g(x)) if there exist constants c and k such that |f(x)| ≤ c·|g(x)| whenever x > k
Figure: Illustrating Big-O
Examples:
• Is 17n² – 5 = O(n²)?
17n² – 5 ≤ 17n² for all n ≥ 1 (c = 17, n₀ = 1)
Therefore 17n² – 5 = O(n²)
• Is 35n³ + 100 = O(n³)?
35n³ + 100 ≤ 36n³ for all n ≥ 5 (c = 36, n₀ = 5)
Therefore 35n³ + 100 = O(n³)
• Is 6·2ⁿ + n² = O(2ⁿ)?
6·2ⁿ + n² ≤ 7·2ⁿ for all n ≥ 5 (c = 7, n₀ = 5)
Therefore 6·2ⁿ + n² = O(2ⁿ)
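A small numeric spot-check of the last example (my own sketch, not part of the slides; a finite check is of course not a proof, just a sanity check of the chosen witnesses c = 7 and n₀ = 5):

#include <iostream>

int main()
{
    bool holds = true;
    for (unsigned n = 5; n <= 40; ++n) {
        unsigned long long p = 1ULL << n;              // 2^n
        unsigned long long lhs = 6 * p + 1ULL * n * n; // 6*2^n + n^2
        unsigned long long rhs = 7 * p;                // c*2^n with c = 7
        if (lhs > rhs) holds = false;
    }
    std::cout << (holds ? "6*2^n + n^2 <= 7*2^n holds for 5 <= n <= 40\n"
                        : "witness check failed\n");
    return 0;
}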
Theorem:
Let f(x) = aₙxⁿ + aₙ₋₁xⁿ⁻¹ + … + a₁x + a₀, where a₀, a₁, …, aₙ₋₁, aₙ are real numbers.
Then f(x) is O(xⁿ)
Proof of the Theorem
|f(x)| = |aₙxⁿ + aₙ₋₁xⁿ⁻¹ + … + a₁x + a₀|
≤ |aₙ|xⁿ + |aₙ₋₁|xⁿ⁻¹ + … + |a₁|x + |a₀|
= xⁿ(|aₙ| + |aₙ₋₁|/x + … + |a₁|/xⁿ⁻¹ + |a₀|/xⁿ)
≤ xⁿ(|aₙ| + |aₙ₋₁| + … + |a₁| + |a₀|) for x ≥ 1
Therefore |f(x)| ≤ C·xⁿ
when C = |aₙ| + |aₙ₋₁| + … + |a₁| + |a₀| and x > 1
Therefore f(x) is O(xⁿ) with C = |aₙ| + |aₙ₋₁| + … + |a₁| + |a₀| and k = 1
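As a quick instance of the theorem (my own example, not from the slides), take f(x) = 2x³ – 5x + 7:
\[ |2x^{3} - 5x + 7| \le 2x^{3} + 5x + 7 \le (2 + 5 + 7)\,x^{3} = 14\,x^{3} \quad \text{for } x \ge 1 \]
so f(x) is O(x³) with C = 14 and k = 1, exactly as the proof prescribes: C is the sum of the absolute values of the coefficients.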
Estimating Function Growth: Example
1 + 2 + … + n ≤ n + n + … + n = n²
So (1 + 2 + … + n) is O(n²), with C = 1 and k = 1
Estimating Function Growth: Example
n! = 1 · 2 · 3 · … · n ≤ n · n · n · … · n = nⁿ
So n! is O(nⁿ), with C = 1 and k = 1
Taking the log:
log n! ≤ log nⁿ = n log n
So log n! is O(n log n), with C = 1 and k = 1
Growth Rates: Popular Complexity Classes
Figure 2. Growth rates of some important complexity classes, from slowest- to fastest-growing: log n, n (linear time), n log n, n², n³, 2ⁿ, n!
Growth Rates: Illustration
Assume that we have a machine that can execute 1,000,000 operations per second
[Table: running times of Algorithm 1 through Algorithm 5 on this machine]
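As a back-of-the-envelope illustration (my own numbers, for a hypothetical input size n = 100 on this machine):
\[ n^{2} = 10^{4} \text{ ops} \approx 0.01 \text{ s}, \qquad n^{3} = 10^{6} \text{ ops} \approx 1 \text{ s}, \qquad 2^{n} \approx 1.3\times 10^{30} \text{ ops} \approx 1.3\times 10^{24} \text{ s} \approx 4\times 10^{16} \text{ years} \]
This is why the higher complexity classes in Figure 2 become unusable long before the lower ones do.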
Big-Ω Notation
• Definition: f(n) = Ω(g(n)) if there exist positive constants c and k such that |f(n)| ≥ c·|g(n)| for all n > k
Figure: Illustrating Ω
Examples:
• Is 3n + 2 = Ω(n)?
3n + 2 ≥ 3n for all n ≥ 1 (c = 3, n₀ = 1)
Therefore 3n + 2 = Ω(n)
• Is 3n + 2 = Ω(1)?
3n + 2 ≥ 1 for all n ≥ 1 (c = 1, n₀ = 1)
Therefore 3n + 2 = Ω(1); the inequality holds for any value of n₀
• Is 6·2ⁿ + n² = Ω(n¹⁰⁰)?
We need 6·2ⁿ + n² ≥ c·n¹⁰⁰ for all n ≥ n₀, for some constants c and n₀
When n is big enough, 2ⁿ grows faster than n¹⁰⁰, so yes, such an n₀ can be found
Therefore 6·2ⁿ + n² = Ω(n¹⁰⁰)
Big-Θ Notation
• Definition: f(n) = Θ(g(n)) if and only if there exist three positive constants c₁, c₂, n₀ such that c₁·|g(n)| ≤ |f(n)| ≤ c₂·|g(n)| for all n ≥ n₀
Figure: Illustrating Θ
Examples:
• Is 3n + 2 = Θ(n)?
3n ≤ 3n + 2 ≤ 4n for all n ≥ 2 (c₁ = 3, c₂ = 4, n₀ = 2)
Therefore 3n + 2 = Θ(n)
• Is 3n + 2 = Θ(n²)?
3n + 2 = O(n²), but 3n + 2 ≠ Ω(n²)
Therefore 3n + 2 ≠ Θ(n²)
Examples:
• Is 6·2ⁿ + n² = Θ(2ⁿ)?
6·2ⁿ ≤ 6·2ⁿ + n² ≤ 7·2ⁿ for all n ≥ 4 (c₁ = 6, c₂ = 7, n₀ = 4)
Therefore 6·2ⁿ + n² = Θ(2ⁿ)
• Is 4n³ + 3n² = Θ(n²)?
4n³ + 3n² ≠ O(n²)
Therefore 4n³ + 3n² ≠ Θ(n²)
Some Useful Rules
1) Transitive Property:
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
2) Reflexive Property:
f(n) = O(f(n))
f(n) = Ω(f(n))
f(n) = Θ(f(n))
Some Useful Rules
3) Symmetric Property:
f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
Little-o Notation
• Definition: f(n) = o(g(n)) if for every c > 0 there exists k ∈ ℕ such that f(n) < c·g(n) for all n > k
• Example: n² = o(n² log n)
Little-ω Notation
• Definition: For functions f and g from the set of integers to the set of real numbers, we say f(n) = ω(g(n)) if for every c > 0 there exists k ∈ ℕ such that f(n) ≥ c·g(n) for all n > k
• Example: n² = ω(n log n)
• Note: For functions f and g, f(n) = o(g(n)) if and only if g(n) = ω(f(n))
Sorting Problem
• Input: A sequence of n numbers: a₁, a₂, a₃, …, aₙ
• Output: A permutation (a′₁, a′₂, a′₃, …, a′ₙ) of the input sequence a₁, a₂, a₃, …, aₙ such that a′₁ ≤ a′₂ ≤ a′₃ ≤ … ≤ a′ₙ
Insertion Sort
• Efficient for sorting a small number of elements (we shall see shortly)
• Loop Invariant: At the start of each iteration of the algorithm, the subarray A[0, 1, ..., j-1] contains the elements originally in A[0, 1, …, j-1], but in sorted order
Insertion Sort: Pseudo Code
Insertion-Sort(A)
1. For j ← 1 to (length(A) – 1) Do
2. key ← A[j]
//Insert A[j] into the sorted sequence A[0 ... j-1]
3. i ← j – 1
4. While i ≥ 0 and A[i] > key Do
5. A[i + 1] ← A[i]
6. i ← i – 1
7. End of While Loop
8. A[i + 1] ← key //either i = –1 or A[i] ≤ key, so key belongs at position i + 1
9. End of For Loop
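A minimal C++ sketch of the pseudo code above (my own translation; the function name, the use of std::vector, and the example values are assumptions, not part of the slides):

#include <cstddef>
#include <vector>

void insertionSort(std::vector<int>& A)
{
    for (std::size_t j = 1; j < A.size(); ++j) {   // step 1
        int key = A[j];                            // step 2
        int i = static_cast<int>(j) - 1;           // step 3
        while (i >= 0 && A[i] > key) {             // step 4
            A[i + 1] = A[i];                       // step 5: shift right
            --i;                                   // step 6
        }
        A[i + 1] = key;                            // step 8: insert key
    }
}

For example, calling insertionSort on the vector {5, 2, 4, 6, 1, 3} rearranges it in place to {1, 2, 3, 4, 5, 6}.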
Insertion Sort: Correctness
• Initialization:
• Before the first loop starts, j=1.
• So, A[0] is a subarray of a single element and so is trivially sorted
• Maintenance:
• The outer for loop has its index moving like j=1, 2, …, n-1 (if A has n elements)
• At the beginning of the for loop assume that the array is sorted from A[0, ..., j-1]
• The inner while loop of the jth iteration places A[j] at its correct position
• Thus at the end of the jth iteration, the array is sorted from A[0, ..., j]
• Thus, the invariant is maintained
• Then j becomes j+1.
• Also, using the same inductive reasoning the elements are also the same as in the
original array in the locations A[0, …, j].
Insertion Sort: Correctness
• Termination:
• The for loop terminates when j = n
• Thus, by the previous observations the array is sorted from A[0, …, n-1] and
the elements are also the same as in the original array.
Insertion Sort: Best Case Run Time
• Assign a constant cost to each step of the pseudo code (c1 through c6 for steps 1–6, c7 for step 8), and let tⱼ be the number of times the while-loop test at step 4 is executed for a given j; then
T(n) = c1·n + c2·(n – 1) + c3·(n – 1) + c4·Σ tⱼ + c5·Σ (tⱼ – 1) + c6·Σ (tⱼ – 1) + c7·(n – 1), where each Σ runs over j = 1 to n – 1
• Consider the case when the input array A[0, …, n-1] is already sorted
• Then the while loop at step 4 finds A[i] ≤ key on its first test, so tⱼ = 1 for every j, and
T(n) = (c1 + c2 + c3 + c4 + c7)·n – (c2 + c3 + c4 + c7)
The best case run time of Insertion Sort is thus a linear function of n
Insertion Sort: Worst Case Run Time
• Consider the case when the input array A[0, …, n-1] is reverse-sorted
• For every value of j = 1 to n – 1:
• tⱼ = ???
• The while loop at step 4 finds A[i] > key at every iteration
• The loop terminates only after key has been compared with all the elements of A[0, …, j-1]
• So, clearly tⱼ = j for all j = 1 to n – 1
T(n) = c1·n + c2·(n – 1) + c3·(n – 1) + c4·Σ j + c5·Σ (j – 1) + c6·Σ (j – 1) + c7·(n – 1), where each Σ runs over j = 1 to n – 1
= (c4/2 + c5/2 + c6/2)·n² + (c1 + c2 + c3 + c7 – c4/2 – 3c5/2 – 3c6/2)·n + (c5 + c6 – c2 – c3 – c7)
The worst case run time of Insertion Sort is thus a quadratic function of n
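A small counting sketch (my own illustration; the array size n = 1000 and the variable names are assumptions) that makes the quadratic bound concrete: on a reverse-sorted input the inner-loop body runs 1 + 2 + … + (n – 1) = n(n – 1)/2 times in total, and the program below confirms this by counting.

#include <iostream>
#include <vector>

int main()
{
    const int n = 1000;
    std::vector<int> A(n);
    for (int i = 0; i < n; ++i) A[i] = n - i;   // reverse-sorted input

    long long shifts = 0;                       // inner-loop body executions
    for (int j = 1; j < n; ++j) {
        int key = A[j];
        int i = j - 1;
        while (i >= 0 && A[i] > key) {
            A[i + 1] = A[i];
            --i;
            ++shifts;
        }
        A[i + 1] = key;
    }
    std::cout << "shifts = " << shifts
              << ",  n(n-1)/2 = " << 1LL * n * (n - 1) / 2 << "\n";
    return 0;
}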
Insertion Sort: Average Case Run Time
• Instead of an input of a particular type, all the inputs of the given
size are equally probable here
• For every value of j = 1 to n – 1:
• tⱼ = ???
• We may assume that the elements in the subarray A[0, ..., j-1] are randomly chosen
• Then, on average, half of those elements are greater than key while half are less
• Hence, on average, tⱼ = j/2 for all j = 1 to n – 1
T(n) = c1·n + c2·(n – 1) + c3·(n – 1) + c4·Σ (j/2) + c5·Σ (j/2 – 1) + c6·Σ (j/2 – 1) + c7·(n – 1), where each Σ runs over j = 1 to n – 1
= (c4/4 + c5/4 + c6/4)·n² + (c1 + c2 + c3 + c7 – c4/4 – 5c5/4 – 5c6/4)·n + (c5 + c6 – c2 – c3 – c7)
The average case run time of Insertion Sort is thus a quadratic function of n
Insertion Sort: In a Nutshell
Worst case: Input reverse sorted.
T(n) = Σ Θ(j) = Θ(n²), where the sum runs over j = 2 to n  [arithmetic series]
Average case: All permutations equally likely.
T(n) = Σ Θ(j/2) = Θ(n²), where the sum runs over j = 2 to n
Is insertion sort a fast sorting algorithm?
• Moderately so, for small n.
• Not at all, for large n.
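A timing sketch (my own; the input sizes, the random seed, and the doubling comparison are illustrative assumptions, and absolute times will vary by machine). It makes the point concrete: each doubling of n roughly quadruples insertion sort's running time, so it quickly stops being practical for large n.

#include <chrono>
#include <cstddef>
#include <iostream>
#include <random>
#include <vector>

// Insertion sort as in the pseudo code above (0-indexed, in place).
static void insertionSort(std::vector<int>& A)
{
    for (std::size_t j = 1; j < A.size(); ++j) {
        int key = A[j];
        int i = static_cast<int>(j) - 1;
        while (i >= 0 && A[i] > key) { A[i + 1] = A[i]; --i; }
        A[i + 1] = key;
    }
}

// Time one run of insertion sort on a random array of size n (in seconds).
static double timeSort(std::size_t n)
{
    std::mt19937 gen(42);
    std::uniform_int_distribution<int> dist(0, 1000000);
    std::vector<int> A(n);
    for (int& x : A) x = dist(gen);
    auto start = std::chrono::steady_clock::now();
    insertionSort(A);
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(stop - start).count();
}

int main()
{
    for (std::size_t n : {2000, 4000, 8000, 16000})
        std::cout << "n = " << n << "  time = " << timeSort(n) << " s\n";
    // Each line should take roughly 4x as long as the previous one
    // (quadratic growth), while the smallest size finishes almost instantly.
    return 0;
}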
Recursion
• Themes
• Recursion
• Recurrence Relations
• Recursive Definitions
• Induction (prove properties of recursive programs and objects defined
recursively)
• Examples
• Tower of Hanoi
• Merge Sort
• GCD
Recurrence Relations
• A Recurrence Relation for the sequence {aₙ} is an equation that expresses aₙ in terms of one or more of the previous terms of the sequence, namely a₀, a₁, …, aₙ₋₁, for all integers n with n ≥ n₀, where n₀ is a nonnegative integer
• A sequence {aₙ} is called a solution of a recurrence relation if its terms satisfy the recurrence relation
• Note: A recurrence relation is like a recursively defined sequence, but without specifying any initial values (initial conditions)
• The same recurrence relation can have (and usually has) multiple solutions, depending on the choice of initial condition(s)
• If both the initial conditions and the recurrence relation are specified, then
the sequence is uniquely determined
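A small worked example of the last two points (my own, not from the slides): consider the recurrence aₙ = 2aₙ₋₁ for n ≥ 1.
\[ a_0 = 3 \;\Rightarrow\; a_n = 3\cdot 2^{n}, \qquad a_0 = 5 \;\Rightarrow\; a_n = 5\cdot 2^{n} \]
Both sequences satisfy the same recurrence relation, so the relation alone has many solutions; adding the initial condition a₀ picks out exactly one of them.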
Tower of Hanoi
• Problem Statement:
• There are three towers
• 64 gold disks, with decreasing sizes, placed on the first tower
• You need to move all of the disks from the first tower to the second tower
• Larger disks cannot be placed on top of smaller disks
• The third tower can be used to temporarily hold disks
• Challenge: Assume one disk can be moved in 1 second. Is it possible to move all 64 disks in a week?
• To create an algorithm to solve this problem, it is convenient to
generalize it to an “N-disk” problem, where in our case N = 64
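To answer the challenge, we can use the standard fact about this puzzle (stated here without derivation) that moving n disks requires 2ⁿ – 1 moves, together with one move per second:
\[ 2^{64} - 1 \approx 1.8\times 10^{19} \text{ moves} \approx 1.8\times 10^{19} \text{ s} \approx 5.8\times 10^{11} \text{ years}, \quad \text{while one week} \approx 6.05\times 10^{5} \text{ s} \]
So no, it is not remotely possible within a week; the number of moves grows exponentially in the number of disks.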
Tower of Hanoi: Recursive Solution
[Figure: pegs A, B, C, step-by-step illustration of the general recursive solution]
Tower of Hanoi: Recursive Algorithm
#include <iostream>
#include <string>
using std::string;

/* Print one move of the top disk from peg a to peg b
   (a minimal stand-in for the Move routine used on the slide) */
void Move(const string& a, const string& b)
{
    std::cout << "Move top disk from " << a << " to " << b << "\n";
}

void Hanoi(int n, string a, string b, string c)
{
    if (n == 1)                 /* base case: a single disk moves directly */
        Move(a, b);
    else {                      /* recursion */
        Hanoi(n - 1, a, c, b);  /* move the top n-1 disks from a to c */
        Move(a, b);             /* move the largest disk from a to b */
        Hanoi(n - 1, c, b, a);  /* move the n-1 disks from c onto b */
    }
}
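A usage sketch (the main function below is my addition, not part of the slides): calling the routine for three disks prints the seven moves illustrated on the following N = 3 slides.

#include <string>

void Move(const std::string& a, const std::string& b);          // defined above
void Hanoi(int n, std::string a, std::string b, std::string c); // defined above

int main()
{
    Hanoi(3, "A", "B", "C");   // move 3 disks from peg A to peg B, via peg C
    return 0;                  // prints 2^3 - 1 = 7 moves
}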
Tower of Hanoi: Recursive Solution for N=3
[Figure: pegs A, B, C, step-by-step illustration of the recursive solution for N = 3]
Solving Recurrence Relations
• Iteration Method
• Substitution Method
• Master Method
Iteration Method
• Expand the recurrence iteratively until the boundary condition is reached
• Use mathematical induction to show that the resulting guess is correct
T(n) = n + 2n + 4n + 8n + … (log₂ n terms)
= n·(1 + 2 + 4 + …, log₂ n terms)
≤ n·(2^(log₂ n) – 1)/(2 – 1) = n·(n – 1)
Therefore, T(n) = O(n²)
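As an additional illustration of the method (my own example, using the Tower of Hanoi move count from earlier rather than the recurrence above): take T(n) = 2T(n – 1) + 1 with T(1) = 1 and expand iteratively:
\[ T(n) = 2T(n-1) + 1 = 4T(n-2) + 2 + 1 = \cdots = 2^{n-1}T(1) + \bigl(2^{n-1} - 1\bigr) = 2^{n} - 1 \]
The boundary condition T(1) = 1 stops the expansion, and the closed form 2ⁿ – 1 can then be confirmed by induction.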
Solving Recurrence Relations: Master Method
• The Master Method is based on the Master Theorem, stated as follows.
Master Method
• The Master Theorem applies only when the recurrence relation has the following form:
T(n) = a·T(n/b) + f(n), where a ≥ 1 and b > 1
• Case 1: If f(n) = O(n^c) for some constant c < log_b a, then T(n) = Θ(n^(log_b a))
• Case 2: If f(n) = Θ(n^c) where c = log_b a, then T(n) = Θ(n^c · log n)
• Case 3: If f(n) = Ω(n^c) where c > log_b a, and the regularity condition holds, then T(n) = Θ(f(n))
Master Method
• The master method is mainly derived from the recurrence tree method
• If the work done at the leaves is polynomially more than the work done at the root, then the leaves dominate and the result is the work done at the leaf level (Case 1)
• If the work done at the leaves and at the root is asymptotically the same, then the result is the height of the tree multiplied by the work done at any one level (Case 2)
• If the work done at the root is asymptotically more, then the result is the work done at the root (Case 3)
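A worked instance of these cases (my own example, using the Merge Sort recurrence; Merge Sort is listed among the recursion examples earlier): T(n) = 2T(n/2) + n.
\[ a = 2,\; b = 2,\; \log_b a = 1,\; f(n) = n = \Theta\!\left(n^{\log_b a}\right) \;\Rightarrow\; \text{Case 2} \;\Rightarrow\; T(n) = \Theta(n \log n) \]
Here the work at every level of the recursion tree is about n, so the total is the height of the tree times the per-level work, matching the Case 2 description above.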
Master Method: Examples
• Solve the following recurrences using the Master Method: