Lecture 4 Class
Insertion Sort - Analysis
We count the number of primitive operations executed by an algorithm.
Algorithm Analysis
What is a Good Algorithm?
Efficient:
• Running time
• Space used
How to measure efficiency?
Efficiency as a function of input size:
• The number of bits in an input number
• Number of data elements (numbers, points)
Asymptotic Notation
Asymptotic notation allows us to analyze an algorithm's running time by describing how it behaves as the input size grows. This is also known as the algorithm's growth rate.
We are usually interested in an upper bound on the running time of the algorithm, that is, the worst case of the algorithm.
The asymptotic notation for this is Big-O.
Big O
Formal mathematical definition of Big O.
Let T(n) and f(n) be two positive functions.
We write T(n) = O(f(n)), and say that T(n) has order of
f(n) or T(n) is of order f(n),
if there are positive constants M and n₀ such that
T(n) ≤ M·f(n) for all n ≥ n₀.
Example
Suppose T(n) = 6n² + 8n + 10
Then we know for n ≥ 1
6n² ≤ 6n²
8n ≤ 8n²
& 10 ≤ 10n²
Therefore T(n) ≤ 24n² for all n ≥ 1
& T(n) = O(n²)
In this case M = 24 & n₀ = 1
Note: M & n₀ are not unique
Can we find different M & n₀ for the above example?
M = 10 & n₀ = 4
We would say that the running time of this algorithm grows as n²
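The bound above can be checked numerically; a minimal sketch in Python, using the constants worked out in the example:

```python
# Check that T(n) = 6n^2 + 8n + 10 satisfies T(n) <= M * n^2 for n >= n0.
def T(n):
    return 6 * n**2 + 8 * n + 10

# Constants from the example: M = 24, n0 = 1.
assert all(T(n) <= 24 * n**2 for n in range(1, 10_000))

# The alternative constants also work: M = 10, n0 = 4.
assert all(T(n) <= 10 * n**2 for n in range(4, 10_000))
```

Of course, a finite check like this is not a proof; the algebra above is what actually establishes the bound for all n.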
Sloppy Notation
The notation T(n) ∊ O(f(n)) can be used even when f(n)
grows much faster than T(n).
For example,
If T(n) = 6n² + 8n + 10
We may write T(n) = O(n³).
This is indeed true, but not very useful.
Hierarchy of functions: log n < n < n² < n³ < 2ⁿ
Caution!
Beware of very large constant factors. An algorithm
running in time 100000n is still O(n), but for small inputs it may be less
efficient in practice than one running in time 2n², which is O(n²)
Example
How do you show that 10n³ ≠ O(n²)?
By Contradiction
Assume 10n³ = O(n²)
Then there are positive constants M and n₀ such
that 10n³ ≤ M·n² for all n ≥ n₀.
Therefore, 10n/M ≤ 1 for all n ≥ n₀.
This is certainly not true for n > max{M, n₀}: for such n, 10n/M > 10 > 1.
Therefore our assumption was wrong.
Comparing Algorithms
Comparison between algorithms can be done easily.
Let's take an example to understand this:
Suppose we have two algorithms whose execution times are given by the
following expressions:
Expression 1: 20n² + 3n - 4
Expression 2: n³ + 100n - 2
Then as per asymptotic notation, we should just worry about how each
function will grow as the value of n (the input size) grows.
And that depends entirely on n² for Expression 1
and on n³ for Expression 2.
Hence, we can clearly say that the running time represented by
Expression 2 will grow faster than the other one,
simply by comparing the highest-order terms.
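A quick numeric sketch of this comparison, using the two expressions from the example (the specific cutoff n = 20 below is an illustrative check, not a derived constant):

```python
def t1(n):  # Expression 1: 20n^2 + 3n - 4
    return 20 * n**2 + 3 * n - 4

def t2(n):  # Expression 2: n^3 + 100n - 2
    return n**3 + 100 * n - 2

# Once n is large enough, the n^3 term dominates and
# Expression 2 is always the larger of the two.
assert all(t2(n) > t1(n) for n in range(20, 100_000))
```

Note that for a few small values of n, Expression 2 is actually smaller; asymptotic notation only speaks about sufficiently large n.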
Good & Bad Solutions
When analysing algorithms we will often come across the
following time complexities.
Complexity:
Good News
• O(1)
• O(log n)
• O(n)
• O(n log n)
________________________
Bad News
• O(nᵏ), k ≥ 2
• O(kⁿ), k ≥ 2
• O(n!)
How to Determine Complexities
How to determine the running time of a piece of code?
• The answer is that it depends on what kinds of statements are used.
Sequence of statements
statement 1;
statement 2;
...
statement k;
The total time is found by adding the times for all statements:
total time = time(statement 1) + time(statement 2) + ... +
time(statement k)
If each statement is "simple" (only involves basic operations) then the
time for each statement is constant and the total time is also constant:
O(1)
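As an illustration, here is a hypothetical sequence of simple statements in Python; each statement does a fixed amount of work, so the whole function is O(1):

```python
def constant_time(n):
    # Three simple statements: each involves only basic
    # operations, so each is O(1) and the total is O(1).
    a = n + 1
    b = a * 2
    c = b - n
    return c
```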
How to Determine Complexities
If-Then-Else
if (cond)
then block 1 (sequence of statements)
else
block 2 (sequence of statements)
end if;
Here, either block 1 will execute, or block 2 will execute.
Therefore, the worst-case time is the slower of the two
possibilities: max(time(block 1), time(block 2))
If block 1 takes O(1) and block 2 takes O(N), the if-then-else
statement would be O(N).
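A small illustrative example of exactly this case (the function and its branches are hypothetical): one branch is O(1), the other scans the whole input, so the worst case of the if-then-else is O(N):

```python
def branch_example(values):
    # Worst-case cost is max(time(block 1), time(block 2)).
    if not values:
        return 0            # block 1: O(1)
    else:
        return sum(values)  # block 2: O(N) scan -> the statement is O(N)
```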
How to Determine Complexities
Loops
for I in 1 .. N loop
sequence of statements
end loop;
• The loop executes N times, so the sequence of statements
also executes N times.
• If we assume the statements are O(1), then the total time for the
for loop is N * O(1), which is O(N) overall.
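The same loop written in Python (a hypothetical body that just accumulates a sum):

```python
def loop_example(n):
    total = 0
    # The body is O(1) and runs n times -> O(n) overall.
    for i in range(1, n + 1):
        total += i
    return total
```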
How to Determine Complexities
Nested loops
for I in 1 .. N loop
for J in 1 .. M loop
sequence of statements
end loop;
end loop;
• The outer loop executes N times.
• Every time the outer loop executes, the inner loop executes M
times.
• As a result, the statements in the inner loop execute a total of N *
M times.
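A Python sketch of the nested loops above, counting how often the inner body runs (the counter is there purely to make the N * M total visible):

```python
def nested_loops(n, m):
    count = 0
    # The outer loop runs n times; for each outer iteration,
    # the inner loop runs m times.
    for i in range(n):
        for j in range(m):
            count += 1  # executes exactly n * m times -> O(n * m)
    return count
```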
Big - Omega
The “big-Omega” – Ω
f(n) = Ω(g(n)) if there exist constants c and n₀ such
that c·g(n) ≤ f(n) for n ≥ n₀
• Used to give a lower bound.
It is important to know how many
computations are necessary to
solve the problem.
Equivalently, any solution to the problem has to
perform that many computations.
Generally not easy to prove.
Big - Theta
“The Big Theta” – θ
f(n) = θ(g(n)) if there exist constants c₁, c₂
and n₀ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for n ≥ n₀
f(n) = θ(g(n)) if and only if
f(n) is O(g(n)) and f(n) is Ω(g(n))
• Asymptotically tight bound
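A numeric sanity check of a tight bound. The function and constants here are illustrative, not from the slides: f(n) = 3n² + 2n is θ(n²) with, for example, c₁ = 3, c₂ = 5, n₀ = 1 (since 3n² ≤ 3n² + 2n always, and 2n ≤ 2n² for n ≥ 1):

```python
def f(n):
    return 3 * n**2 + 2 * n

# Witness constants for f(n) = theta(n^2), with g(n) = n^2.
c1, c2, n0 = 3, 5, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```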
Selection Sort
• Big Idea
Orders a list of values by repeatedly putting the
smallest unplaced value into its final position.
Algorithm:
– Look through the list to find the smallest value.
– Swap it so that it is at index 0.
– Look through the list to find the second-
smallest value.
– Swap it so that it is at index 1.
...
– Repeat until all values are in their proper
places.
How Selection Sort Works
Example:
Consider the following list to be sorted in ascending order using
selection sort : 8, 2, 14, 7, 6
Selection sort works as follows:
• Find the smallest element, 2.
• Swap it with the first element of the unordered list.
The new list looks like: [2 | 8, 14, 7, 6] (sorted | unsorted)
This step partitions the list into two parts:
a left part which is sorted & a right part which is unsorted.
How Selection Sort Works
• Next, find the smallest element of the unsorted part, 6.
• Swap it with the first element of the unsorted part, 8.
The new list looks like: [2, 6 | 14, 7, 8] (sorted | unsorted)
Again this step partitions the list into two parts:
a left part which is sorted & a right part which is unsorted.
• Similarly, it continues to sort the remaining elements.
• Finally, the sorted list in ascending order is:
2, 6, 7, 8, 14
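The procedure walked through above can be sketched in Python; this is a straightforward in-place version (not taken from the slides), run on the same example list:

```python
def selection_sort(a):
    """Sort list a in place in ascending order; O(n^2) comparisons."""
    n = len(a)
    for i in range(n - 1):
        # Find the index of the smallest value in the unsorted part a[i:].
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # Swap it into position i, extending the sorted left part.
        a[i], a[min_idx] = a[min_idx], a[i]
    return a

print(selection_sort([8, 2, 14, 7, 6]))  # prints [2, 6, 7, 8, 14]
```

The nested loops mirror the analysis earlier in the lecture: the outer loop runs n - 1 times and the inner scan shrinks by one each pass, giving O(n²) comparisons overall.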