Chapter 2 - Algorithm and Algorithm Analysis
(CoSc2083– 5ECTS)
Sem. I - 2021
Department of Computer Science
Institute of Technology
Ambo University
Chapter 2: Algorithm and Algorithm Analysis
Produce outputs: an algorithm must produce one or more outputs.
Example: Display a student's information.
Finiteness: an algorithm must always terminate after a finite number of steps.

Finite:
    int i = 0;
    while (i < 10)
    {
        cout << i;
        i++;
    }

Infinite:
    int i = 0;
    while (true)
    {
        cout << "Hello World";
    }
Absence of ambiguity (Definiteness): the algorithm should have one and only one interpretation during execution. Each step must be clearly defined; at each point in the computation, one should be able to tell exactly what happens next.
Example 2:
    if (5 > 7)
    {
        cout << "hello"; // not executed because the condition is always false
    }
Properties of algorithms
Correctness: it must compute the correct answer for all possible legal inputs. The output should be as expected, required, and correct.
Simplicity: a good general rule is that each step should carry out one logical operation. What is simple to one processor may not be simple to another.
Note: Running time is the most important of these measures, since computational time is the most precious resource in most problem domains.
There are two approaches to measuring the efficiency of algorithms:
1. Empirical
2. Theoretical
1. Empirical (Computational) Analysis
Here the total running time of the program is considered. It uses the actual system clock time.
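As a minimal sketch of this approach (the function work and the input size are placeholders, not part of the original notes), the system clock is read before and after the code under test using the C++ <chrono> library:

#include <iostream>
#include <chrono>
using namespace std;

// Placeholder for the code being measured: sums the integers 1..n.
long long work(long long n)
{
    long long sum = 0;
    for (long long i = 1; i <= n; i++)
        sum += i;
    return sum;
}

int main()
{
    auto start = chrono::steady_clock::now();  // read clock before
    long long result = work(10000000);         // run the code under test
    auto stop = chrono::steady_clock::now();   // read clock after

    auto ms = chrono::duration_cast<chrono::milliseconds>(stop - start);
    cout << "result = " << result << ", elapsed = " << ms.count() << " ms" << endl;
    return 0;
}

Because the measurement depends on the machine and on the other processes running at the same time, repeated runs typically give slightly different results.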
Examples (counting time units per statement):

7) T(n) = 1 + (1 + (n+1) + n + 5n) + 1
        = 7n + 4 = O(n)

8) T(n) = 1 + 1 + (n+1) + n + n·(1 + (n+1) + n + n)
        = 3 + 2n + n² + 2n + 2n²
        = 3n² + 4n + 3 = O(n²)
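The count in example 8 matches a doubly nested loop of the following shape (a plausible reconstruction; the original snippet is not shown here, and the variable names are assumptions):

int k = 0;                       // 1
for (int i = 0; i < n; i++)      // 1 + (n+1) + n
    for (int j = 0; j < n; j++)  // n·(1 + (n+1) + n)
        k++;                     // n·n
// Total: 1 + 1 + (n+1) + n + n·(1 + (n+1) + n + n) = 3n² + 4n + 3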
Formal Approach to Analysis
Simple Loops: there is 1 addition per iteration of the loop, hence n additions in total.
Consecutive Statements: formally, add the running times of the separate blocks of your code.
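For instance (a sketch; the variable names are illustrative), a simple loop followed by a nested loop gives O(n) + O(n^2) = O(n^2) for the fragment as a whole:

int n = 100;                     // illustrative input size
int sum = 0;
for (int i = 0; i < n; i++)      // simple loop: O(n)
    sum = sum + i;

for (int i = 0; i < n; i++)      // nested loops: O(n^2)
    for (int j = 0; j < n; j++)
        sum = sum + i * j;
// consecutive blocks: O(n) + O(n^2) = O(n^2) overall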
Example (worst case):
While sorting, the list is in reverse order.
While searching, the desired item is located at the last position or is missing.
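The searching case as a sketch (the function and variable names are assumed for illustration): in a linear search, the worst case forces all n comparisons, because the target sits in the last slot or is absent.

// Returns the index of target in a[0..n-1], or -1 if it is absent.
int linearSearch(const int a[], int n, int target)
{
    for (int i = 0; i < n; i++)   // worst case: all n iterations run
        if (a[i] == target)
            return i;             // best case: found at i == 0
    return -1;                    // missing item: n comparisons were made
}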
Asymptotic Notations
Asymptotic Analysis is concerned with how the running time of an algorithm
increases with the size of the input in the limit, as the size of the input increases
without bound!
Example: for f(n) = 3n² + 4n + 1 and g(n) = n², we have 4n ≤ 4n² and 1 ≤ n² for all n ≥ 1, so f(n) ≤ 3n² + 4n² + n² = 8n² for all n ≥ 1.
Therefore, f(n) is O(n²), (c=8, k=1): there exist two constants that satisfy the constraints.
…
Big-Omega (Ω)-Notation (Lower bound)
Definition: We write f(n) = Ω(g(n)) if there are positive constants n0 and c such that to the right of n0, the value of f(n) always lies on or above c·g(n).
As n increases, f(n) grows no slower than g(n). It describes the best case analysis. Used to represent the amount of time the algorithm takes on the smallest possible set of inputs: "Best case".
Example: Find g(n) such that f(n) = Ω(g(n)) for f(n) = 3n + 5. Take g(n) = √n with c = 1 and k = 1: 3n + 5 ≥ n ≥ √n for all n ≥ 1, so f(n) = 3n + 5 = Ω(√n).
…
Theta Notation (Θ-Notation) (Optimal bound)
Definition: We say f(n) = Θ(g(n)) if there exist positive constants n0, c1 and c2 such that to the right of n0, the value of f(n) always lies between c1·g(n) and c2·g(n) inclusive, i.e., c1·g(n) ≤ f(n) ≤ c2·g(n), for all n ≥ n0.
As n increases, f(n) grows as fast as g(n). It describes the average case analysis. Used to represent the amount of time the algorithm takes on an average set of inputs: "Average case".
Example: Find g(n) such that f(n) = Θ(g(n)) for f(n) = 2n² + 3. Since n² ≤ 2n² + 3 ≤ 3n² for all n ≥ 2, take c1 = 1, c2 = 3 and n0 = 2 ==> f(n) = Θ(n²).
…
Little-oh (small-oh) Notation
Definition: We say f(n) = o(g(n)) if for every positive constant c there is a positive constant n0 such that to the right of n0, the value of f(n) lies strictly below c·g(n).
As n increases, g(n) grows strictly faster than f(n). Little-oh denotes an upper bound that is not asymptotically tight; Big-O denotes an upper bound that may or may not be asymptotically tight.
Example: Find g(n) such that f(n) = o(g(n)) for f(n) = n².
n² < c·n³ whenever n > 1/c, so g(n) = n³ works: f(n) = o(n³).
Likewise n² < c·n⁴ for all sufficiently large n, so g(n) = n⁴ works: f(n) = o(n⁴).
Note that g(n) = n² does not work: n² < 2n² for all n > 1 only shows n² = O(n²); the strict bound must hold for every c, so n² ≠ o(n²).
We can usually determine the relative growth rates of two functions f(n) and g(n) by computing lim n→∞ f(n)/g(n). The limit has four possible outcomes:
The limit is 0: this means that f(n) = o(g(n)).
The limit is c ≠ 0: this means that f(n) = Θ(g(n)).
The limit is infinity: this means that g(n) = o(f(n)).
The limit oscillates: this means that there is no relation between f(n) and g(n).
Example:
n³ grows faster than n², so we can say that n² = O(n³) or n³ = Ω(n²).
f(n) = n² and g(n) = 2n² grow at the same rate, so both f(n) = O(g(n)) and f(n) = Ω(g(n)) are true.
If f(n) = 2n², then f(n) = O(n⁴), f(n) = O(n³), and f(n) = O(n²) are all correct, but the last is the tightest and therefore the best answer.
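Applying the limit rule above to these examples as a quick check: lim n→∞ n²/n³ = lim n→∞ 1/n = 0, so n² = o(n³) and hence n² = O(n³); lim n→∞ n²/(2n²) = 1/2 ≠ 0, so n² = Θ(2n²), which gives both the O and the Ω relations at once.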
…
Complexity Category

T(n)                    Dominant term F(n)    Big-O
c (c is a constant)     1                     T(n) = O(1)
10 log n + 5            log n                 T(n) = O(log n)
√n + 2                  √n                    T(n) = O(√n)
5n + 3                  n                     T(n) = O(n)
3n log n + 5n + 2       n log n               T(n) = O(n log n)
10n² + n log n + 1      n²                    T(n) = O(n²)
5n³ + 2n² + 5           n³                    T(n) = O(n³)
2ⁿ + n⁵ + n + 1         2ⁿ                    T(n) = O(2ⁿ)
7n! + 2ⁿ + n² + 1       n!                    T(n) = O(n!)
8nⁿ + 2ⁿ + n² + 3       nⁿ                    T(n) = O(nⁿ)
…
…
Arrangement of common functions by growth rate
List of typical growth rates:

Function    Name
c           Constant
log N       Logarithmic
log² N      Log-squared
N           Linear
N log N     Log-Linear
N²          Quadratic
N³          Cubic
2^N         Exponential

The order of the body statements of a given algorithm is very important in determining the Big-Oh of the algorithm.

Example: Find the Big-Oh of the following algorithms.

1)
for (int i = 1; i <= n; i++)
    sum = sum + i;
T(n) = 2·n = 2n = O(n).

2)
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= n; j++)
        k++;
T(n) = 1·n·n = n² = O(n²).
Chapter 3: Simple Sorting and
Searching Algorithms