Fundamentals of Analysis of Algorithm Efficiency
• Analysis Framework
• Measuring an input size
• Units for measuring runtime
• Worst case, Best case and Average case
• Asymptotic Notations
ANALYSIS FRAMEWORK
• The efficiency of an algorithm can be measured in
terms of time or space.
• There is a systematic approach that has to be
applied for analyzing any given algorithm.
• This systematic approach is modelled by a
framework called the ANALYSIS FRAMEWORK.
Algorithm Analysis
Analysis of an algorithm is the process of investigating
the algorithm's efficiency with respect to two
resources:
– Running time
– Memory space
The reasons for selecting these two criteria are
– Simplicity
– Generality
– Speed
– Memory
ANALYSIS OF ALGORITHMS
Efficiency
• Time efficiency or time complexity indicates
how fast an algorithm runs.
• Space efficiency or space complexity is the
amount of memory units required by the
algorithm, including the memory needed for
the input and output.
Space complexity
• Space Complexity of an algorithm is total
space taken by the algorithm with respect to
the input size.
• To Compute the space complexity we use two
factors: Auxiliary Space, Input space
• Input space (fixed part) – space taken by the
instructions, variables and identifiers; for a
given program this part is constant
• Auxiliary space – space required for temporary
storage while the algorithm executes (e.g., the
stack and intermediate results)
Space complexity
• Constant Space Complexity
• Linear Space Complexity
• Quadratic Space Complexity
• Logarithmic Space Complexity
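As an illustration of linear space complexity (a minimal C sketch; the helper name `copy_array` is ours, not from the slides), the auxiliary buffer below grows in direct proportion to n:

```c
#include <stdlib.h>

/* Linear auxiliary space: the temporary buffer grows with n. */
int *copy_array(const int *a, int n)
{
    int *b = malloc(n * sizeof(int));   /* O(n) auxiliary space */
    for (int i = 0; i < n; i++)
        b[i] = a[i];
    return b;                           /* caller must free() */
}
```

By contrast, a function that uses only a fixed number of scalar variables regardless of n has constant space complexity.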
Space complexity
{
    int z = a + b + c;
    return z;
}
• variables a, b, c and z – 4 bytes each
• 4 bytes for the return value
• So (4(4) + 4) = 20 bytes
• This space requirement is fixed, hence it is called Constant Space Complexity.
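The fragment above can be made a complete function (a sketch; the name `sum3` is ours, and the 20-byte figure assumes `sizeof(int) == 4`, which is common but not guaranteed):

```c
/* Constant space: four int variables plus the return value,
   regardless of the values passed in - about 20 bytes if ints are 4 bytes. */
int sum3(int a, int b, int c)
{
    int z = a + b + c;
    return z;
}
```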
Linear Time Complexity
Calculating the sum of n numbers:
for (i = 0; i < n; i++)
{
    sum = sum + a[i];
}
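The loop above can be wrapped into a runnable function (a minimal sketch; the name `sum_array` is ours). The addition in the loop body executes exactly n times, so the work grows linearly with n:

```c
/* Linear time: the basic operation (the addition) runs exactly n times. */
int sum_array(const int a[], int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum = sum + a[i];
    return sum;
}
```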
Quadratic Time Complexity
Matrix addition:
for (i = 0; i < n; i++)
{
    for (j = 0; j < n; j++)
    {
        c[i][j] = a[i][j] + b[i][j];
    }
}
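A complete version of the matrix-addition loop (a sketch; the fixed order `N` and the name `matrix_add` are ours). The inner addition executes n*n times, which is what makes the algorithm quadratic:

```c
#define N 2   /* illustrative fixed order */

/* Quadratic time: the addition runs n*n times for n-by-n matrices. */
void matrix_add(int n, int a[][N], int b[][N], int c[][N])
{
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            c[i][j] = a[i][j] + b[i][j];
}
```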
Logarithmic Time Complexity
while(low <= high)
{
mid = (low + high) / 2;
if (target < list[mid])
high = mid - 1;
else if (target > list[mid])
low = mid + 1;
else
break;
}
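The loop above is the core of binary search; a complete function (a sketch, with the name `binary_search` ours) returns the index when found, or -1 otherwise. Each iteration halves the remaining range, hence the logarithmic time:

```c
/* Logarithmic time: each iteration halves the search range. */
int binary_search(const int list[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = (low + high) / 2;
        if (target < list[mid])
            high = mid - 1;
        else if (target > list[mid])
            low = mid + 1;
        else
            return mid;   /* found */
    }
    return -1;            /* not found */
}
```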
Time-Space Trade-off
• A time-space trade-off is a situation where
space efficiency can be achieved at the cost of
time, or time efficiency can be achieved at the
cost of memory.
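A classic illustration of the trade-off (a sketch; the function names are ours) is computing Fibonacci numbers: the naive recursion uses almost no extra memory but exponential time, while a precomputed table spends O(n) memory to get O(n) time:

```c
/* Little memory, exponential time. */
long fib_slow(int n)
{
    return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
}

/* An extra O(n) table of values buys O(n) time. */
long fib_fast(int n)   /* assumes 0 <= n <= 90 to avoid overflow */
{
    long t[91];
    t[0] = 0; t[1] = 1;
    for (int i = 2; i <= n; i++)
        t[i] = t[i - 1] + t[i - 2];
    return t[n];
}
```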
Measuring an Input Size
• The efficiency of an algorithm is measured as a
function of its input size or range.
• For example, when multiplying two matrices, there
are two natural measures of size:
– The matrix order n
– The total number of elements N in the matrices being
multiplied.
• The efficiency of the algorithm depends on the
number of multiplications performed, not merely on
the order of the matrices.
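Counting the basic operation directly makes the point concrete (a sketch; `count_multiplications` is our name): in the standard n-by-n matrix multiplication, the innermost product executes n*n*n times, regardless of the machine running it:

```c
/* Counts how often the basic operation (a multiplication) would execute
   in the standard n-by-n matrix multiplication: n * n * n times. */
long count_multiplications(int n)
{
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++)
                count++;   /* one multiplication a[i][k] * b[k][j] */
    return count;
}
```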
Units for Measuring Running Time
Measuring running time in standard time units (seconds, milliseconds) is problematic because of:
• Dependence on the speed of a particular computer
• Dependence on the quality of the program implementing the
algorithm
• Dependence on the compiler used to generate the machine code
• The difficulty of clocking the actual running time of the program
To measure algorithm efficiency instead:
• Identify the most important operation (the core logic) of the
algorithm; this operation is called the basic operation
• Count the number of times the basic operation is executed; this
count characterises the running time
• The basic operation is usually in the innermost loop, where it is
the most time-consuming
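As a sketch of the counting approach (the function name and out-parameter are ours): in finding the maximum of n elements, the basic operation is the comparison in the loop, and it executes exactly n - 1 times:

```c
/* The basic operation here is the comparison a[i] > max;
   it executes exactly n - 1 times, so C(n) = n - 1. */
int max_element(const int a[], int n, long *comparisons)
{
    int max = a[0];
    *comparisons = 0;
    for (int i = 1; i < n; i++) {
        (*comparisons)++;      /* count the basic operation */
        if (a[i] > max)
            max = a[i];
    }
    return max;
}
```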
Properties of Asymptotic Notations
General Property
Reflexive Property
Transitive Property
Symmetric Property
Transpose Symmetric Property
• General Property: If f(n) is O(g(n)) then a*f(n) is
also O(g(n)), where a is a constant.
• Example: f(n) = 2n²+5 is O(n²);
then 7*f(n) = 7(2n²+5) = 14n²+35 is also O(n²).
• This property also holds for the Θ and Ω notations:
If f(n) is Θ(g(n)) then a*f(n) is also Θ(g(n)), where
a is a constant.
If f(n) is Ω(g(n)) then a*f(n) is also Ω(g(n)), where
a is a constant.
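The general property can be checked numerically (a sketch; the function names are ours): with f(n) = 2n² + 5, the scaled function 7*f(n) = 14n² + 35 stays below c*n² for the witness constant c = 49 and every n ≥ 1, confirming it is still O(n²):

```c
long f(long n) { return 2 * n * n + 5; }

/* 7*f(n) = 14n^2 + 35 <= 49n^2 holds for every n >= 1,
   so 7*f(n) is O(n^2) with witness constant c = 49. */
int bound_holds(long n) { return 7 * f(n) <= 49 * n * n; }
```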
• Reflexive Property: If f(n) is given then f(n) is
O(f(n)).
• Example: f(n) = n²; then f(n) is O(n²), i.e. O(f(n)).
• This property also holds for the Θ and Ω notations:
If f(n) is given then f(n) is Θ(f(n)).
If f(n) is given then f(n) is Ω(f(n)).
• Transitive Property: If f(n) is O(g(n)) and g(n) is
O(h(n)) then f(n) is O(h(n)).
• Example: if f(n) = n, g(n) = n² and h(n) = n³,
then n is O(n²) and n² is O(n³),
so n is O(n³).
• This property also holds for the Θ and Ω notations:
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)) then f(n) is
Θ(h(n)).
If f(n) is Ω(g(n)) and g(n) is Ω(h(n)) then f(n) is
Ω(h(n)).
• Symmetric Property: If f(n) is Θ(g(n)) then
g(n) is Θ(f(n)).
• Example: f(n) = n² and g(n) = n²;
then f(n) = Θ(n²) and g(n) = Θ(n²).
• This property holds only for the Θ notation.
• Transpose Symmetric Property: If f(n) is
O(g(n)) then g(n) is Ω(f(n)).
• Example: f(n) = n, g(n) = n²;
then n is O(n²) and n² is Ω(n).
• This property relates the O and Ω
notations.