
Algorithm Analysis

Algorithm
• An algorithm is a set of instructions to be followed to
solve a problem.
– There can be more than one solution (more than one
algorithm) to solve a given problem.
– An algorithm can be implemented using different
programming languages on different platforms.
• An algorithm must be correct. It should correctly solve
the problem.
– e.g. For sorting, this means the output must be correct
even if (1) the input is already sorted, or (2) it
contains repeated elements.
• Once we have a correct algorithm for a problem, we
have to determine the efficiency of that algorithm.
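For illustration, one way to check a sorting routine's correctness is to verify that
its output is non-decreasing; the sketch below uses a hypothetical helper named
is_sorted, and the sample arrays are assumed values for the example only.

#include <stdio.h>

/* Returns 1 if a[0..n-1] is in non-decreasing order, 0 otherwise. */
int is_sorted(const int a[], int n)
{
    for (int i = 1; i < n; i++)
        if (a[i - 1] > a[i])      /* equal neighbours are allowed, so duplicates pass */
            return 0;
    return 1;
}

int main(void)
{
    int already_sorted[] = {1, 2, 2, 3};            /* sorted, with a duplicate */
    int unsorted[]       = {3, 1, 2, 2};
    printf("%d\n", is_sorted(already_sorted, 4));   /* prints 1 */
    printf("%d\n", is_sorted(unsorted, 4));         /* prints 0 */
    return 0;
}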
Algorithmic Performance
There are two aspects of algorithmic performance:
• Time
• Instructions take time.
• How fast does the algorithm perform?
• What affects its runtime? 
• Space
• Data structures take space
• What kind of data structures can be used?
• How does choice of data structure affect the runtime?
 We will focus on time:
– How to estimate the time required for an algorithm
– How to reduce the time required
Data Structures
• Introduction about different Data Structures
• Array
• Stack
• Queue
• Linked List
• Tree
• Graph
Standard Operations
• Operations to access and modify data
– Traversing
– Insertion
– Deletion
– Updation
– Sorting
– Searching
– Merging
Algorithms

• A set of rules for carrying out a calculation by
hand or with some machine.
• A finite sequence of instructions to perform any
computation.
• An Algorithm is a well defined computational
procedure that takes input and produces output.
• There are multiple algorithms to perform
different operations on each Data Structures.
Algorithms
• Is a sequence of precise instructions for solving
a problem in a finite amount of time.
• Properties of an effective algorithm:
– It must have a finite number of inputs and at
least one output.
– All instructions must be precise and unambiguous
(definiteness)
– It must give the correct solution in all cases
(correctness)
– It must terminate after a finite no. of steps; it
must eventually end (finiteness)
– Every instruction must be basic enough to be
carried out (effectiveness)
Problem Solving Phase
• Define the problem
• Outline the solution
– Identify a suitable technique
– Write the algorithm
– Check the correctness of the algorithm
– Check the complexity of algorithm for
comparison
Proof of Correctness
• If the algorithm gives the correct output for
every input instance, then it is correct.
• A proof of correctness is based on reasoning
about the algorithm itself.
• Alternatively, check the algorithm with all
possible test cases.
Analysis of Algorithm
• Two complexities to be considered:
– Time Complexity :- Number of computational
operations to be performed
– Space Complexity :- The memory space
occupied by the algorithm to store the required
data
Analysis of Algorithms
• When we analyze algorithms, we should employ
mathematical techniques that analyze algorithms
independently of specific implementations,
computers, or data.

• To analyze algorithms:
– First, we start to count the number of significant
operations in a particular solution to assess its
efficiency.
– Then, we will express the efficiency of algorithms
using growth functions.
The Execution Time of Algorithms
• Each operation in an algorithm (or a program) has a cost.
 Each operation takes a certain amount of time.

count = count + 1;  takes a certain amount of time, but it is constant

A sequence of operations:

count = count + 1; Cost: c1


sum = sum + count; Cost: c2

 Total Cost = c1 + c2
The Execution Time of Algorithms (cont.)
Example: Simple If-Statement
Cost Times
if (n < 0) c1 1
absval = -n c2 1
else
absval = n; c3 1

Total Cost <= c1 + max(c2,c3)


Analysis of Algorithm
• Find the sum of two numbers
int main()
{
    int a, b, sum = 0;
    scanf("%d %d", &a, &b);    /* read the two numbers */
    sum = a + b;
    printf("Sum is: %d", sum);
    return 0;
}
Analysis of Algorithm
• Find the sum of two numbers
int main()
{
    int a, b, sum = 0;
    scanf("%d %d", &a, &b);    /* read the two numbers */
    sum = a + b;
    printf("Sum is: %d", sum);
    return 0;
}
The complexity of this algorithm will be in the
O(1) or constant complexity.
Analysis of Algorithm
• Write an algorithm to print the addition table.
• Find the number of operations to be performed.
int s = 3, n = 10, i;
for (i = 1; i <= n; i++)
{
    printf("%d + %d = %d", s, i, s + i);
    printf("\n");
}
The complexity of this algorithm will be in the
O(?)
Analysis of Algorithm
• Write an algorithm to print the addition table.
• Find the number of operations to be performed.
int s = 3, n = 10, i;
for (i = 1; i <= n; i++)
{
    printf("%d + %d = %d", s, i, s + i);
    printf("\n");
}
The complexity of this algorithm will be in the
O(n)
• Consider any basic problem and try to write the
algorithm.
• Find the sum of elements of an array
• arr[10]
• sum = 0;                      1
• for (i = 0; i < n; i++)       1 + n + n
• {
•     sum = sum + arr[i];       n
• }
• return sum;                   1
• 1 + 1 + 3n + 1 = 3 + 3n
• Consider any basic problem and try to write the
algorithm.
• Find the sum of elements of an array
• arr[10]
• sum = 0;                      1
• for (i = 0; i < n; i++)       1 + n + n
• {
•     sum = sum + arr[i];       n
• }
• return sum;                   1
• Complexity is in O(n)
The Execution Time of Algorithms (cont.)
Example: Simple Loop
Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}

Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5


 The time required for this algorithm is proportional to n
Analysis of Algorithm
• Consider the given set of instructions and find the number of
operations to be carried out.
int main()
{
for (int i=0; i<2; i++)
{
for (int j=0; j<4; j++)
printf("%d, %d\n",i ,j);
}
}
The complexity of this algorithm will be in the O(?)
Analysis of Algorithm
• Consider the given set of instructions and find the number of
operations to be carried out.
int main()
{
for (int i=0; i<2; i++)
{
for (int j=0; j<4; j++)
printf("%d, %d\n",i ,j);
}
}
The complexity of this algorithm will be in the O(1)
The Execution Time of Algorithms (cont.)
Example: Nested Loop
Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5+n*n*c6+n*n*c7+n*c8
 The time required for this algorithm is proportional to n²
Algorithm Growth Rates
We measure an algorithm’s time requirement as a function of the
problem size.
 Problem size depends on the application: e.g. number of elements
in a list for a sorting algorithm, the number of disks for towers of
hanoi.
So, for instance, we say that (if the problem size is n)
 Algorithm A requires 5*n² time units to solve a problem of size n.
 Algorithm B requires 7*n time units to solve a problem of size n.
The most important thing to learn is how quickly the algorithm’s
time requirement grows as a function of the problem size.
 Algorithm A requires time proportional to n².
 Algorithm B requires time proportional to n.
An algorithm’s proportional time requirement is known as
growth rate.
We can compare the efficiency of two algorithms by comparing
their growth rates.
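As a quick illustration of growth rates versus constants, the sketch below (with
purely illustrative numbers) tabulates the two costs from above, 5*n² for Algorithm A
and 7*n for Algorithm B; A is cheaper only for very small n and soon dominates.

#include <stdio.h>

int main(void)
{
    /* Hypothetical time-unit costs: Algorithm A = 5*n*n, Algorithm B = 7*n. */
    for (long long n = 1; n <= 100000; n *= 10)
        printf("n = %7lld   A: 5*n^2 = %14lld   B: 7*n = %8lld\n",
               n, 5 * n * n, 7 * n);
    return 0;
}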
Common Growth Rates
Function      Growth-Rate Name
c             Constant
log N         Logarithmic
log² N        Log-squared
N             Linear
N log N
N²            Quadratic
N³            Cubic
2^N           Exponential
Order-of-Magnitude Analysis and Big O Notation
If Algorithm A requires time proportional to f(n),
Algorithm A is said to be order f(n), and it is denoted as
O(f(n)).
The function f(n) is called the algorithm’s growth-rate
function.
Since the capital O is used in the notation, this notation
is called the Big O notation.
If Algorithm A requires time proportional to n², it is
O(n²).
If Algorithm A requires time proportional to n, it is
O(n).
Growth-Rate Functions
O(1)         Time requirement is constant, and it is independent of the problem's size.
O(log₂n)     Time requirement for a logarithmic algorithm increases slowly
             as the problem size increases.
O(n)         Time requirement for a linear algorithm increases directly with the size
             of the problem.
O(n*log₂n)   Time requirement for an n*log₂n algorithm increases more rapidly than
             that of a linear algorithm.
O(n²)        Time requirement for a quadratic algorithm increases rapidly with the
             size of the problem.
O(n³)        Time requirement for a cubic algorithm increases more rapidly with the
             size of the problem than the time requirement for a quadratic algorithm.
O(2^n)       As the size of the problem increases, the time requirement for an
             exponential algorithm increases too rapidly to be practical.
What to Analyze
An algorithm can require different times to solve different problems
of the same size.
 E.g. searching an item in a list of n elements using sequential search.
 Cost: 1,2,...,n
Worst-Case Analysis – The maximum amount of time that an
algorithm requires to solve a problem of size n.
 This gives an upper bound for the time complexity of an algorithm.
 Normally, we try to find worst-case behavior of an algorithm.
Best-Case Analysis – The minimum amount of time that an
algorithm requires to solve a problem of size n.
 The best case behavior of an algorithm is NOT so useful.
Average-Case Analysis – The average amount of time that an
algorithm requires to solve a problem of size n.
 Sometimes, it is difficult to find the average-case behavior of an
algorithm.
 We have to look at all possible data organizations of a given size n, and
the probability distribution of these organizations.
 Worst-case analysis is more common than average-case analysis.
Sequential Search
int sequentialSearch(const int a[], int item, int n){
   int i;
   for (i = 0; i < n && a[i] != item; i++)
      ;                      // empty body: the loop only advances i
   if (i == n)
      return -1;             // item not found
   return i;                 // index of the first occurrence
}
Unsuccessful Search:  O(n)

Successful Search:
Best-Case: item is in the first location of the array O(1)
Worst-Case: item is in the last location of the array O(n)
Average-Case: The number of key comparisons is 1, 2, ..., n,
depending on where the item is found, so the average is
(1 + 2 + ... + n) / n = (n² + n) / (2n) = (n + 1) / 2   O(n)
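A minimal driver for sequentialSearch (the array contents below are just example
values) shows the best, worst, and unsuccessful cases:

#include <stdio.h>

int sequentialSearch(const int a[], int item, int n);   /* as defined above */

int main(void)
{
    int a[] = {7, 3, 9, 4, 1};
    printf("%d\n", sequentialSearch(a, 7, 5));   /*  0 : best case, first slot  */
    printf("%d\n", sequentialSearch(a, 1, 5));   /*  4 : worst case, last slot  */
    printf("%d\n", sequentialSearch(a, 8, 5));   /* -1 : unsuccessful search    */
    return 0;
}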
Binary Search
int binarySearch(int a[], int size, int x) {
   int low = 0;
   int high = size - 1;
   int mid;                  // mid will be the index of
                             // target when it's found
   while (low <= high) {
      mid = (low + high) / 2;
      if (a[mid] < x)
         low = mid + 1;
      else if (a[mid] > x)
         high = mid - 1;
      else
         return mid;
   }
   return -1;
}
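A minimal usage sketch for binarySearch; the array must already be sorted, and the
values here are only an example:

#include <stdio.h>

int binarySearch(int a[], int size, int x);   /* as defined above */

int main(void)
{
    int a[] = {2, 5, 8, 12, 16, 23, 38, 56};   /* sorted array of size 8  */
    printf("%d\n", binarySearch(a, 8, 23));    /*  5 : found at index 5   */
    printf("%d\n", binarySearch(a, 8, 7));     /* -1 : unsuccessful search */
    return 0;
}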
Binary Search – Analysis
 For an unsuccessful search:
  The number of iterations in the loop is log₂n + 1   O(log₂n)
 For a successful search:
  Best-Case: The number of iterations is 1   O(1)
  Worst-Case: The number of iterations is log₂n + 1   O(log₂n)
  Average-Case: The average number of iterations is < log₂n   O(log₂n)

 0  1  2  3  4  5  6  7      an array with size 8
 3  2  3  1  3  2  3  4      # of iterations to find each element
 The average # of iterations = 21/8 < log₂8
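These per-element counts can be reproduced by instrumenting the loop. The sketch
below uses a hypothetical helper, iterationsToFind, which runs the same loop as
binarySearch but returns the iteration count instead of the index:

#include <stdio.h>

int iterationsToFind(int a[], int size, int x)
{
    int low = 0, high = size - 1, count = 0;
    while (low <= high) {
        int mid = (low + high) / 2;
        count++;                               /* one iteration of the loop */
        if (a[mid] < x)       low = mid + 1;
        else if (a[mid] > x)  high = mid - 1;
        else                  return count;
    }
    return count;                              /* unsuccessful search */
}

int main(void)
{
    int a[] = {10, 20, 30, 40, 50, 60, 70, 80};   /* any sorted array of size 8 */
    int total = 0;
    for (int i = 0; i < 8; i++) {
        int c = iterationsToFind(a, 8, a[i]);
        printf("%d ", c);                         /* prints 3 2 3 1 3 2 3 4 */
        total += c;
    }
    printf("\naverage = %d/8\n", total);          /* average = 21/8 */
    return 0;
}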
How much better is O(log₂n)?
n                        O(log₂n)
16                       4
64                       6
256                      8
1,024 (1KB)              10
16,384                   14
131,072                  17
262,144                  18
524,288                  19
1,048,576 (1MB)          20
1,073,741,824 (1GB)      30
Analysis of sorting algorithm

Insertion sort
Step 1 − If it is the first element, it is already sorted.
Step 2 − Pick the next element.
Step 3 − Compare it with all elements in the sorted sub-list.
Step 4 − Shift all the elements in the sorted sub-list that
are greater than the value to be inserted.
Step 5 − Insert the value.
Step 6 − Repeat until the list is sorted.
Insertion-Sort(A)
  for j = 2 to A.length
     key = A[j]
     i = j - 1
     while i > 0 and A[i] > key
        A[i+1] = A[i]
        i = i - 1
     A[i+1] = key
Proof of correctness(Loop invariants method)
Initialization: It is true prior to the first iteration of the loop
Maintenance: If it is true before an iteration of the
loop, it remains true before the next iteration
Termination: When the loop terminates, the invariant
gives us a property that helps show that the algorithm
is correct
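To make the pseudocode and its invariant concrete, here is a sketch of insertion
sort in C (0-based indexing, so the outer loop starts at j = 1 rather than 2); the
comments mark where each part of the loop-invariant argument applies.

#include <stdio.h>

void insertionSort(int a[], int n)
{
    /* Invariant: at the start of each iteration, a[0..j-1] is sorted.      */
    /* Initialization: for j = 1 the sub-array a[0..0] is trivially sorted. */
    for (int j = 1; j < n; j++) {
        int key = a[j];
        int i = j - 1;
        /* Maintenance: shift the larger elements right, then insert key,   */
        /* so a[0..j] is sorted before the next iteration.                  */
        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];
            i = i - 1;
        }
        a[i + 1] = key;
    }
    /* Termination: when j reaches n, the invariant says a[0..n-1] is sorted. */
}

int main(void)
{
    int a[] = {5, 2, 4, 6, 1, 3};
    insertionSort(a, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", a[i]);        /* prints 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}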
