
Chapter 1

Analysis of Algorithm

Algorithm: a computational procedure consisting of a finite set of instructions that, given an
input from some set of possible inputs, enables us to obtain an output through a systematic
execution of the instructions.
An algorithm is a sequence of unambiguous instructions
for solving a problem, i.e., for obtaining a required output
for any legitimate input in a finite amount of time.
An algorithm can be implemented in more than one
programming language.
A problem can be solved in more than one way. Hence,
many solution algorithms can be derived for a given problem.
Given a computational problem, an algorithm describes a specific computational procedure for achieving
the desired input/output relationship.
Example of algorithms:
• Sorting – bubble sort, insertion sort, etc.
• Internet algorithms – routing algorithms, searching algorithms
• Security algorithms – public-key cryptography
• Optimization algorithms – resources allocation
Expressing algorithms:
• flowchart
• pseudocode
How do we choose between the different sorting algorithms?
• We need to analyze them and choose the one with a better performance.
Analysis of Algorithms
The process of determining the performance of a program or algorithm.
• Performance: amount of computer memory and time needed to run a program of given algorithm
• There are two methods to determine the performance of a program
◦ A Priori Analysis − This is a theoretical analysis of an algorithm that is performed before
implementation. Efficiency of an algorithm is measured by assuming that all other factors, for
example, processor speed, are constant and have no effect on the implementation.
◦ A Posteriori Analysis − This is an empirical analysis of an algorithm that is performed after
implementation. The selected algorithm is implemented in a programming language and
executed on the target machine. In this analysis, actual statistics, such as running time
and space required, are collected.
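The distinction can be seen in practice. The sketch below illustrates a posteriori analysis: a simple algorithm is implemented and its actual running time is measured on the machine at hand (the function name and test sizes here are illustrative, not from the notes):

```python
import timeit

def sum_first_n(n):
    # A simple linear-time algorithm: add the first n integers one by one.
    total = 0
    for i in range(n):
        total += i
    return total

# A posteriori analysis: measure actual running time after implementation.
for n in (1_000, 10_000, 100_000):
    t = timeit.timeit(lambda: sum_first_n(n), number=10)
    print(f"n = {n:>7}: {t:.5f} s for 10 runs")
```

A priori analysis, by contrast, would reason about this algorithm before running it: one addition per iteration gives a cost proportional to n, regardless of processor speed.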
Characteristics of an Algorithm
• Unambiguous – Algorithm should be clear and unambiguous. Each of its steps (or phases), and
their inputs/outputs should be clear and must lead to only one meaning.
• Input − An algorithm should have 0 or more well-defined inputs.
• Output − An algorithm should have 1 or more well-defined outputs, and should match the
desired output.
• Finiteness − Algorithms must terminate after a finite number of steps.
• Feasibility − Should be feasible with the available resources.
• Independent − An algorithm should have step-by-step directions, which should be independent
of any programming code.
Algorithm Complexity
Suppose X is an algorithm and n is the size of input data, the time and space used by the algorithm X
are the two main factors, which decide the efficiency of X.
• Time Factor − Time is measured by counting the number of key operations such as
comparisons in the algorithm.
• Space Factor − Space is measured by counting the maximum memory space required by the
algorithm.
The complexity of an algorithm, f(n), gives the running time and/or the storage space required by the
algorithm in terms of n, the size of the input data.
Space complexity
Space complexity of an algorithm represents the amount of memory space required by the algorithm in
its life cycle. The space required by an algorithm is equal to the sum of the following two components
• A fixed part that is a space required to store certain data and variables, that are independent of
the size of the problem. For example, simple variables and constants used, program size, etc.
• A variable part is a space required by variables, whose size depends on the size of the problem.
For example, dynamic memory allocation, recursion stack space, etc.
The space complexity S(P) of any algorithm P is S(P) = C + SP(I), where C is the fixed part and SP(I) is the
variable part of the algorithm, which depends on an instance characteristic I. Example:
an algorithm uses three variables, A, B, and C, and one constant, 10. Counting one memory word per item,
the constant contributes the fixed part C = 1 and the variables contribute SP(I) = 3, so
S(P) = C + SP(I) = 1 + 3 = 4 words.
The actual space depends on the data types of the variables and constants, and is multiplied accordingly.
Space complexity of a program – the amount of memory it needs to run to completion. A problem may
have several possible solutions with differing space requirements.
More generally, the space needed by a program has the following components:
a. Instruction space – space needed to store the compiled version of the program's instructions.
b. Data space – space needed to store all constant and variable values. This has two components:
• space needed by constants and simple variables
• space needed by component variables such as arrays, structures, and dynamically allocated
memory
c. Environment stack space – the stack is used to store information needed to resume execution of partially
completed functions.
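The split between the fixed and variable parts of data space can be made concrete. In this sketch (function name and sizes are illustrative), the list is the variable part that grows with the problem size n, while the loop counter and accumulator form a fixed part:

```python
import sys

def build_list(n):
    # Variable part SP(I): the list's memory grows with the problem size n.
    data = list(range(n))
    # Fixed part C: the accumulator takes constant space regardless of n.
    total = 0
    for x in data:
        total += x
    return data, total

for n in (10, 100, 1000):
    data, _ = build_list(n)
    # The list's memory footprint grows roughly linearly with n.
    print(n, sys.getsizeof(data))
```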
Time Complexity
Time complexity of an algorithm represents the amount of time required by the algorithm to run to
completion. Time requirements can be defined as a numerical function T(n), where T(n) can be
measured as the number of steps, provided each step consumes constant time.
For example, addition of two n-bit integers takes n steps. Consequently, the total computational time is
T(n) = c ∗ n, where c is the time taken for the addition of two bits. Here, we observe that T(n) grows
linearly as the input size increases.
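This step-counting view of T(n) = c ∗ n can be sketched directly: instrument a linear pass so that it reports how many key operations it performs (the function below is my own illustration, not from the notes):

```python
def count_steps_linear(n):
    # Count the key operations in a single linear pass, mirroring T(n) = c * n.
    steps = 0
    total = 0
    for i in range(n):
        total += i   # one addition per iteration: the "key operation"
        steps += 1
    return steps

# The step count grows linearly with the input size.
print(count_steps_linear(10))    # 10
print(count_steps_linear(100))   # 100
```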
ASYMPTOTIC ANALYSIS
Asymptotic analysis of an algorithm refers to describing its run-time performance in mathematical
terms. Using asymptotic analysis, we can determine the best-case, average-case, and
worst-case behaviour of an algorithm.
Usually, the time required by an algorithm falls under three types
• Best Case − Minimum time required for program execution.
• Average Case − Average time required for program execution.
• Worst Case − Maximum time required for program execution.
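The three cases can be seen in a single algorithm. Linear search is a standard illustration (the instrumented version below is a sketch of my own): the number of comparisons depends on where, or whether, the target appears.

```python
def linear_search(arr, target):
    # Returns (index, comparisons). The comparison count varies with the input.
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

arr = [5, 3, 8, 1, 9]
print(linear_search(arr, 5))  # best case: target is first, 1 comparison
print(linear_search(arr, 9))  # target is last, n comparisons
print(linear_search(arr, 7))  # worst case: target absent, n comparisons
```

The best case takes a single comparison, the worst case takes n, and the average case (target equally likely at each position) takes about n/2.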
Asymptotic Notations
Following are the commonly used asymptotic notations to calculate the running time complexity of an
algorithm.
• Ο Notation
• Ω Notation
• θ Notation
Big Oh Notation, Ο
The notation Ο(f(n)) is the formal way to express the upper bound of an
algorithm's running time. It measures the worst-case time complexity,
i.e., the longest time an algorithm can possibly take to
complete.

Omega Notation, Ω
The notation Ω(f(n)) is the formal way to express the lower bound of an
algorithm's running time. It measures the best-case time complexity, i.e., the
minimum time an algorithm can possibly take to complete.

Theta Notation, θ
The notation θ(f(n)) is the formal way to express both the lower bound and
the upper bound of an algorithm's running time: the running time is bounded
above and below by constant multiples of the same function f(n).
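A rough numerical illustration of these bounds (the function, constants c, and threshold n₀ below are chosen by hand for this sketch): a running time of 3n + 10 is Θ(n), since it is squeezed between constant multiples of n for all sufficiently large n.

```python
def f(n):
    # Hypothetical running time of some algorithm.
    return 3 * n + 10

# Claim: f(n) is O(n) and Ω(n), hence Θ(n).
# Witness constants: lower bound 3n <= f(n), upper bound f(n) <= 4n for n >= 10.
c_lower, c_upper, n0 = 3, 4, 10
for n in range(n0, 1000):
    assert c_lower * n <= f(n) <= c_upper * n
print("3n + 10 is Θ(n): bounded above and below by constant multiples of n")
```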

Algorithm Design Techniques/Strategies


• Brute force
• Divide and conquer
• Decrease and conquer
• Transform and conquer
• Greedy approach
• Dynamic programming
• Backtracking and branch-and-bound
• Space and time trade-offs
• Iterative improvement
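As a small sketch of one of these strategies: divide and conquer splits a problem into smaller subproblems of the same kind. Binary search on a sorted list is a classic instance, halving the search range at each step:

```python
def binary_search(arr, target, lo=0, hi=None):
    # Divide and conquer: compare with the middle element, recurse on one half.
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1            # empty subproblem: target is absent
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # search right half
    return binary_search(arr, target, lo, mid - 1)       # search left half

print(binary_search([1, 3, 5, 8, 9], 8))   # 3
print(binary_search([1, 3, 5, 8, 9], 4))   # -1
```

Because each call halves the range, the running time is Ο(log n), versus Ο(n) for the brute-force linear scan.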
