Design and Analysis of Algorithms
CAT 1
i) Big Oh (O) Notation
Big Oh (O) notation is used to describe an upper bound on the time or space
complexity of an algorithm. It represents the worst-case scenario: the growth rate of
the algorithm's running time (or space) is at most proportional to the given function as the
input size grows indefinitely.
ii) Little Omega (ω) Notation
Little Omega (ω) notation is used to describe a strict lower bound on the running time of an
algorithm: for every constant multiple of the given function, the running time eventually
exceeds it for sufficiently large inputs. In other words, the running time grows strictly
faster than the given function as the input size increases.
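The two notations above can be stated formally; this standard formulation is supplied here for reference and is not part of the original answer:

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 :\ 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0
f(n) = \omega(g(n)) \iff \forall\, c > 0,\ \exists\, n_0 :\ f(n) > c\,g(n) \ \text{for all } n \ge n_0
```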
iii) Recurrence
Refers to an equation that describes the running time of a recursive algorithm: the running
time on an input of size n is expressed in terms of the running time on smaller
subproblems, plus the cost of dividing the input and combining the results.
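For instance, merge sort's running time satisfies a recurrence of this kind (a standard textbook example, not taken from the original answer):

```latex
T(n) = 2\,T(n/2) + cn, \qquad T(1) = c
```

which unrolls to log₂ n levels of c·n work each, giving T(n) = O(n log n).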
iv) Amortized Analysis
Amortized Analysis is a technique for analysing the average running time of an algorithm
over a sequence of operations. Rather than looking at the worst-case time for a single
operation, amortized analysis considers the total cost over a series of operations and averages
the cost per operation.
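A classic illustration is a doubling dynamic array: an individual append occasionally costs O(n) (when the array resizes), but the average over a whole sequence of appends is O(1). The counter class below is an illustrative sketch, not from the original answer:

```python
class DynamicArray:
    """Doubling array: a resize copies every element, costing O(n) for that
    one append, yet n appends move fewer than 2n elements in total, so the
    amortized cost per append is O(1)."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0  # total elements moved by all resizes so far

    def append(self, x):
        if self.size == self.capacity:
            self.capacity *= 2        # double the storage
            self.copies += self.size  # cost of copying into the new array
        self.size += 1


arr = DynamicArray()
for i in range(1000):
    arr.append(i)
# Total copy work stays below 2n, hence amortized O(1) per append.
print(arr.copies, arr.size)
```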
v) Divide and Conquer without Merging
This refers to a divide-and-conquer approach in which the problem is divided into
subproblems in such a way that the solutions to the subproblems do not need to be merged
afterward. A classic example is binary search: the array is divided into two halves, only one
half is searched based on a comparison, and the other half is discarded, so there is no need
to combine the results of the two subproblems.
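A minimal Python sketch of binary search, illustrating that each step keeps only one half and no combine step is ever needed:

```python
def binary_search(arr, target):
    """Search a sorted list for target; return its index or -1.
    Each iteration halves the search range and discards the other half."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```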
3. Stack: Push and Pop Operations
A stack is a linear data structure that follows the Last In, First Out (LIFO) principle.
Elements are added and removed from the same end, called the top of the stack.
Push Operation: This operation adds an element to the top of the stack.
o Example: If the stack is [1, 2, 3], and you push 4, the stack becomes [1,
2, 3, 4].
Pop Operation: This operation removes the element from the top of the stack.
o Example: If the stack is [1, 2, 3, 4] and you pop, the stack becomes [1,
2, 3].
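The two operations can be sketched with a Python list, which behaves as a stack (append pushes onto the top, pop removes from the top):

```python
# A Python list used as a stack: the end of the list is the "top".
stack = [1, 2, 3]

stack.append(4)    # push 4: stack becomes [1, 2, 3, 4]
top = stack.pop()  # pop: returns 4, stack becomes [1, 2, 3]

print(stack, top)
```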
5. Worst, Best and Average Case Complexity
Worst Case Complexity: The maximum time or space an algorithm will take to solve
a problem, over all inputs of a given size. It is usually expressed in Big O notation.
o Example: For QuickSort, the worst case occurs when the pivot is always the
largest or smallest element, giving O(n²).
Best Case Complexity: The minimum time or space an algorithm will take to solve a
problem. It shows the most favorable input scenario.
o Example: For QuickSort, the best case occurs when the pivot divides the
array into two equal halves each time, giving O(n log n).
Average Case Complexity: The expected time or space an algorithm takes on
average, considering all possible inputs.
o Example: For QuickSort, the average case complexity is O(n log n).
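The three cases can be seen in a randomized quicksort sketch (Python; the random pivot is an assumption, since the original names no particular implementation, and it makes the O(n²) worst case unlikely in practice):

```python
import random

def quicksort(arr):
    """Average case O(n log n); worst case O(n^2) when the pivot is always
    the smallest or largest element. Choosing the pivot at random makes a
    consistently bad split very unlikely."""
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    left = [x for x in arr if x < pivot]
    mid = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quicksort(left) + mid + quicksort(right)
```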
7. Directed Acyclic Graph (DAG) and Principle of Optimality
Directed Acyclic Graph (DAG): A graph with directed edges where no cycles exist.
This structure is used in many algorithms, such as topological sorting and dynamic
programming.
Principle of Optimality: In dynamic programming, it states that an optimal solution
to a problem can be constructed from optimal solutions of its subproblems. It forms
the basis for solving problems by breaking them into overlapping subproblems.
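One algorithm that works precisely because a DAG has no cycles is topological sorting; the sketch below uses Kahn's algorithm (the adjacency-list encoding is an assumption for illustration):

```python
from collections import deque

def topological_sort(graph):
    """Kahn's algorithm: repeatedly output a vertex with in-degree 0 and
    remove its outgoing edges. graph: {vertex: [successors]}.
    Returns a valid ordering, or None if the graph contains a cycle."""
    indeg = {v: 0 for v in graph}
    for v in graph:
        for w in graph[v]:
            indeg[w] += 1
    queue = deque(v for v in graph if indeg[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    # If some vertices were never emitted, a cycle blocked them.
    return order if len(order) == len(graph) else None
```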
Prim’s Algorithm finds a Minimum Spanning Tree (MST) for a connected, weighted,
undirected graph. It starts from a single node and grows the MST one edge at a time, always
choosing the minimum weight edge that connects a vertex inside the MST to a vertex outside.
Steps:
1. Start with an arbitrary vertex as the initial tree.
2. Among all edges connecting a tree vertex to a non-tree vertex, pick the one with
minimum weight.
3. Add that edge and its new vertex to the tree.
4. Repeat until all vertices are in the tree.
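A heap-based sketch of Prim's algorithm in Python (the graph encoding {vertex: [(weight, neighbour), ...]} is an assumption chosen for illustration):

```python
import heapq

def prim_mst(graph, start):
    """Grow the MST from `start`, always taking the cheapest edge that
    crosses from tree vertices to a vertex not yet in the tree.
    graph: {vertex: [(weight, neighbour), ...]} for an undirected graph.
    Returns (total weight, list of (weight, vertex) additions)."""
    visited = {start}
    edges = list(graph[start])   # candidate edges leaving the tree
    heapq.heapify(edges)
    total = 0
    mst = []
    while edges and len(visited) < len(graph):
        w, v = heapq.heappop(edges)  # minimum-weight crossing edge
        if v in visited:
            continue                  # stale edge: both ends already in tree
        visited.add(v)
        total += w
        mst.append((w, v))
        for edge in graph[v]:
            if edge[1] not in visited:
                heapq.heappush(edges, edge)
    return total, mst
```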
Quicksort is generally preferred for large data sets due to its average-case time
complexity of O(n log n), despite its worst case of O(n²). It is faster than Insertion Sort
(O(n²)) and Heap Sort (O(n log n)) for most practical cases due to better cache performance
and lower constant factors.
11. Greedy Method Control Abstraction for Subset Paradigm
The greedy method selects a local optimal solution at each stage with the hope of finding the
global optimum. A common abstraction:
function GreedyMethod(a, n):
    solution = empty set
    for i = 1 to n:
        x = select next best option from a
        if feasible(solution, x):
            add x to solution
    return solution
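A concrete instance of the subset paradigm is the fractional knapsack problem; the Python sketch below (the function name and the example values are illustrative, not from the original) takes items greedily in order of value-to-weight ratio:

```python
def fractional_knapsack(items, capacity):
    """Greedy subset paradigm: repeatedly take as much as possible of the
    remaining item with the highest value-to-weight ratio. This greedy
    choice is optimal for the fractional variant of the problem.
    items: list of (value, weight) pairs."""
    # Sort once by ratio; this plays the role of "select next best option".
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # feasibility check
        total += value * (take / weight)   # add (a fraction of) the item
        capacity -= take
    return total
```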
State Space Tree: A tree representation of all the possible states that can be reached
in a problem.
Problem State: A specific configuration of the problem at a given moment in time.
Graph Coloring assigns colors to the vertices of a graph so that no two adjacent vertices
share the same color.
Strategy: Assign colors to the vertices one at a time, always choosing a color not already
used by an adjacent, colored vertex; if no color is available for some vertex, backtrack and
change an earlier assignment.
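A backtracking sketch of m-coloring in Python (the adjacency-list encoding and the function name are assumptions made for illustration):

```python
def color_graph(adj, m):
    """Backtracking m-coloring: try colors 0..m-1 for each vertex in turn;
    if none is valid (an adjacent vertex already uses it), undo the last
    assignment and backtrack. adj: {vertex: neighbours}.
    Returns a {vertex: color} dict, or None if no m-coloring exists."""
    vertices = list(adj)
    colors = {}

    def valid(v, c):
        # A color is valid if no neighbour of v already has it.
        return all(colors.get(u) != c for u in adj[v])

    def solve(i):
        if i == len(vertices):
            return True            # every vertex colored
        v = vertices[i]
        for c in range(m):
            if valid(v, c):
                colors[v] = c
                if solve(i + 1):
                    return True
                del colors[v]      # backtrack
        return False

    return colors if solve(0) else None
```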
An adjacency matrix is a 2D array used to represent a graph. If a graph has n vertices, the
adjacency matrix is an n × n matrix where the entry a[i][j] is 1 if there is an edge between
vertex i and vertex j, and 0 otherwise.
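A minimal Python sketch building such a matrix (edges are treated as undirected, so the matrix comes out symmetric):

```python
def adjacency_matrix(n, edges):
    """Build the n x n matrix with a[i][j] = 1 iff edge (i, j) exists.
    Vertices are numbered 0..n-1; edges are undirected pairs."""
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = 1
        a[j][i] = 1  # mirror entry: the graph is undirected
    return a
```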
A Hamiltonian Cycle is a cycle in a graph that visits each vertex exactly once and returns to
the starting vertex.
Algorithm: Extend a path one vertex at a time, visiting only unvisited vertices; once all
vertices are on the path, check for an edge back to the start. If a partial path cannot be
extended, backtrack and try a different vertex.
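A backtracking sketch in Python (the adjacency-list encoding and function name are assumptions for illustration):

```python
def hamiltonian_cycle(adj):
    """Backtracking search for a cycle that visits every vertex exactly once
    and returns to the start. adj: {vertex: iterable of neighbours}.
    Returns the cycle as a list of vertices (start repeated at the end),
    or None if no Hamiltonian cycle exists."""
    vertices = list(adj)
    n = len(vertices)
    start = vertices[0]
    path = [start]
    used = {start}

    def solve():
        if len(path) == n:
            # All vertices visited: is there an edge back to the start?
            return start in adj[path[-1]]
        for v in adj[path[-1]]:
            if v not in used:
                path.append(v)
                used.add(v)
                if solve():
                    return True
                path.pop()       # backtrack
                used.remove(v)
        return False

    return path + [start] if solve() else None
```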