While analyzing an algorithm
Let's start with a simple example. Suppose you are given an array A and an
integer x, and you have to find whether x exists in array A.
A simple solution to this problem is to traverse the whole array A and check
whether any element is equal to x.
for i : 1 to length of A
    if A[i] is equal to x
        return TRUE
return FALSE
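Here is a minimal C-style sketch of the same scan (the function name, 0-based indexing, and parameter names are my own choices, not part of the original pseudocode):

#include <stdbool.h>

/* Returns true if x occurs anywhere in A[0..n-1], false otherwise. */
bool contains(const int A[], int n, int x) {
    for (int i = 0; i < n; i++)   /* examine each element once */
        if (A[i] == x)
            return true;          /* found a match */
    return false;                 /* scanned the whole array, no match */
}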
As we can see, the total time depends on the length of the array A: if the
length of the array increases, the execution time increases as well.
Order of growth describes how the execution time depends on the length of the
input. In the above example, we can clearly see that the execution time
depends linearly on the length of the array. Thinking in terms of order of
growth lets us estimate the running time with ease. We ignore the lower-order
terms, since they are relatively insignificant for large inputs. We use
different notations to describe the limiting behavior of a function.
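For instance (hypothetical counts of my own): if one implementation of the scan above performs 2∗N + 3 operations and another performs 5∗N + 1, both still grow linearly with N, which is why the constants and lower-order terms are dropped and both are described by the same notation below.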
O-notation:
To denote an asymptotic upper bound, we use O-notation. For a given
function g(n), we denote by O(g(n)) (pronounced “big-oh of g of n”) the
set of functions:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c∗g(n) for all n ≥ n0 }
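For example (constants of my own choosing): f(n) = 2∗n + 10 is in O(n), because with c = 3 and n0 = 10 we have 0 ≤ 2∗n + 10 ≤ 3∗n for all n ≥ 10.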
The general step-wise procedure for Big-O runtime analysis is as follows:
1. Figure out what the input is and what n represents.
2. Express the maximum number of operations the algorithm performs in
terms of n.
3. Eliminate all but the highest-order terms.
4. Remove all the constant factors.
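Applying these steps to the linear search above (a walkthrough of my own): the input is the array, so n is its length; the algorithm performs at most about 2∗n + 1 operations (one comparison and one index update per element, plus the final return); dropping the lower-order term and the constant factor leaves O(n).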
Polynomial Function:
If f(n) = a0 + a1∗n + a2∗n^2 + … + am∗n^m, then O(f(n)) = O(n^m).
Summation Function:
If f(n) = f1(n) + f2(n) + … + fm(n) and fi(n) ≤ fi+1(n) for all i = 1, 2, …, m−1,
then O(f(n)) = O(max(f1(n), f2(n), …, fm(n))) = O(fm(n)).
Logarithmic Function:
If f(n) = loga(n) and g(n) = logb(n), then O(f(n)) = O(g(n)); all logarithmic
functions grow in the same manner in terms of Big-O.
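For example (my own numbers): f(n) = 4∗n^3 + 2∗n^2 + 7 is O(n^3), and log2(n) and log10(n) differ only by the constant factor log2(10), so both are O(log n).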
Ω-notation:
To denote an asymptotic lower bound, we use Ω-notation. For a given
function g(n), we denote by Ω(g(n)) (pronounced “big-omega of g of n”) the
set of functions:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c∗g(n) ≤ f(n) for all n ≥ n0 }
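For example (constants of my own choosing): f(n) = n^2 + 3∗n is in Ω(n^2), since with c = 1 and n0 = 1 we have 0 ≤ 1∗n^2 ≤ n^2 + 3∗n for all n ≥ 1.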
Θ-notation:
To denote an asymptotically tight bound, we use Θ-notation. For a given
function g(n), we denote by Θ(g(n)) (pronounced “big-theta of g of n”) the
set of functions:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that
0 ≤ c1∗g(n) ≤ f(n) ≤ c2∗g(n) for all n ≥ n0 }
To compute the O-notation we ignore the lower-order terms, since they are
relatively insignificant for large input.
Let f(N) = 2∗N^2 + 3∗N + 5
O(f(N)) = O(2∗N^2 + 3∗N + 5) = O(N^2)
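The same f(N) is in fact Θ(N^2) as well (constants below are my own choice): taking c1 = 2, c2 = 10 and n0 = 1 gives 0 ≤ 2∗N^2 ≤ 2∗N^2 + 3∗N + 5 ≤ 10∗N^2 for all N ≥ 1.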
1. int count = 0;
   for (int i = 0; i < N; i++)
       for (int j = 0; j < i; j++)
           count++;
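For this first snippet (a count worked out here, not stated in the original): the inner loop runs i times for each value of i, so count++ executes 0 + 1 + 2 + … + (N−1) = N∗(N−1)/2 times, which is O(N^2).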
2. int count = 0;
   for (int i = N; i > 0; i /= 2)
       for (int j = 0; j < i; j++)
           count++;
This is a tricky case. At first glance, the complexity seems to be O(N∗logN):
N for the j loop and logN for the i loop. But that is wrong. Let's see why:
the inner loop runs i times for each value of i, and i takes the values
N, N/2, N/4, …, 1, so count++ executes about N + N/2 + N/4 + … + 1 ≤ 2∗N
times. The complexity is therefore O(N).
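As a quick empirical check (a standalone sketch of my own; the loop structure mirrors example 2 above), the ratio count/N stays close to 2:

#include <stdio.h>

int main(void) {
    for (int N = 1000; N <= 1000000; N *= 10) {
        long long count = 0;
        for (int i = N; i > 0; i /= 2)      /* outer loop halves i: about logN iterations */
            for (int j = 0; j < i; j++)     /* inner loop runs i times */
                count++;
        /* count = N + N/2 + N/4 + ..., so count/N approaches 2 */
        printf("N = %7d  count = %7lld  count/N = %.3f\n",
               N, count, (double)count / N);
    }
    return 0;
}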
The table below is to help you understand the growth of several common
time complexities, and thus help you judge if your algorithm is fast enough
(assuming the algorithm is correct).
Length of Input (N)    Worst Accepted Algorithm
≤ [10..11]             O(N!), O(N^6)
≤ [15..18]             O(2^N ∗ N^2)
≤ [18..22]             O(2^N ∗ N)
≤ 100                  O(N^4)
≤ 400                  O(N^3)
≤ 2K                   O(N^2 ∗ logN)
≤ 10K                  O(N^2)
≤ 1M                   O(N ∗ logN)
≤ 100M                 O(N), O(logN), O(1)
The fastest possible running time for any algorithm is O(1), commonly
referred to as Constant Running Time. In this case, the algorithm always
takes the same amount of time to execute, regardless of the input size. This is
the ideal runtime for an algorithm, but it’s rarely achievable.
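A trivial illustration (an example of my own, not from the text): looking up a single array element is O(1), since the work does not depend on the array's length.

/* O(1): one index calculation and one memory read, regardless of how long A is */
int firstElement(const int A[]) {
    return A[0];
}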
In general, space efficiency and time efficiency sit at two opposite ends of a
trade-off, and each point between them offers a particular balance of the two.
Often, the more time efficiency you gain, the more space you have to give up,
and vice versa.
For example, Merge Sort is fast, running in O(N∗logN) time, but requires O(N)
extra space for its operations. On the other side, Bubble Sort is slow, running
in O(N^2) time, but requires only constant extra space.
To conclude this topic: finding an algorithm that runs in less time while also
requiring less memory can make a huge difference in how well it performs.
Amortized Analysis