Algorithms and Complexity Lect 2
Algorithms
Time Complexity of Algorithms
For any defined problem, there can be any number of solutions. This is true in
general: if you have a problem and you discuss it with all of your friends, they will
each suggest a different solution, and you are the one who decides which solution is
best under the circumstances.
Similarly, for any problem that must be solved with a program, there can be an
infinite number of solutions. Let's take a simple example to understand this. Below
we have two different algorithms to find the square of a number (for the moment,
forget that the square of any number n is simply n*n):
One solution to this problem can be to run a loop n times, starting from 0 and
adding n to the running total on every iteration, so that after n additions the total is n*n.
Example
/*
 * Compute the square of n by repeated addition:
 * start from 0 and add n to the running total, n times.
 */
square = 0
for i = 1 to n
    do square = square + n
return square
Or, we can simply use the multiplication operator * to find the square.
/*
 * Compute the square of n directly, with a single multiplication.
 */
return n*n
In the above two simple algorithms, you saw how a single problem can
have many solutions. While the first solution required a loop that executes
n times, the second solution used the multiplication operator * to return
the result in a single line.
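To make the comparison concrete, here is a minimal C sketch of both approaches (the function names square_by_addition and square_by_multiplication are illustrative choices, and the sketch assumes n is non-negative):
/* Approach 1: add n to a running total, n times -> n additions, i.e. linear work. */
long square_by_addition(long n)
{
    long square = 0;
    for (long i = 1; i <= n; i++)
        square = square + n;
    return square;
}
/* Approach 2: a single multiplication -> constant work. */
long square_by_multiplication(long n)
{
    return n * n;
}
Calling either function with n = 5 returns 25; the difference is only in how much work each one does to get there.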
statement;
Above we have a single statement. Its Time Complexity will be Constant. The
running time of the statement will not change in relation to N.
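For example, a single assignment like the one below (a and b are just assumed variables) takes the same amount of time whether N is 10 or 10 million:
int sum = a + b;   /* one addition, one assignment: constant time, independent of N */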
for(i=0; i < N; i++)
statement;
The time complexity for the above algorithm will be Linear. The running time of
the loop is directly proportional to N. When N doubles, so does the running time.
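For instance, summing the elements of an array of size N (a hypothetical sum function, sketched in C) does one addition per element, so its running time grows linearly with N:
/* One pass over the array: N additions -> linear time. */
long sum(int a[], int N)
{
    long total = 0;
    for (int i = 0; i < N; i++)
        total = total + a[i];
    return total;
}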
for(i=0; i < N; i++)
    for(j=0; j < N; j++)
        statement;
This time, the time complexity for the above code will be Quadratic. The running time
of the two nested loops is proportional to the square of N. When N doubles, the running
time increases fourfold, because it grows in proportion to N * N.
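As an illustration (the function count_equal_pairs is a made-up example, not from the notes), comparing every element of an array against every other element needs N * N comparisons:
/* Every element is compared with every other element: N * N comparisons -> quadratic time. */
int count_equal_pairs(int a[], int N)
{
    int count = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            if (i != j && a[i] == a[j])
                count++;
    return count;
}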
while(low <= high) {
    mid = (low + high) / 2;
    if (key < a[mid]) high = mid - 1;
    else if (key > a[mid]) low = mid + 1;
    else break;   /* key found at index mid */
}
This is the core of binary search: an algorithm that repeatedly breaks a sorted set of numbers
into halves to search for a particular value. This algorithm has Logarithmic Time Complexity.
The running time is proportional to the number of times N can be divided by 2 (N is high-low
here). This is because the algorithm cuts the working area in half with each iteration.
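Here is one way the complete loop could look as a C function (the name binary_search and the convention of returning -1 when the key is absent are choices made for this sketch; the array must already be sorted):
/* Search a sorted array of N elements; the search range is halved on every step -> logarithmic time. */
int binary_search(int a[], int N, int key)
{
    int low = 0, high = N - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   /* midpoint, written to avoid overflow of low + high */
        if (key < a[mid])
            high = mid - 1;                 /* key, if present, lies in the lower half */
        else if (key > a[mid])
            low = mid + 1;                  /* key, if present, lies in the upper half */
        else
            return mid;                     /* key found at index mid */
    }
    return -1;                              /* key is not in the array */
}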
Types of Notations for Time Complexity
1. O(expression) is the set of functions that grow slower than or at the same rate as expression. It
indicates the maximum time required by an algorithm for all input values. It represents the worst
case of an algorithm's time complexity.
2. Omega(expression) is the set of functions that grow faster than or at the same rate as
expression. It indicates the minimum time required by an algorithm for all input values. It
represents the best case of an algorithm's time complexity.
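As a small worked example (the running time T(N) = 3N + 2 is just an assumed cost function used for illustration):
T(N) = 3N + 2 is O(N), because 3N + 2 <= 4N for every N >= 2, so it grows no faster than N up to a constant factor.
T(N) = 3N + 2 is also Omega(N), because 3N + 2 >= 3N for every N >= 1, so it grows at least as fast as N.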