Algorithms and Complexity Lect 2


Time Complexity of Algorithms
For any given problem, there can be many possible solutions; this is true in general. If you discuss a problem with all of your friends, each of them will suggest a different solution, and you are the one who must decide which solution is best under the circumstances.

Similarly, for any problem that must be solved with a program, there can be an infinite number of solutions. Let's take a simple example to understand this. Below we have two different algorithms to find the square of a number (for a moment, forget that the square of any number n is simply n*n):
One solution is to run a loop n times, starting from zero and adding n to the result on every iteration.
Example

/*
we have to calculate the square of n
*/

result = 0
for i = 1 to n
    do result = result + n    // add n to the accumulator, n times

// when the loop ends, result will hold the square of n
return result
Or, we can simply use a mathematical operator * to find the square.

/*
we have to calculate the square of n
*/

return n*n
In the above two simple algorithms, you saw how a single problem can have many solutions. While the first solution requires a loop that executes n times, the second solution uses the mathematical operator * to return the result in a single line.
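For concreteness, here is a minimal C sketch of the two approaches (the function names squareByAddition and squareByMultiplication are illustrative, not part of the original slides; n is assumed to be non-negative):

#include <stdio.h>

/* Square n by repeated addition: the loop body runs n times,
   so the number of steps grows linearly with n. Assumes n >= 0. */
long squareByAddition(long n)
{
    long result = 0;
    for (long i = 0; i < n; i++)
        result += n;        /* add n to the accumulator, n times */
    return result;
}

/* Square n with a single multiplication: one step, regardless of n. */
long squareByMultiplication(long n)
{
    return n * n;
}

int main(void)
{
    printf("%ld %ld\n", squareByAddition(12), squareByMultiplication(12));  /* 144 144 */
    return 0;
}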

So which one is the better approach?


What is Time Complexity?

The time complexity of an algorithm signifies the total time required by the program to run to completion.

The time complexity of algorithms is most commonly expressed using the big O notation. It is an asymptotic notation used to represent the time complexity.
Time complexity is most commonly estimated by counting the number of elementary steps an algorithm performs to finish execution. In the example above, the loop in the first code runs n times, so the time complexity is at least proportional to n, and the time taken grows as n grows. The second code, on the other hand, has constant time complexity: it never depends on the value of n and always produces the result in a single step.
Calculating Time Complexity
Now, the most common metric for calculating time complexity is Big O notation. It removes all constant factors so that the running time can be estimated in relation to N as N approaches infinity. In general, you can think of it like this:

statement;

Above we have a single statement. Its time complexity is constant: the running time of the statement will not change in relation to N.
for(i=0; i < N; i++)
    statement;

The time complexity for the above algorithm will be Linear. The running time of
the loop is directly proportional to N. When N doubles, so does the running time.
for(i=0; i < N; i++)
    for(j=0; j < N; j++)
        statement;

This time, the time complexity for the above code will be Quadratic. The running time of the two nested loops is proportional to the square of N. When N doubles, the running time increases by a factor of four.
while(low <= high)
{
    mid = (low + high) / 2;

    if (target < list[mid])
        high = mid - 1;
    else if (target > list[mid])
        low = mid + 1;
    else
        break;
}

This is the core of binary search: an algorithm that repeatedly splits a sorted set of numbers in half to locate a particular value. It has Logarithmic time complexity: the running time is proportional to the number of times N can be divided by 2 (N is high - low here), because the algorithm cuts the working area in half with each iteration.
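As a complete, runnable illustration (a sketch only; the function name binarySearch and the driver code below are not from the original slides), the loop above can be wrapped into a C function:

#include <stdio.h>

/* Returns the index of target in the sorted array list[0..n-1],
   or -1 if it is not present. Runs in O(log n) time because the
   search range is halved on every iteration. */
int binarySearch(const int list[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;   /* same as (low + high) / 2, but avoids overflow */
        if (target < list[mid])
            high = mid - 1;
        else if (target > list[mid])
            low = mid + 1;
        else
            return mid;                     /* found */
    }
    return -1;                              /* not found */
}

int main(void)
{
    int data[] = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
    printf("%d\n", binarySearch(data, 10, 23));   /* prints 5 */
    return 0;
}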
Types of Notations for Time Complexity

1. Big Oh denotes "fewer than or the same as" <expression> iterations.

2. Big Omega denotes "more than or the same as" <expression> iterations.

3. Big Theta denotes "the same as" <expression> iterations.

4. Little Oh denotes "fewer than" <expression> iterations.

5. Little Omega denotes "more than" <expression> iterations.
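
As a quick worked example of how these notations relate: an algorithm that performs 3n^2 + 5n elementary steps is O(n^2), Omega(n^2), and therefore Theta(n^2). It is also Little Oh of n^3 (it takes strictly fewer than n^3 steps for large n) and Little Omega of n (strictly more than n steps for large n).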


Understanding Notations of Time Complexity

1. O(expression) is the set of functions that grow slower than or at the same rate as expression. It indicates the maximum time required by an algorithm over all input values, which is why it is used to describe the worst case of an algorithm's time complexity.

2. Omega(expression) is the set of functions that grow faster than or at the same rate as expression. It indicates the minimum time required by an algorithm over all input values, which is why it is used to describe the best case of an algorithm's time complexity.

3. Theta(expression) consists of all the functions that lie in both O(expression) and Omega(expression). It indicates a tight bound: the running time grows at exactly the rate of expression, up to constant factors. It is often loosely associated with the average case of an algorithm's time complexity, although strictly it only means that the upper and lower bounds coincide.
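
To see how these bounds apply to a single algorithm, consider linear search. The sketch below is a minimal illustration (the function name linearSearch and the sample data are assumptions, not from the original slides): in the best case the first element matches and the loop runs once (Omega(1)); in the worst case the target is absent and the loop runs n times (O(n)).

#include <stdio.h>

/* Returns the index of target in list[0..n-1], or -1 if it is absent.
   Best case:  the first element matches, so the loop body runs once   -> Omega(1).
   Worst case: the target is not present, so the loop body runs n times -> O(n). */
int linearSearch(const int list[], int n, int target)
{
    for (int i = 0; i < n; i++)
        if (list[i] == target)
            return i;
    return -1;
}

int main(void)
{
    int data[] = {7, 3, 9, 1, 4};
    printf("%d %d\n", linearSearch(data, 5, 7), linearSearch(data, 5, 8));  /* prints 0 -1 */
    return 0;
}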
Thank you
Prepared by:
Miss Marie Celia R. Aglibot
Instructor
