
ALGORITHMS

ASSIGNMENT
(a) Full Name : Divya Arya
(b) Class Roll Number : 18HCS4510
(c) Name of Course : B.Sc.(H) Computer Science
(d) Semester : 4th Semester (2nd Year)
(e) Name of Subject : Design & Analysis of Algorithms
(f) Submitted to : Dr. Rajan Gupta
(g) College/University : Deen Dayal Upadhyaya College, University of Delhi
Correctness and Complexity
Analysis of Quick Sort
QUICK SORT
Quicksort is a popular sorting algorithm
that is often faster in practice than other
sorting algorithms.
It uses a divide-and-conquer strategy:
a large array is partitioned into two
smaller subarrays, which are then sorted
recursively. It was developed by Charles
Antony Richard Hoare (commonly known
as C.A.R. Hoare or Tony Hoare) in 1960
for a project on machine translation for
the National Physical Laboratory.
QUICK SORT ALGORITHM
Quick_Sort( A , p , r )
1. if p < r
2.     q ← Partition( A , p , r )
3.     Quick_Sort( A , p , q − 1 )
4.     Quick_Sort( A , q + 1 , r )

Partition( A , p , r )
5. x ← A[ r ]
6. i ← p − 1
7. for j ← p to r − 1
8.     if A[ j ] ≤ x
9.         then i ← i + 1
10.             swap( A[ i ] , A[ j ] )
11. swap( A[ i + 1 ] , A[ r ] )
12. return i + 1
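The pseudocode above translates directly into Python. A minimal sketch (Lomuto partition with the last element as pivot; the function names are illustrative):

```python
def partition(A, p, r):
    """Lomuto partition: pivot x = A[r]; returns the pivot's final index."""
    x = A[r]
    i = p - 1
    for j in range(p, r):            # lines 7-10 of the pseudocode
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]  # place the pivot between the windows
    return i + 1

def quick_sort(A, p, r):
    """Sort A[p..r] in place."""
    if p < r:
        q = partition(A, p, r)
        quick_sort(A, p, q - 1)
        quick_sort(A, q + 1, r)
```

For example, `data = [5, 2, 9, 1]; quick_sort(data, 0, len(data) - 1)` leaves `data` sorted in place.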
LOOP INVARIANT

At the start of each iteration of the partition loop (for index j), the
following invariant holds:
the elements A[p..i] are less than or equal to the pivot x
(window 1), the elements A[i+1..j−1] are greater
than x (window 2), and nothing is yet known (whether
a value is smaller or larger than the pivot) about the elements at
indices j, . . . , r−1. During iteration j, the value A[j] is compared
with x. If A[j] > x, then window 2 grows by one. Otherwise,
i is incremented and A[j] is swapped with A[i] (the first element
of window 2), so window 1 grows by one.
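The invariant can be checked mechanically. A sketch (the function name `partition_checked` is my own) that asserts both windows at the top of every iteration of the Lomuto partition:

```python
def partition_checked(A, p, r):
    """Lomuto partition that asserts the loop invariant each iteration."""
    x = A[r]
    i = p - 1
    for j in range(p, r):
        # Invariant: window 1 = A[p..i] <= x, window 2 = A[i+1..j-1] > x
        assert all(A[k] <= x for k in range(p, i + 1))
        assert all(A[k] > x for k in range(i + 1, j))
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1
```

After the call, everything left of the returned index is at most the pivot, and everything right of it is greater.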
CORRECTNESS :

INITIALIZATION : Initially i = p − 1, j = p, and x = A[ r ], the last
element of the subarray, chosen as the pivot. Both windows A[p..i]
and A[i+1..j−1] are empty, so the invariant holds trivially.

MAINTENANCE : In each iteration two cases are possible. If A[ j ] > x,
then i is unchanged and A[ j ] simply joins window 2; incrementing j
preserves the invariant. If A[ j ] ≤ x, then i is incremented and
A[ i ] is swapped with A[ j ]: the value A[ j ] joins window 1, the
element that was at the front of window 2 moves to its end, and the
loop invariant holds after j is incremented.

TERMINATION : When the loop ends (j = r), every element of A[p..r−1]
lies in one of the two windows. The pivot A[ r ] is swapped with
A[ i + 1 ] and i + 1 is returned, which is the correct final position
of the pivot.
RUNTIME
ANALYSIS
The recurrence for quicksort
depends on the sizes of the
recursive subproblems generated
at each stage of the recursion.
Since the pivot can land
anywhere, the size of a
subproblem can take any value
in the range [0..n − 1], i.e.,
T(n) = T(k) + T(n − k − 1) + O(n),
T(1) = O(1),
where k is the size of
window 1 when the partition
loop ends. The size of the other
recursive subproblem is n − k − 1
(total elements minus the pivot
and window 1).
BEST CASE :
In the best case, the two subproblems are of (nearly) equal
size at every level:
T(n) = T(n/2) + T(n/2 − 1) + O(n)
or
T(n) = T(n/2 − 2) + T(n/2 + 1) + O(n).

Even for an unbalanced split the recurrence looks like:

T(n) = T(n/3 − 1) + T(2n/3) + O(n)
or
T(n) = T(n/6) + T(5n/6 − 1) + O(n), or
in general T(n) = T(α · n) + T((1 − α) · n) + O(n) for a fixed 0 < α < 1.
For all of the above recurrences, the recursion-tree method
shows that T(n) = O(n log n). Therefore, on a best-case
input quicksort takes O(n log n) time to sort an array of n
elements.
WORST CASE :

In the worst case, one recursive subproblem has
size zero (or a very small non-zero constant) and
the other has size nearly n, i.e.,
T(n) = T(n − 1) + O(n) or T(n) = T(n − 4) +
T(3) + O(n),
or
in general T(n) = T(n − l) + T(l − 1) + O(n),
where l is a fixed integer.
Using the substitution or recursion-tree
method, we get T(n) = Θ(n²).
An example of a worst-case input is an array
that is already in ascending or descending order
(with the last element chosen as pivot).
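The quadratic blow-up is easy to observe by counting key comparisons. A self-contained sketch (restating the Lomuto scheme, with an added counter; the function name is illustrative):

```python
def quick_sort_count(A, p, r):
    """Lomuto quicksort on A[p..r], returning the number of key comparisons."""
    if p >= r:
        return 0
    x, i, comps = A[r], p - 1, 0
    for j in range(p, r):
        comps += 1                   # one comparison of A[j] with the pivot
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    q = i + 1
    return (comps
            + quick_sort_count(A, p, q - 1)
            + quick_sort_count(A, q + 1, r))

# An already-sorted array forces the worst case: every pivot is the maximum,
# so the total is (n-1) + (n-2) + ... + 1 = n(n-1)/2.
n = 100
print(quick_sort_count(list(range(n)), 0, n - 1))  # 4950
```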
Quick sort - the
best sorting
algorithm ??
Even though quicksort has a worst-case
run time of Θ(n²), it is considered one of
the best sorting algorithms because it is
very efficient on average: its expected
running time is Θ(n log n), and the
constant factors hidden in this bound are
very small compared to those of other
sorting algorithms.

Because it has the best performance in
the average case for most inputs,
quicksort is generally considered the
"fastest" sorting algorithm.
Although the run time of quicksort is Θ(n²) in the
worst case, the expected run time on a random
input is O(n log n). For this reason, quicksort is
the candidate sorting algorithm in practice.

Quick sort can be made to run in O(n log n) for all


inputs (i.e. worst case time:
O(n log n)) if median element is chosen as a pivot
at each iteration and median can be found in O(n).
i.e., T(n) = 2T(n/2) + O(n) + O(n). The first O(n) is
for finding median in linear time and the second
O(n) is for the pivot subroutine. Further, since
median is the pivot, each iteration of quick sort
yields two sub problems of equal size.
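A sketch of this idea. For brevity the median is located here by sorting, which is O(n log n) per call, not O(n); a real guaranteed-O(n log n) quicksort would substitute a linear-time selection algorithm (median of medians). The function name is illustrative:

```python
def quick_sort_median(A):
    """Quicksort that always pivots on the median (illustrative sketch).

    sorted() is used only to locate the median; replacing it with
    median-of-medians selection gives the O(n) bound assumed in the
    recurrence T(n) = 2T(n/2) + O(n) + O(n).
    """
    if len(A) <= 1:
        return A
    median = sorted(A)[len(A) // 2]
    left  = [v for v in A if v < median]
    mid   = [v for v in A if v == median]
    right = [v for v in A if v > median]
    # Pivoting on the median guarantees two subproblems of (nearly) equal size.
    return quick_sort_median(left) + mid + quick_sort_median(right)
```

This version partitions three ways (less / equal / greater), so duplicate keys also cannot degrade the split.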
STABILITY :
A sorting algorithm that always preserves the relative order
of elements with equal keys is called a "stable sort".

The standard implementation of quicksort is not stable:
elements sharing the same key can have their order reversed
by the long-distance swaps performed during partitioning
(lines 10-11 of the algorithm). In other words, efficient
implementations of quicksort are not stable, meaning that
the relative order of equal sort items is not preserved.
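Instability is easy to demonstrate on records with duplicate keys. A self-contained sketch (Lomuto quicksort comparing records on the key field only; names are illustrative):

```python
def quick_sort_records(A, p, r, key):
    """In-place Lomuto quicksort comparing records by key(record)."""
    if p >= r:
        return
    x, i = key(A[r]), p - 1
    for j in range(p, r):
        if key(A[j]) <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    quick_sort_records(A, p, i, key)
    quick_sort_records(A, i + 2, r, key)

records = [(1, 'a'), (1, 'b'), (0, 'c')]     # two records share key 1
quick_sort_records(records, 0, 2, key=lambda rec: rec[0])
print(records)  # [(0, 'c'), (1, 'b'), (1, 'a')] -- 'a'/'b' order reversed
```

The first partition swaps `(1, 'a')` past `(1, 'b')`, so the two equal-key records end up reversed; a stable sort would have produced `[(0, 'c'), (1, 'a'), (1, 'b')]`.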
SPACE COMPLEXITY :

Quicksort partitions in place, so it uses only a constant
amount of additional space before making any recursive
call.

It must, however, store a constant amount of information
(a stack frame) for each nested recursive call. A naive
implementation can therefore use O(n) stack space in the
worst case; if the implementation always recurses on the
smaller subproblem first and handles the larger one
iteratively, the recursion depth is at most O(log n),
giving O(log n) space even in the worst case.
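The O(log n) worst-case stack bound requires recursing only on the smaller partition and looping on the larger one. A sketch (the function name is illustrative):

```python
def quick_sort_small_stack(A, p, r):
    """Recurse on the smaller partition and iterate on the larger one,
    so the recursion depth is O(log n) even on worst-case input."""
    while p < r:
        # Lomuto partition (pivot = A[r])
        x, i = A[r], p - 1
        for j in range(p, r):
            if A[j] <= x:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[r] = A[r], A[i + 1]
        q = i + 1
        if q - p < r - q:                   # left side is smaller
            quick_sort_small_stack(A, p, q - 1)
            p = q + 1                       # loop on the right side
        else:                               # right side is smaller (or equal)
            quick_sort_small_stack(A, q + 1, r)
            r = q - 1                       # loop on the left side
```

On an already-sorted or reverse-sorted array the recursive call always gets the empty side, so the stack depth stays constant instead of growing to n.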
IN-PLACE :
In-place means that the algorithm does
not use extra space proportional to the
input for manipulating it, though it may
require a small, possibly non-constant,
amount of working space. Usually this
space is O(log n), though sometimes
anything in o(n) (smaller than linear)
is allowed.
Quicksort is an in-place sorting
algorithm: it uses extra space only for
the recursion stack, not for
manipulating the input.
Graph of the number of comparisons against n log n, n, and n²,
obtained by running the algorithm on 30 different inputs of sizes
varying from 100 to 3000 in steps of 100.
Why is quicksort better than
mergesort ?
There are several reasons why quicksort is
preferable, especially for arrays:

Auxiliary space : Quicksort sorts in place, requiring
no additional storage to perform the sort, whereas
mergesort needs a temporary array to merge the
sorted halves and hence is not in-place. This gives
quicksort the advantage in space.

Locality of reference : Quicksort in particular exhibits
good cache locality, which makes it faster than mergesort
in many settings, such as virtual-memory environments.
Worst cases : The O(n²) worst case of quicksort can be
avoided with high probability by using randomized quicksort,
i.e. by choosing the pivot at random. Choosing a good pivot
recovers the average-case behaviour, making quicksort as
efficient as mergesort.

Mergesort is better for large data
structures : Mergesort is a stable sort, unlike quicksort and
heapsort, and can be easily adapted to operate on linked
lists and on very large lists stored on slow-to-access media
such as disk storage or network-attached storage.
ADVANTAGES OF
USING QUICK SORT
(a) It is in-place, since it uses only a small auxiliary stack.

(b) It requires only O(n log n) time on average to sort n items.

(c) It has an extremely short inner loop.

(d) The algorithm has been subjected to thorough
mathematical analysis, so very precise statements can be made
about its performance.
DISADVANTAGES
OF QUICK SORT
(a) It is recursive. If recursion is not
available, the implementation becomes
considerably more complicated.

(b) It requires quadratic (i.e. Θ(n²)) time in
the worst case.

(c) It is fragile, i.e. a small mistake in
the implementation can go unnoticed and
cause it to perform badly.
REFERENCES :
Tutorialspoint.com
Geeksforgeeks.org
Medium.com
khanAcademy.org
stackoverflow.com
Computer Algorithms: Introduction to Design and Analysis – Sara Baase & Allen Van Gelder
