DSAL-211-Lecture 6 - Sorting Algorithms

The document summarizes sorting algorithms including bubble sort, selection sort, insertion sort, quick sort, and merge sort. It defines sorting terminology, gives an overview of the sorting problem, and for each algorithm describes the approach, provides pseudocode, and analyzes time complexity: O(n²) for bubble, selection, and insertion sort, and O(n log n) on average for quick and merge sort. Partitioning and merging are the key steps in quick sort and merge sort respectively.

Uploaded by

Kutemwa Mithi

DSAL-211 LECT 5: Sorting Algorithms
By
W.G. Gondwe
Ag. ICT Manager/Lecturer, Computer Networks
CS&IT Department
Malawi University of Science & Technology
Overview
• Understanding the Sorting Problem
• Sorting Terminology
• Sorting Algorithms
• Bubble Sort
• Selection Sort
• Insertion Sort
• Quick Sort
• Merge Sort (recursive)
The Sorting Problem
• Sorting: putting the elements of a list in a certain desired order
• The sorting order can be numeric or lexicographical
• One of the most important and ubiquitous computational problems
• Widely used in applications and operating systems
• Used to pre-process input for other algorithms (e.g. binary search)
• Example sorting algorithms: bubble, selection, insertion, quick, heap, comb, merge, bucket, shell
Sorting Terminology/Definitions
• Swap – switching the positions of two elements
• In-place sorting – elements of a list are sorted in place; only minimal extra space is required as the algorithm executes
• Internal/External sorting – internal sorting operates entirely in available main memory; external sorting requires items to be stored on secondary storage
• Stability – a sorting algorithm is stable if the relative order of elements with the same value (sorting key) is maintained, i.e. two equal items are never swapped past each other
• Comparison-based sorting – examines elements only by using a comparison operator, e.g. <, >, ≥, ≤ (contrast with, say, a search engine ranking results by relevance)
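Not from the slides: a small Python illustration of the stability definition above, using Python's built-in sorted (which is guaranteed stable) on hypothetical (name, score) records.

```python
# Records with duplicate sort keys: "ann"/"cat" share score 2,
# "bob"/"dan" share score 1.
records = [("ann", 2), ("bob", 1), ("cat", 2), ("dan", 1)]

# Sort by score only. A stable sort must keep equal-key records
# in their original relative order.
by_score = sorted(records, key=lambda r: r[1])

print(by_score)
# -> [('bob', 1), ('dan', 1), ('ann', 2), ('cat', 2)]
# "bob" still precedes "dan", and "ann" still precedes "cat".
```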
Bubble Sort
• A simple, in-place, comparison-based algorithm
• It is an iterative sorting algorithm that compares a pair of elements at
a time
• A pair out of order is swapped, otherwise the algorithm proceeds to
compare the next pair
• The algorithm continues to pass through the list until no swaps are
required
• Effectively, on each pass the largest remaining element “bubbles” to its final position
Bubble Sort cont..
• Simulation: sort the list (1,5,8,3,4,7) in ascending order
First pass, comparison by comparison:
(1,5,8,3,4,7) : 1 < 5 (no swap)
(1,5,8,3,4,7) : 5 < 8 (no swap)
(1,5,8,3,4,7) -> (1,5,3,8,4,7) : 8 > 3 (swap(8,3))
(1,5,3,8,4,7) -> (1,5,3,4,8,7) : 8 > 4 (swap(8,4))
(1,5,3,4,8,7) -> (1,5,3,4,7,8) : 8 > 7 (swap(8,7))
etc…

i.e. on the first pass, the largest item in the list bubbles to the end of the list (ascending order)
Bubble Sort cont…
Algorithm:
sub swap(A[i], A[j])
  temp := A[i]
  A[i] := A[j]
  A[j] := temp
end sub

procedure bubbleSort(list: A)
  n := size(A)
  for i := 0 to n-1
    for j := 1 to n-1
      if A[j-1] > A[j] then
        swap(A[j-1], A[j])
      end if
    end for
  end for
end procedure
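The pseudocode above can be sketched in Python. This version adds the early-exit check implied by the description ("the algorithm continues to pass through the list until no swaps are required"), which the pseudocode omits:

```python
def bubble_sort(a):
    """Sort list a in place; stop early once a full pass makes no swaps."""
    n = len(a)
    for _ in range(n - 1):
        swapped = False
        for j in range(1, n):
            if a[j - 1] > a[j]:
                # Out-of-order pair: swap it.
                a[j - 1], a[j] = a[j], a[j - 1]
                swapped = True
        if not swapped:
            # A pass with no swaps means the list is sorted.
            break
    return a

print(bubble_sort([1, 5, 8, 3, 4, 7]))  # the slides' example list
# -> [1, 3, 4, 5, 7, 8]
```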
Bubble Sort cont…
• Performance/Complexity
• Worst case: O(n²)
• Avg. case: O(n²)
• Best case: O(n) – for the optimized variant that stops as soon as a full pass makes no swaps

Worst-case complexity reasoning:
In the worst case (a reverse-sorted list), the passes perform (n-1) + (n-2) + (n-3) + … + 1 swaps, which sums to n(n-1)/2
Hence quadratic
Selection Sort
• In-place, comparison-based algorithm
• It divides the list into two, sorted and unsorted sub-lists (sorted list
has size 0 at the onset)
• Proceeds by finding the smallest (or largest, for descending order) item in the unsorted sub-list and swapping it into place at the end of the sorted sub-list
• Since each selection places the correct element in its final position, the list is fully sorted after the last iteration
Selection Sort cont..
• Simulation: sort the list (1,5,8,3,4,7) in ascending order
1st pass: (1,5,8,3,4,7) -> (1,5,8,3,4,7) : 1 selected, right position
2nd pass: (1,5,8,3,4,7) -> (1,3,8,5,4,7) : 3 selected, swapped with 5
3rd pass: (1,3,8,5,4,7) -> (1,3,4,5,8,7) : 4 selected, swapped with 8
4th pass: (1,3,4,5,8,7) -> (1,3,4,5,8,7) : 5 selected, right position
5th pass: (1,3,4,5,8,7) -> (1,3,4,5,7,8) : 7 selected, swapped with 8
List sorted!
Selection Sort cont…
Algorithm:
sub swap(A[i], A[j])
  temp := A[i]
  A[i] := A[j]
  A[j] := temp
end sub

procedure selectionSort(list: A)
  n := size(A)
  for j := 0 to n-1
    min_i := j
    for i := j+1 to n-1
      if A[i] < A[min_i] then
        min_i := i
      end if
    end for
    if min_i != j then
      swap(A[j], A[min_i])
    end if
  end for
end procedure
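The same algorithm as a Python sketch: the index min_i tracks the minimum of the unsorted suffix, which is then swapped into position j at the boundary of the sorted prefix.

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the minimum of the
    unsorted suffix a[j:] and swapping it into position j."""
    n = len(a)
    for j in range(n - 1):
        min_i = j
        for i in range(j + 1, n):
            if a[i] < a[min_i]:
                min_i = i  # remember position of the smallest item so far
        if min_i != j:
            a[j], a[min_i] = a[min_i], a[j]
    return a

print(selection_sort([1, 5, 8, 3, 4, 7]))  # the slides' example list
# -> [1, 3, 4, 5, 7, 8]
```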
Selection Sort cont…
• Performance/Complexity
• Worst case: O(n²)
• Avg. case: O(n²)
• Best case: O(n²)

Complexity reasoning:
Regardless of input order, the inner loop performs (n-1) + (n-2) + (n-3) + … + 1 comparisons
Hence quadratic, even on an already-sorted list
Insertion Sort
• In-place, iterative sorting algorithm
• Divides input into two sub-lists (sorted and unsorted)
• Sorted sub-list initially has the first element
• Elements are fetched from the unsorted sub-list and ‘inserted’ into
the right position in the sorted sub-list
• Similar to sorting playing cards (by inserting each card into its place)
• Final iteration produces a sorted list (all unsorted items inserted)
Insertion Sort cont..
• Algorithm:
procedure insertionSort(list: A)
  n := size(A)
  i := 1
  while i < n
    j := i
    while j > 0 && A[j-1] > A[j]
      swap(A[j], A[j-1])
      j := j-1
    end while
    i := i+1
  end while
end procedure
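A Python sketch of the pseudocode above: each new element sinks leftwards through the sorted prefix, one swap at a time, until it reaches its position.

```python
def insertion_sort(a):
    """Sort list a in place by growing a sorted prefix a[:i] and
    inserting each next element into its correct place within it."""
    for i in range(1, len(a)):
        j = i
        # Sink a[j] leftwards while it is smaller than its left neighbour.
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a

print(insertion_sort([1, 5, 8, 3, 4, 7]))  # the slides' example list
# -> [1, 3, 4, 5, 7, 8]
```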
Divide & Conquer Sorting
• Recap: Divide and conquer is an algorithm design paradigm that
solves a problem by recursively solving smaller versions of the
problem
• Smaller solutions are then combined to form a full solution
• Some sorting algorithms use the divide and conquer approach
• We take a closer look at Quick Sort and Merge Sort
Quick Sort
• A divide-and-conquer sorting algorithm
• Works by choosing a pivot element and partitioning the list around the pivot
• Elements smaller than the pivot are placed to its left and elements greater than the pivot to its right (assuming ascending order)
• The partitioning step is the most critical (conquering)
Quicksort cont..
Illustration:

Diagram courtesy of https://www.geeksforgeeks.org/quick-sort/


Quick Sort
Algorithm:
procedure quicksort(A, lo, hi)
  if lo < hi then
    p := partition(A, lo, hi)
    quicksort(A, lo, p - 1)
    quicksort(A, p + 1, hi)
  end if
end procedure
Quick Sort cont…
Algorithm (partitioning):
Hint: index i keeps track of the right position to insert the pivot element

procedure partition(A, lo, hi)
  pivot := A[hi]
  i := lo - 1
  for j := lo to hi - 1
    if A[j] < pivot then
      i := i + 1
      swap(A[i], A[j])
    end if
  end for
  if A[hi] < A[i + 1] then
    swap(A[i + 1], A[hi])
  end if
  return i + 1 // return new pivot index
end procedure
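A Python sketch of this partition-and-recurse scheme (the Lomuto partition with the last element as pivot). The only liberty taken is the final swap: it is done unconditionally, which is the standard formulation and equivalent to the guarded swap above.

```python
def partition(a, lo, hi):
    """Lomuto partition of a[lo..hi] with pivot a[hi].
    Returns the pivot's final index."""
    pivot = a[hi]
    i = lo - 1  # boundary of the "< pivot" region
    for j in range(lo, hi):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    # Place the pivot between the "< pivot" and ">= pivot" regions.
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

def quicksort(a, lo=0, hi=None):
    """Sort list a in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # sort elements left of the pivot
        quicksort(a, p + 1, hi)   # sort elements right of the pivot
    return a

print(quicksort([1, 5, 8, 3, 4, 7]))  # the slides' example list
# -> [1, 3, 4, 5, 7, 8]
```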
Quicksort cont…
Complexity:
• Best case: O(n log n)
• Average case: O(n log n)
• Worst case: O(n²)
• On average the recursion is about log n levels deep and partitioning does O(n) work per level, giving log-linear complexity
• The worst case arises when the chosen pivots repeatedly produce unbalanced partitions (e.g. a first/last-element pivot on an already-sorted list)
• With some partitioning schemes the best case improves to O(n) (e.g. three-way partitioning when all elements are equal)
• Choice of pivot: can be the first/last element, the median, or a random element
Merge Sort
• Another divide-and-conquer sorting algorithm
• Recursively splits a list into two halves (divide) and merges them back together (conquer)
• Merge: combine two lists such that the resultant list is sorted
• The merging step is the most critical (conquering)
Merge sort cont..
Illustration:

Diagram courtesy of https://en.wikipedia.org/wiki/Merge_sort


Merge sort cont..
Algorithm:
// A[] is a work array initialized as a copy of Arr[]; the roles of the
// two arrays alternate at each level of the recursion
procedure merge_sort(Arr[], iBegin, iEnd, A[])
  if iEnd - iBegin < 2 then // a run of length 1 is already sorted
    return
  end if
  iMiddle := (iBegin + iEnd) / 2
  merge_sort(A, iBegin, iMiddle, Arr) // sort left half into A[]
  merge_sort(A, iMiddle, iEnd, Arr) // sort right half into A[]
  merge(A, iBegin, iMiddle, iEnd, Arr) // merge halves from A[] into Arr[]
end procedure
Merge sort cont..
Algorithm (merge):
procedure merge(Arr[], iBegin, iMiddle, iEnd, B[])
  i := iBegin
  j := iMiddle
  for k := iBegin to iEnd - 1
    if i < iMiddle && (j >= iEnd || Arr[i] <= Arr[j]) then
      B[k] := Arr[i]
      i := i + 1
    else
      B[k] := Arr[j]
      j := j + 1
    end if
  end for
end procedure
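For clarity, here is a simpler Python variant of the same divide-and-merge idea: instead of the two-buffer scheme above, each merge builds a new list. The <= comparison when heads are equal is what keeps the sort stable.

```python
def merge_sort(a):
    """Return a new sorted list (the input is left unmodified)."""
    if len(a) < 2:
        return a  # a run of length 0 or 1 is already sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # divide and sort left half
    right = merge_sort(a[mid:])   # divide and sort right half

    # Conquer: merge the two sorted halves, always taking the
    # smaller head element (<= preserves stability on ties).
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # at most one of these
    out.extend(right[j:])  # two extends is non-empty
    return out

print(merge_sort([1, 5, 8, 3, 4, 7]))  # the slides' example list
# -> [1, 3, 4, 5, 7, 8]
```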
Merge sort cont…
Complexity:
• Best case: O(n log n)
• Average case: O(n log n)
• Worst case: O(n log n)
• The recursion is log n levels deep and the conquer step (merging) does O(n) work per level, giving log-linear complexity
• The best case can improve to O(n) for some variants (e.g. natural merge sort on an already-sorted list)
Merge sort cont
• Theoretically, comparison-based sorting cannot do better than O(n log n) in the average case
• i.e. merge sort is one of the most efficient comparison-based sorting algorithms
Questions?
