
Data Structure Operations

Data structure operations are the methods used to manipulate the data in a data structure. The most
common data structure operations are:

• Traversal
Traversal operations are used to visit each node in a data structure in a specific order. This
technique is typically employed for printing, searching, displaying, and reading the data stored in
a data structure.

• Insertion
Insertion operations add new data elements to a data structure. You can do this at the data
structure's beginning, middle, or end.

• Deletion
Deletion operations remove data elements from a data structure. These operations are typically
performed on nodes that are no longer needed.

• Search
Search operations are used to find a specific data element in a data structure. These operations
typically employ a compare function to determine if two data elements are equal.

• Sort
Sort operations are used to arrange the data elements in a data structure in a specific order. This
can be done using various sorting algorithms, such as insertion sort, bubble sort, merge sort, and
quick sort.

• Merge
Merge operations combine two data structures into one, for example, merging two sorted lists
into a single sorted list.

• Copy
Copy operations are used to create a duplicate of a data structure. This can be done by copying
each element in the original data structure to the new one.
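The operations above can be sketched with a plain Python list; the variable names (`data`, `merged`, `copy_of_data`) are illustrative, not from any particular library:

```python
# A minimal sketch of the common data structure operations on a Python list.
data = [3, 1, 4]

# Traversal: visit each element in order (here, to print it).
for x in data:
    print(x)

# Insertion: at the beginning, middle, or end.
data.insert(0, 0)   # beginning -> [0, 3, 1, 4]
data.insert(2, 2)   # middle    -> [0, 3, 2, 1, 4]
data.append(5)      # end       -> [0, 3, 2, 1, 4, 5]

# Deletion: remove an element that is no longer needed.
data.remove(3)      # -> [0, 2, 1, 4, 5]

# Search: find a specific element (uses == to compare).
print(2 in data)    # True

# Sort: arrange elements in ascending order.
data.sort()         # -> [0, 1, 2, 4, 5]

# Merge: combine two structures into one.
merged = data + [6, 7]

# Copy: duplicate the structure element by element.
copy_of_data = list(data)
print(merged, copy_of_data)
```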

Sorting Algorithms
A sorting algorithm rearranges a given array or list of elements into a specific order. Sorting is provided
in the standard libraries of most programming languages.

Comparison Based: Selection Sort, Bubble Sort, Insertion Sort, Merge Sort, Quick Sort, Heap Sort, Cycle
Sort, 3-way Merge Sort, Comb Sort
Non Comparison Based: Counting Sort, Radix Sort, Bucket Sort, Pigeonhole Sort
Hybrid Sorting Algorithms: IntroSort, TimSort

1. Insertion Sort Algorithm


Insertion sort is a simple sorting algorithm that works by iteratively inserting each element of an
unsorted list into its correct position in a sorted portion of the list. It is like sorting playing cards in your
hands. You split the cards into two groups: the sorted cards and the unsorted cards. Then, you pick a
card from the unsorted group and put it in the right place in the sorted group.

• We start with the second element of the array, as the first element is assumed to be sorted.

• Compare the second element with the first; if the second element is smaller, swap them.

• Move to the third element, compare it with the first two elements, and place it in its correct
position.

• Repeat until the entire array is sorted.

arr = [23, 1, 10, 5, 2]

Initial:

• Current element is 23

• The first element in the array is assumed to be sorted.

• The sorted part until 0th index is: [23]

First Pass:

• Compare 1 with 23 (current element with the sorted part).

• Since 1 is smaller, insert 1 before 23.

• The sorted part until 1st index is: [1, 23]

Second Pass:

• Compare 10 with 1 and 23 (current element with the sorted part).

• Since 10 is greater than 1 and smaller than 23, insert 10 between 1 and 23.

• The sorted part until 2nd index is: [1, 10, 23]

Third Pass:

• Compare 5 with 1, 10, and 23 (current element with the sorted part).

• Since 5 is greater than 1 and smaller than 10, insert 5 between 1 and 10.

• The sorted part until 3rd index is: [1, 5, 10, 23]

Fourth Pass:

• Compare 2 with 1, 5, 10, and 23 (current element with the sorted part).

• Since 2 is greater than 1 and smaller than 5, insert 2 between 1 and 5.

• The sorted part until 4th index is: [1, 2, 5, 10, 23]

Final Array:

• The sorted array is: [1, 2, 5, 10, 23]
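The passes above can be sketched in Python; the document gives no code, so this is one possible implementation of the shift-and-insert technique:

```python
# Insertion sort: grow a sorted prefix one element at a time.
def insertion_sort(arr):
    # Start from the second element; arr[0] alone is trivially sorted.
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements of the sorted part one position right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # Insert the current element at its correct position.
        arr[j + 1] = key
    return arr

print(insertion_sort([23, 1, 10, 5, 2]))  # [1, 2, 5, 10, 23]
```

Shifting rather than repeatedly swapping performs the same comparisons but fewer writes than the swap-based description.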

Complexity Analysis of Insertion Sort:


Time Complexity of Insertion Sort

• Best case: O(n), if the list is already sorted, where n is the number of elements in the list.

• Average case: O(n²), if the list is randomly ordered.

• Worst case: O(n²), if the list is in reverse order.

Space Complexity of Insertion Sort

• Auxiliary Space: O(1). Insertion sort requires only O(1) additional space, making it a space-efficient
sorting algorithm.

Advantages of Insertion Sort:

• Simple and easy to implement.

• Stable sorting algorithm.

• Efficient for small lists and nearly sorted lists.

• Space-efficient as it is an in-place algorithm.

• Adaptive: the number of swaps is directly proportional to the number of inversions. For example, no
swapping happens for an already sorted array, and sorting it takes only O(n) time.

Disadvantages of Insertion Sort:

• Inefficient for large lists.

• Not as efficient as other sorting algorithms (e.g., merge sort, quick sort) in most cases.

2. Bubble Sort Algorithm

Bubble Sort is the simplest sorting algorithm. It works by repeatedly swapping adjacent elements
if they are in the wrong order. This algorithm is not suitable for large data sets, as its average and worst-
case time complexity are quite high.

• We sort the array using multiple passes. After the first pass, the maximum element moves to the end
(its correct position). In the same way, after the second pass, the second-largest element moves to the
second-last position, and so on.

• In every pass, we process only those elements that have not already moved to their correct positions.
After k passes, the largest k elements must have moved to the last k positions.

• In a pass, we consider the remaining elements, compare all adjacent pairs, and swap them if the larger
element comes before the smaller one. Repeating this moves the largest of the remaining
elements to its correct position.
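The passes described above can be sketched in Python, including the common early-exit optimization of stopping when a pass makes no swaps:

```python
# Bubble sort: repeatedly swap adjacent out-of-order pairs.
def bubble_sort(arr):
    n = len(arr)
    for k in range(n - 1):
        swapped = False
        # After k passes the last k elements are already in place,
        # so only the first n - k - 1 adjacent pairs need comparing.
        for i in range(n - k - 1):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
        if not swapped:  # no swaps means the array is already sorted
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```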

Complexity Analysis of Bubble Sort:


Time Complexity: O(n²)
Auxiliary Space: O(1)

Advantages of Bubble Sort:

• Bubble sort is easy to understand and implement.

• It does not require any additional memory space.

• It is a stable sorting algorithm, meaning that elements with the same key value maintain their
relative order in the sorted output.

Disadvantages of Bubble Sort:

• Bubble sort has a time complexity of O(n²), which makes it very slow for large data sets.

• Bubble sort is a comparison-based sorting algorithm, which means it requires a comparison
operator to determine the relative order of elements in the input data set. This can limit the
efficiency of the algorithm in certain cases.

3. Merge Sort
Merge sort is a sorting algorithm that follows the divide-and-conquer approach. It works by recursively
dividing the input array into smaller subarrays, sorting those subarrays, and then merging them back
together to obtain the sorted array.

In simple terms, we can say that the process of merge sort is to divide the array into two halves, sort
each half, and then merge the sorted halves back together. This process is repeated until the entire array
is sorted.
How does Merge Sort work?

Merge sort is a popular sorting algorithm known for its efficiency and stability. It follows the divide-and-
conquer approach to sort a given array of elements.

Here’s a step-by-step explanation of how merge sort works:

1. Divide: Divide the list or array recursively into two halves until it can no longer be divided.

2. Conquer: Each subarray is sorted individually using the merge sort algorithm.

3. Merge: The sorted subarrays are merged back together in sorted order. The process continues
until all elements from both subarrays have been merged.

Divide:

• [38, 27, 43, 10] is divided into [38, 27 ] and [43, 10] .

• [38, 27] is divided into [38] and [27] .

• [43, 10] is divided into [43] and [10] .

Conquer:

• [38] is already sorted.

• [27] is already sorted.

• [43] is already sorted.

• [10] is already sorted.

Merge:

• Merge [38] and [27] to get [27, 38].

• Merge [43] and [10] to get [10, 43].

• Merge [27, 38] and [10, 43] to get the final sorted list [10, 27, 38, 43].

Therefore, the sorted list is [10, 27, 38, 43].
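The divide, conquer, and merge steps can be sketched in Python; this recursive version returns a new sorted list rather than sorting in place:

```python
# Merge sort: divide, sort each half recursively, then merge.
def merge_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(arr) <= 1:
        return arr
    # Divide: split into two halves and conquer each recursively.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge: repeatedly take the smaller front element of the two halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whatever remains of either half
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 10]))  # [10, 27, 38, 43]
```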

Complexity Analysis of Merge Sort:


• Time Complexity:

o Best Case: O(n log n), When the array is already sorted or nearly sorted.

o Average Case: O(n log n), When the array is randomly ordered.

o Worst Case: O(n log n), When the array is sorted in reverse order.

• Auxiliary Space: O(n), Additional space is required for the temporary array used during merging.

Advantages of Merge Sort:


• Stability: Merge sort is a stable sorting algorithm, which means it maintains the relative order of
equal elements in the input array.

• Guaranteed worst-case performance: Merge sort has a worst-case time complexity of O(n log n),
which means it performs well even on large datasets.

• Simple to implement: The divide-and-conquer approach is straightforward.

• Naturally parallel: Subarrays are merged independently, which makes merge sort suitable for parallel
processing.

Disadvantages of Merge Sort:

• Space complexity: Merge sort requires additional memory to store the merged sub-arrays
during the sorting process.

• Not in-place: Merge sort is not an in-place sorting algorithm, which means it requires additional
memory to store the sorted data. This can be a disadvantage in applications where memory
usage is a concern.

• Slower than QuickSort in general: QuickSort is more cache-friendly because it works in-place.

4. Quick Sort
QuickSort is a sorting algorithm based on the divide-and-conquer approach. It picks an element as a
pivot and partitions the given array around the picked pivot, placing the pivot in its correct position in
the sorted array.

How does QuickSort Algorithm work?

QuickSort works on the principle of divide and conquer, breaking down the problem into smaller sub-
problems.

There are mainly three steps in the algorithm:


1. Choose a Pivot: Select an element from the array as the pivot. The choice of pivot can vary (e.g.,
first element, last element, random element, or median).

2. Partition the Array: Rearrange the array around the pivot. After partitioning, all elements
smaller than the pivot will be on its left, and all elements greater than the pivot will be on its
right. The pivot is then in its correct position, and we obtain the index of the pivot.

3. Recursively Call: Recursively apply the same process to the two partitioned sub-arrays (left and
right of the pivot).

4. Base Case: The recursion stops when only one element is left in the sub-array, as a single
element is already sorted.
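The steps above can be sketched in Python using the Lomuto partition scheme with the last element as pivot (one of the pivot choices mentioned above):

```python
# Quick sort: partition around a pivot, then recurse on each side.
def partition(arr, low, high):
    pivot = arr[high]  # choose the last element as the pivot
    i = low - 1        # boundary of the "smaller than pivot" region
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot in its correct position and return its index.
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    # Base case: sub-arrays of size 0 or 1 are already sorted.
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # elements left of the pivot
        quick_sort(arr, p + 1, high)  # elements right of the pivot
    return arr

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```

Note that with this last-element pivot, an already sorted input triggers the O(n²) worst case described below; a random pivot avoids that in practice.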

Complexity Analysis of Quick Sort


Time Complexity:

• Best Case: Ω(n log n), occurs when the pivot element divides the array into two equal halves.

• Average Case: Θ(n log n), on average the pivot divides the array into two parts, but not
necessarily equal ones.

• Worst Case: O(n²), occurs when the smallest or largest element is always chosen as the pivot
(e.g., for already sorted arrays).

• Auxiliary Space: O(n) in the worst case, due to the recursive call stack.

Advantages of Quick Sort


• It is a divide-and-conquer algorithm that makes it easier to solve problems.

• It is efficient on large data sets.

• It has a low overhead, as it only requires a small amount of memory to function.


• It is Cache Friendly as we work on the same array to sort and do not copy data to any auxiliary
array.

• It is the fastest general-purpose algorithm for large data when stability is not required.

• It is tail recursive, and hence tail-call optimization can be applied.

Disadvantages of Quick Sort


• It has a worst-case time complexity of O(n2), which occurs when the pivot is chosen poorly.

• It is not a good choice for small data sets.

• It is not a stable sort, meaning that if two elements have the same key, their relative order may
not be preserved in the sorted output, because elements are swapped according to the pivot's
position without considering their original positions.
