- Part 1 – Mega List of Known Algorithms – Sorting
- Part 2 – Mega List of Known Algorithms – Searching

# Introduction

An algorithm in computing is a series of steps that can be followed to reach a certain goal. You can think of an algorithm as a function, though the semantics are somewhat different. Some algorithms are really simple, while others take near-genius to even understand. There is also a cost factor to algorithms, i.e. how efficient or performant one algorithm is with respect to another. Let’s discuss some of the well-known algorithms in computing.

# Sorting algorithms

These are algorithms that sort a list into some order specified by the programmer or user. One can sort a list of objects into ascending or descending order based on some criterion, and there are many ways to achieve this. Sorting also gives a logical order to a sequence of objects. A few of the sorting algorithms are given below.

## Selection Sort

Selection sort is the simplest sorting algorithm. Its first step is to find the smallest item in the list and exchange it with the first entry of the whole sequence. It then finds the next smallest item in the remaining sequence and exchanges it with the second entry. It continues the same process over the whole list until no unsorted items remain. Let’s move on to another sorting algorithm.
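The steps above can be sketched in Python; the function name and in-place style are illustrative choices, not part of any standard library:

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest
    remaining element and swapping it into its final position."""
    n = len(items)
    for i in range(n - 1):
        # Find the index of the smallest element in items[i:]
        smallest = i
        for j in range(i + 1, n):
            if items[j] < items[smallest]:
                smallest = j
        # Exchange it with the first unsorted entry
        items[i], items[smallest] = items[smallest], items[i]
    return items
```

Note that the algorithm always makes exactly n(n-1)/2 comparisons, regardless of how sorted the input already is.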

## Insertion Sort

In insertion sort, one picks up a single entry from the sequence and inserts it into its correct position among the entries to its left, shifting any larger entries one place to the right. During the sorting process the items to the left of the moving index are sorted among themselves, though not necessarily in their final positions; once the index reaches the far right, where the largest object ends up, we know the whole sequence is sorted. It is one of the fastest algorithms when the sequence or array is already partially in order.
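A minimal Python sketch of this idea (the function name is illustrative):

```python
def insertion_sort(items):
    """Sort a list in place by inserting each element into the
    already-sorted portion to its left."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger elements one position to the right
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items
```

On an already-sorted input the inner loop never runs, so the whole sort takes only n-1 comparisons, which is why insertion sort shines on nearly-sorted data.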

## Shellsort

This algorithm is an extension of insertion sort, and it is considerably faster. Insertion sort moves an item one position at a time, comparing and exchanging adjacent items. Shellsort instead compares non-adjacent items: it compares an item with another item that is some offset (the gap) away, and exchanges them if they are out of order. Each pass with a given gap produces a partially sorted array, and the final pass with a gap of 1 is an ordinary insertion sort that finishes the job. The sequence of gaps used is called the increment sequence, and different increment sequences give different efficiencies.
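Here is a hedged Python sketch using one common increment sequence (Knuth's 1, 4, 13, 40, …); many other sequences work too, with different performance characteristics:

```python
def shellsort(items):
    """Sort a list in place with shellsort: gapped insertion-sort
    passes over an increment sequence that shrinks down to 1."""
    n = len(items)
    # Build the largest gap in Knuth's sequence that is < n/3
    gap = 1
    while gap < n // 3:
        gap = 3 * gap + 1
    while gap >= 1:
        # Insertion sort over elements that are `gap` apart
        for i in range(gap, n):
            current = items[i]
            j = i
            while j >= gap and items[j - gap] > current:
                items[j] = items[j - gap]
                j -= gap
            items[j] = current
        gap //= 3  # move to the next smaller increment
    return items
```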

## Mergesort

Mergesort is based on the concept of merging, as the name suggests: it forms the end result by merging small sorted results. The procedure is simple: divide your array into two halves, recursively mergesort each half, and when you can’t halve any more, merge the results back together. Its time complexity is O(N log N) and its space complexity is O(N), where N is the number of items in the array or sequence. Mergesort is a classic example of the divide-and-conquer paradigm. There are two ways to implement it: top-down mergesort and bottom-up mergesort.
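A minimal top-down sketch in Python (returning a new list rather than sorting in place, which keeps the merge step simple):

```python
def mergesort(items):
    """Top-down mergesort: split the list in half, recursively sort
    each half, then merge the two sorted halves. Returns a new list."""
    if len(items) <= 1:
        return items  # a list of 0 or 1 items is already sorted
    mid = len(items) // 2
    left = mergesort(items[:mid])
    right = mergesort(items[mid:])
    # Merge the two sorted halves into one sorted list
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these two extends
    merged.extend(right[j:])  # is always a no-op
    return merged
```

Using `<=` in the merge keeps equal elements in their original relative order, making this sketch a stable sort.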

## Quicksort

Quicksort is the most widely used sorting algorithm, due to its ease of implementation and its ability to handle many different types of input. It is very fast compared with the algorithms we have covered so far. It is also based on the divide-and-conquer paradigm: it partitions the array around a pivot element into two subarrays, so that everything in the first subarray is no larger than the pivot and everything in the second is no smaller. It then sorts the two subarrays independently of each other. Once both subarrays are sorted, the whole array is sorted.
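A short Python sketch of the partition-and-recurse idea. For clarity this version builds new lists at each step; production implementations usually partition in place to avoid the extra memory:

```python
def quicksort(items):
    """Quicksort: partition around a pivot, then recursively sort
    the elements on each side. Returns a new list."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]  # middle element as pivot
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

Grouping the elements equal to the pivot into their own list means duplicates never recurse, which avoids worst-case behaviour on lists full of repeated values.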

## Bubble Sort

Bubble sort compares each pair of adjacent elements from the beginning of the list, and if they are in the wrong order it swaps them. If at least one swap was made during a pass, it repeats the process with another pass. If there are n elements in the list, then bubble sort needs at most (n-1) passes, which is why it performs badly for large values of n. On each pass the larger values move toward the end of the list like bubbles rising to the surface.

Following is the pseudocode for bubble sort.

```
SEQUENTIAL BUBBLESORT (A)
  for i ← 0 to (length[A] - 2) do
    for j ← 0 to ((length[A] - 2) - i) do
      if A[j+1] < A[j] then
        exchange A[j] ↔ A[j+1]  // swap
```
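The pseudocode above translates directly to Python; this sketch also adds the early-exit check described in the text, stopping as soon as a pass makes no swaps:

```python
def bubble_sort(items):
    """Bubble sort with early exit: repeatedly swap adjacent
    out-of-order pairs; stop once a full pass makes no swaps."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are in final position
        for j in range(n - 1 - i):
            if items[j + 1] < items[j]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break  # the list is already sorted
    return items
```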

## Heapsort

The selection sort algorithm discussed above is the basis of the heapsort algorithm. Heapsort first stores the elements of the list, one by one, in a heap (essentially a priority queue). It then removes the items from the heap in sorted order. A heap is a special type of binary tree structure in which the root node is guaranteed to hold the largest (or smallest) element. Heapsort is a much more efficient version of selection sort. Following is the pseudocode for heapsort.

```
Heapsort(L) {
    CreateHeap(L)
    for i <- length(L) downto 2 {
        exchange L[1] <-> L[i]
        heapsize <- heapsize - 1
        Heapit(L, 1)
    }
}

CreateHeap(L) {
    heapsize <- length(L)
    for i <- floor(length(L)/2) downto 1
        Heapit(L, i)
}

Heapit(L, i) {
    le <- left(i)
    ri <- right(i)
    if (le <= heapsize) and (L[le] > L[i])
        largest <- le
    else
        largest <- i
    if (ri <= heapsize) and (L[ri] > L[largest])
        largest <- ri
    if (largest != i) {
        exchange L[i] <-> L[largest]
        Heapit(L, largest)
    }
}
```

There are other sorting algorithms as well, e.g. bucket sort, shaker sort, radix sort, introsort, and timsort. Be sure to mention any others in the comments.