At its core, insertion sort is a simple sorting algorithm: at each iteration, it takes one unsorted element and places it in its correct position among the elements sorted so far. It would be fair to say that this algorithm works much like the way people sort cards in their hands. If you have played card games, you will know that players treat the first card (or cards) as already sorted, and then pick up the unsorted ones one at a time.

If the unsorted card is bigger than the cards in the player's hand, they place it on the right; otherwise, it goes further to the left. The remaining unsorted cards are placed the same way, each in its proper position. Insertion sort takes essentially the same approach.

## The Basics of How Insertion Sort Works

The three steps below give you a lowdown on the way insertion sort works:

• In the first step, the current element is compared with the elements adjacent to it (those already sorted)
• If the comparisons show the element belongs at a specific position, space is made for it by shifting the other elements one position to the right
• This procedure continues until every element present in the array finds its rightful position
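The steps above can be sketched in Python as a minimal implementation (the function name `insertion_sort` is just an illustrative choice, not something defined elsewhere in this article):

```python
def insertion_sort(arr):
    """Sort a list in place using insertion sort and return it."""
    for i in range(1, len(arr)):
        key = arr[i]          # the element to be placed
        j = i - 1
        # Shift every larger element one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key      # insert the element in its rightful position
    return arr

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

Note that the elements to the left of index `i` are always sorted, mirroring the card-player analogy from the introduction.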

## Insertion Sort Characteristics

While this in-place sorting algorithm has a wide range of characteristics, there are three important ones that everyone should become familiar with.

1. Firstly, the insertion sort algorithm is incredibly simple. Some would even say that it is the simplest one out there because of its straightforward implementation
2. If you are a programmer who regularly deals with small data sets, this algorithm will come in quite handy
3. The insertion sort algorithm is adaptive, making it ideal for partially sorted data sets

### What are Insertion Sort Algorithm Boundary Cases?

Insertion sort requires the most time when the elements are in reverse order. However, if the elements are already sorted, it requires very little time.

### Are Insertion Sort Algorithms Stable?

Yes. Insertion sort is a stable algorithm: elements with equal keys keep their original relative order, because an element is only ever moved past elements that are strictly greater than it.
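A short sketch can make stability concrete. Here cards are sorted by rank only; the suits (and the helper name `insertion_sort_by_key`) are illustrative, but the strict `>` comparison is what preserves the original order of equal ranks:

```python
def insertion_sort_by_key(items, key):
    """Return a new list sorted by key(item), preserving ties' order."""
    items = list(items)
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Strict > means equal keys are never moved past each other,
        # which is exactly what makes insertion sort stable
        while j >= 0 and key(items[j]) > key(current):
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items

cards = [("hearts", 7), ("spades", 3), ("clubs", 7), ("diamonds", 3)]
sorted_cards = insertion_sort_by_key(cards, key=lambda c: c[1])
# Equal ranks keep their original left-to-right order:
# [("spades", 3), ("diamonds", 3), ("hearts", 7), ("clubs", 7)]
```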

### When is the Best time to use the Insertion Sort Algorithm?

As mentioned earlier, insertion sort is often utilized whenever there is a small number of elements. That said, it can also come in quite handy when an input array is nearly sorted and has only a few misplaced elements.

### Which Approach Does Insertion Sort Follow?

The insertion sort algorithm follows an incremental approach, growing the sorted portion of the array one element at a time, which is one reason it is popular among programmers sorting arrays.

## Binary Insertion Sort Explained

Programmers can utilize binary search to reduce the number of comparisons in regular insertion sort. Binary insertion sort uses binary search to find the ideal location for inserting the chosen item at every iteration. Regular insertion sort performs O(i) comparisons at the ith iteration in the worst case.

Using binary search, we can reduce this to O(log i). That said, the algorithm still has a worst-case running time of around O(n^2), because of the number of element moves needed per insertion.
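A minimal sketch of this idea, using Python's standard `bisect` module to locate the insertion point (the function name `binary_insertion_sort` is illustrative):

```python
from bisect import bisect_right

def binary_insertion_sort(arr):
    """Sort a list in place using binary search to find insert points."""
    for i in range(1, len(arr)):
        key = arr[i]
        # O(log i) comparisons to locate the insertion point among arr[0:i];
        # bisect_right keeps the sort stable (equal keys go after existing ones)
        pos = bisect_right(arr, key, 0, i)
        # ... but still up to O(i) element moves to open the slot,
        # which is why the worst case remains O(n^2)
        arr[pos + 1:i + 1] = arr[pos:i]
        arr[pos] = key
    return arr

print(binary_insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```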

## Steps for Implementing Insertion Sort in Linked Lists

The steps mentioned below show how one can apply the insertion sort algorithm to a linked list.

• Start by creating a sorted (result) list, making sure it is empty
• Traverse the input list and follow the next step for each node
• Insert the current node in its correct position in the sorted (result) list
• Finally, change the head of the linked list to the head of the sorted (result) list
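The steps above can be sketched for a singly linked list as follows (the `Node` class and function name are illustrative, assuming a minimal list representation):

```python
class Node:
    def __init__(self, data, next=None):
        self.data = data
        self.next = next

def insertion_sort_list(head):
    """Return the new head of a sorted singly linked list."""
    sorted_head = None        # step 1: start with an empty result list
    current = head
    while current:            # step 2: traverse the input list
        nxt = current.next
        # step 3: insert the current node into the sorted result list
        if sorted_head is None or current.data < sorted_head.data:
            current.next = sorted_head
            sorted_head = current
        else:
            probe = sorted_head
            # <= keeps equal elements in their original order (stable)
            while probe.next and probe.next.data <= current.data:
                probe = probe.next
            current.next = probe.next
            probe.next = current
        current = nxt
    return sorted_head        # step 4: the new head of the linked list
```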

## The Main Applications of Insertion Sort

Here are two of the most common scenarios when programmers utilize insertion sort.

• Firstly, they use it whenever an array has only a few elements
• Insertion sort can also come in handy when the input array is nearly sorted, with only a few misplaced elements

## Insertion Sort Time Complexities

Here is a look at the time complexities you could encounter in insertion sort.

### Worst Case Complexity O(n^2)

Imagine there is an array in ascending order, which you plan to sort in descending order. A case like this produces the worst-case complexity. In such a situation, each element must be compared with every element already placed, giving up to (n-1) comparisons for each of the n elements.

The total number of comparisons will be n*(n-1)/2 ~ n^2
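A small instrumented sketch makes this count visible (the helper name `count_comparisons` is illustrative):

```python
def count_comparisons(arr):
    """Insertion-sort a copy of arr and count element comparisons."""
    arr = list(arr)
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if arr[j] > key:
                arr[j + 1] = arr[j]   # shift the larger element right
                j -= 1
            else:
                break                 # insertion point found
        arr[j + 1] = key
    return comparisons

n = 6
# Reverse-sorted input: every element is compared with all elements before it
print(count_comparisons(list(range(n, 0, -1))))  # n*(n-1)/2 = 15
# Already sorted input: one comparison per element, the best case
print(count_comparisons(list(range(n))))         # n-1 = 5
```

The contrast between the two calls shows why the worst case is quadratic while the best case, discussed below, stays linear.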

### Average Case Complexity O(n^2)

This complexity occurs whenever an array's elements are jumbled, meaning they are neither in descending order nor in ascending order. On average, each element is compared with about half of the elements before it, which still grows quadratically.

### Space Complexity

The space complexity is O(1), because the algorithm needs only a single extra variable to hold the element being inserted.

### Best Case Complexity O(n)

Whenever an array does not require any sorting, the outer loop runs n times, while the inner loop remains inactive and does no shifting at all. This means the number of comparisons will be proportional to n, resulting in linear complexity.

## Analysis of Time Complexity

While there is no denying how efficient insertion sort can be, if one provides an already sorted array to insertion sort, the algorithm will still run the outer loop. This requires n steps to pass over an array of n elements that were sorted to begin with, which is what makes the best-case time complexity a linear function of n.

An unsorted array requires each element to be compared against the others, meaning that each of the n elements may be compared with up to n other elements. It would also help to analyze similar algorithms like Quick Sort, Merge Sort, or Selection Sort and gauge their respective complexities.