Insertion Sort Explained — How It Works, When to Use It, and Why
Sorting is one of the most fundamental operations in computing. Every time you see a leaderboard, a contact list, or search results ranked by relevance, something sorted that data. Understanding how sorting works — and why different approaches exist — is a cornerstone of writing efficient software and a near-universal topic in technical interviews.
How Insertion Sort Actually Works — A Step-by-Step Walkthrough
Let's say you have this list of five numbers: [64, 25, 12, 22, 11]. Insertion sort treats the very first element as already sorted — a sorted list of one is trivially sorted. It then picks up the second element and asks: 'Where does this belong among the sorted elements to my left?' If it belongs further left, it shifts the existing elements to the right to make room, then drops the new element into its correct slot.
That process repeats for every remaining element. Each pass extends the sorted portion by one. By the time the algorithm reaches the last element, the entire array is sorted.
Here's what each pass looks like for [64, 25, 12, 22, 11]:
```
Start:  [64 | 25, 12, 22, 11] — sorted portion is just [64]
Pass 1: [25, 64 | 12, 22, 11] — 25 slides left of 64
Pass 2: [12, 25, 64 | 22, 11] — 12 slides all the way left
Pass 3: [12, 22, 25, 64 | 11] — 22 slots between 12 and 25
Pass 4: [11, 12, 22, 25, 64] — 11 slides all the way left
```
The vertical bar shows the boundary between the sorted and unsorted sections. Notice it moves one step right after every pass.
```java
public class InsertionSortBasic {

    public static void insertionSort(int[] numbers) {
        int arrayLength = numbers.length;
        // Start from index 1 — the first element is already 'sorted' on its own
        for (int i = 1; i < arrayLength; i++) {
            // Pick the current element to be placed in the right position
            int currentElement = numbers[i];
            // j points to the last element in the already-sorted section
            int j = i - 1;
            // Shift elements in the sorted section to the right
            // as long as they are greater than the element we're placing
            while (j >= 0 && numbers[j] > currentElement) {
                numbers[j + 1] = numbers[j]; // slide this element one spot to the right
                j--;                         // move the comparison pointer left
            }
            // Drop currentElement into the gap we just created
            numbers[j + 1] = currentElement;

            // Show the array state after each pass so you can see progress
            System.out.print("After pass " + i + ": ");
            printArray(numbers);
        }
    }

    public static void printArray(int[] numbers) {
        System.out.print("[");
        for (int i = 0; i < numbers.length; i++) {
            System.out.print(numbers[i]);
            if (i < numbers.length - 1) System.out.print(", ");
        }
        System.out.println("]");
    }

    public static void main(String[] args) {
        int[] scores = {64, 25, 12, 22, 11};
        System.out.print("Original array: ");
        printArray(scores);
        System.out.println();
        insertionSort(scores);
        System.out.println();
        System.out.print("Sorted array: ");
        printArray(scores);
    }
}
```
```
Original array: [64, 25, 12, 22, 11]

After pass 1: [25, 64, 12, 22, 11]
After pass 2: [12, 25, 64, 22, 11]
After pass 3: [12, 22, 25, 64, 11]
After pass 4: [11, 12, 22, 25, 64]

Sorted array: [11, 12, 22, 25, 64]
```
Time and Space Complexity — What the Big-O Numbers Really Mean
Big-O notation tells you how the runtime grows as the input size grows. Don't panic — we'll build the intuition from first principles.
In the worst case (a reverse-sorted array like [5, 4, 3, 2, 1]), every single element has to travel all the way to the beginning of the sorted section. For the second element, that's 1 comparison. For the third, it's 2. For the fourth, it's 3. For n elements, the total comparisons are roughly 1 + 2 + 3 + ... + (n-1), which equals n*(n-1)/2 — and that simplifies to O(n²). Quadratic. Not great for huge datasets.
In the best case (an already-sorted array like [1, 2, 3, 4, 5]), every element looks at its left neighbour once, sees it's already in place, and stops. That's exactly n-1 comparisons — O(n). Linear. This is genuinely useful in practice.
Space complexity is O(1) — insertion sort sorts the array in-place. It only ever uses one extra variable (currentElement) regardless of how big the array is. No extra arrays, no recursion stack. This makes it memory-efficient.
The average case is O(n²), which means insertion sort is not the right choice for sorting a million records. But for small arrays (typically under 20-30 elements) or nearly-sorted data, it's genuinely one of the fastest options available — which is why real-world sort implementations like Java's Arrays.sort() use insertion sort as a fallback for small sub-arrays.
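To make that "fallback for small sub-arrays" idea concrete, here is a minimal sketch of the hybrid pattern: quicksort for large ranges, insertion sort below a cutoff. The class name, the Lomuto partition, and the cutoff of 16 are illustrative choices for this demo, not Java's actual library internals (real libraries tune the cutoff empirically).

```java
public class HybridSortSketch {
    // Below this size, insertion sort's low overhead beats recursing further.
    static final int CUTOFF = 16; // illustrative value, not a library constant

    public static void sort(int[] a) {
        quicksort(a, 0, a.length - 1);
    }

    static void quicksort(int[] a, int lo, int hi) {
        // Small range: hand it to insertion sort instead of recursing
        if (hi - lo + 1 <= CUTOFF) {
            insertionSort(a, lo, hi);
            return;
        }
        int p = partition(a, lo, hi);
        quicksort(a, lo, p - 1);
        quicksort(a, p + 1, hi);
    }

    // Standard insertion sort, restricted to the slice a[lo..hi]
    static void insertionSort(int[] a, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++) {
            int current = a[i];
            int j = i - 1;
            while (j >= lo && a[j] > current) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = current;
        }
    }

    // Simple Lomuto partition around the last element
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
        return i;
    }
}
```

The recursion does the heavy lifting of getting elements near their final positions; insertion sort then finishes each small, nearly-sorted slice cheaply.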
```java
public class InsertionSortComplexityDemo {

    // Sorts the array and counts exactly how many comparisons were made
    public static int insertionSortWithCount(int[] numbers) {
        int comparisonCount = 0;
        int arrayLength = numbers.length;
        for (int i = 1; i < arrayLength; i++) {
            int currentElement = numbers[i];
            int j = i - 1;
            while (j >= 0 && numbers[j] > currentElement) {
                comparisonCount++; // count each time we compare two elements
                numbers[j + 1] = numbers[j];
                j--;
            }
            // The comparison that ended the loop also counts — unless j went below 0
            if (j >= 0) comparisonCount++;
            numbers[j + 1] = currentElement;
        }
        return comparisonCount;
    }

    public static void main(String[] args) {
        // Best case: already sorted — O(n) behaviour
        int[] alreadySorted = {1, 2, 3, 4, 5, 6, 7, 8};
        int bestCaseComparisons = insertionSortWithCount(alreadySorted.clone());
        System.out.println("Best case (already sorted, n=8): " + bestCaseComparisons + " comparisons");

        // Worst case: reverse sorted — O(n^2) behaviour
        int[] reverseSorted = {8, 7, 6, 5, 4, 3, 2, 1};
        int worstCaseComparisons = insertionSortWithCount(reverseSorted.clone());
        System.out.println("Worst case (reverse sorted, n=8): " + worstCaseComparisons + " comparisons");

        // Average case: random order
        int[] randomOrder = {4, 2, 7, 1, 8, 3, 6, 5};
        int averageCaseComparisons = insertionSortWithCount(randomOrder.clone());
        System.out.println("Average case (random order, n=8): " + averageCaseComparisons + " comparisons");

        System.out.println();
        System.out.println("Theoretical worst case n*(n-1)/2 = " + (8 * 7 / 2) + " comparisons");
    }
}
```
```
Best case (already sorted, n=8): 7 comparisons
Worst case (reverse sorted, n=8): 28 comparisons
Average case (random order, n=8): 17 comparisons

Theoretical worst case n*(n-1)/2 = 28 comparisons
```
Insertion Sort on Real Data — Sorting Strings and Custom Objects
Numbers are easy to compare, but real applications sort strings, dates, and custom objects. The logic is identical — only the comparison changes. In Java, strings have a natural ordering (alphabetical) via the compareTo() method, which returns a negative number if the first string comes before the second alphabetically.
For custom objects, you define what 'less than' means. Sorting a list of students by grade? By name? By age? All are valid — you just swap out the comparison. This is exactly what Comparator does in Java's standard library.
Understanding this makes you realise that insertion sort (and sorting algorithms in general) are really just frameworks for answering one repeated question: 'Does element A belong before element B?' The algorithm handles the rest.
The example below sorts an array of student names alphabetically, then sorts Student objects by their exam score — showing both use cases with the same algorithm logic.
```java
public class InsertionSortStringsAndObjects {

    // --- Sorting strings alphabetically ---
    public static void insertionSortStrings(String[] names) {
        int length = names.length;
        for (int i = 1; i < length; i++) {
            String currentName = names[i];
            int j = i - 1;
            // compareTo returns negative if currentName comes before names[j] alphabetically
            while (j >= 0 && names[j].compareTo(currentName) > 0) {
                names[j + 1] = names[j]; // shift name to the right
                j--;
            }
            names[j + 1] = currentName; // place currentName in its correct slot
        }
    }

    // --- Custom class representing a student ---
    static class Student {
        String name;
        int examScore;

        Student(String name, int examScore) {
            this.name = name;
            this.examScore = examScore;
        }

        public String toString() {
            return name + " (" + examScore + ")";
        }
    }

    // --- Sorting Student objects by exam score, ascending ---
    public static void insertionSortByScore(Student[] students) {
        int length = students.length;
        for (int i = 1; i < length; i++) {
            Student currentStudent = students[i];
            int j = i - 1;
            // Shift students with higher scores to the right
            while (j >= 0 && students[j].examScore > currentStudent.examScore) {
                students[j + 1] = students[j];
                j--;
            }
            students[j + 1] = currentStudent;
        }
    }

    public static void main(String[] args) {
        // --- String sorting demo ---
        String[] playerNames = {"Zara", "Alice", "Marcus", "Bella", "Omar"};
        System.out.println("Names before sort: " + java.util.Arrays.toString(playerNames));
        insertionSortStrings(playerNames);
        System.out.println("Names after sort: " + java.util.Arrays.toString(playerNames));
        System.out.println();

        // --- Object sorting demo ---
        Student[] classStudents = {
            new Student("Zara", 88),
            new Student("Alice", 95),
            new Student("Marcus", 72),
            new Student("Bella", 95), // same score as Alice — notice stable ordering!
            new Student("Omar", 60)
        };
        System.out.println("Students before sort by score:");
        for (Student s : classStudents) System.out.println("  " + s);
        insertionSortByScore(classStudents);
        System.out.println("Students after sort by score:");
        for (Student s : classStudents) System.out.println("  " + s);
    }
}
```
```
Names before sort: [Zara, Alice, Marcus, Bella, Omar]
Names after sort: [Alice, Bella, Marcus, Omar, Zara]

Students before sort by score:
  Zara (88)
  Alice (95)
  Marcus (72)
  Bella (95)
  Omar (60)
Students after sort by score:
  Omar (60)
  Marcus (72)
  Zara (88)
  Alice (95)
  Bella (95)
```
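Since only the comparison changes between the two sorts above, they can be collapsed into one generic method that delegates the 'does A belong before B?' question to a Comparator. This is a sketch of that idea (the class is invented for this demo, not Java's library code):

```java
import java.util.Arrays;
import java.util.Comparator;

public class GenericInsertionSort {
    // One algorithm, any ordering: the Comparator decides what 'belongs before' means
    public static <T> void sort(T[] items, Comparator<? super T> order) {
        for (int i = 1; i < items.length; i++) {
            T current = items[i];
            int j = i - 1;
            // Strict 'greater than' (compare(...) > 0) keeps the sort stable
            while (j >= 0 && order.compare(items[j], current) > 0) {
                items[j + 1] = items[j];
                j--;
            }
            items[j + 1] = current;
        }
    }

    public static void main(String[] args) {
        String[] names = {"Zara", "Alice", "Marcus"};
        sort(names, Comparator.naturalOrder());  // alphabetical
        System.out.println(Arrays.toString(names));

        Integer[] scores = {64, 25, 12};
        sort(scores, Comparator.reverseOrder()); // descending
        System.out.println(Arrays.toString(scores));
    }
}
```

Swapping the Comparator changes the ordering without touching the algorithm — exactly how Java's own sort methods accept a Comparator argument.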
| Feature / Aspect | Insertion Sort | Selection Sort | Merge Sort |
|---|---|---|---|
| Best-case time complexity | O(n) — already sorted | O(n²) always | O(n log n) always |
| Worst-case time complexity | O(n²) | O(n²) | O(n log n) |
| Average-case time complexity | O(n²) | O(n²) | O(n log n) |
| Space complexity | O(1) — in-place | O(1) — in-place | O(n) — needs extra array |
| Stable sort? | Yes | No | Yes |
| Adaptive? (faster if nearly sorted) | Yes — genuinely faster | No | No |
| Online? (can sort as data arrives) | Yes | No | No |
| Best real-world use case | Small or nearly-sorted arrays | When writes are expensive | Large datasets requiring guaranteed O(n log n) |
| Used inside Java's Arrays.sort? | Yes — for small sub-arrays | No | Yes — as TimSort base |
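The 'online' row in the table deserves a quick illustration. Because insertion sort only requires the elements seen so far to be sorted, you can maintain a sorted list as values arrive one at a time. This helper class is invented for the demo; it performs one insertion step per arriving value:

```java
import java.util.ArrayList;
import java.util.List;

public class OnlineInsertion {
    private final List<Integer> sorted = new ArrayList<>();

    // One insertion-sort step: place the new value among the already-sorted values
    public void accept(int value) {
        int i = sorted.size();
        sorted.add(value);                    // append at the end...
        while (i > 0 && sorted.get(i - 1) > value) {
            sorted.set(i, sorted.get(i - 1)); // ...then shift bigger values right
            i--;
        }
        sorted.set(i, value);
    }

    public List<Integer> snapshot() {
        return new ArrayList<>(sorted);
    }

    public static void main(String[] args) {
        OnlineInsertion live = new OnlineInsertion();
        for (int score : new int[]{64, 25, 12, 22, 11}) {
            live.accept(score);
            System.out.println("After receiving " + score + ": " + live.snapshot());
        }
    }
}
```

A selection sort or merge sort cannot work this way: both need the whole input up front before they can produce a sorted prefix.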
🎯 Key Takeaways
- Insertion sort builds a sorted section one element at a time — like sliding a playing card into the right spot in your hand. The core operation is shifting, not swapping.
- Its best case is genuinely O(n) on already-sorted data — not a theoretical footnote. This adaptive behaviour is why TimSort (Java's built-in sort) uses it for small or nearly-sorted chunks.
- Stability matters in practice: insertion sort is stable because it uses strict greater-than (>), never displacing equal elements. Change > to >= and you silently break this guarantee.
- O(n²) makes it the wrong choice for large unsorted datasets — but for arrays under ~30 elements, its low overhead and cache-friendly access pattern make it faster than theoretically superior algorithms like merge sort.
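The stability takeaway is easy to verify directly: sort records by a key that has duplicates and check that equal keys keep their original left-to-right order. The record type here is invented for the demo (requires Java 16+ for records):

```java
import java.util.Arrays;

public class StabilityDemo {
    record Entry(String label, int key) {}

    // Standard insertion sort on Entry.key; strict '>' preserves stability
    static void sortByKey(Entry[] entries) {
        for (int i = 1; i < entries.length; i++) {
            Entry current = entries[i];
            int j = i - 1;
            while (j >= 0 && entries[j].key() > current.key()) {
                entries[j + 1] = entries[j];
                j--;
            }
            entries[j + 1] = current;
        }
    }

    public static void main(String[] args) {
        Entry[] entries = {
            new Entry("first-95", 95),
            new Entry("only-72", 72),
            new Entry("second-95", 95), // same key as first-95
        };
        sortByKey(entries);
        // first-95 still precedes second-95 after sorting
        System.out.println(Arrays.toString(entries));
    }
}
```

Changing the condition to `entries[j].key() >= current.key()` would shift `first-95` past `second-95`, silently reversing the tied pair.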
⚠ Common Mistakes to Avoid
- ✕ Mistake 1: Starting the outer loop at index 0 instead of index 1 — Symptom: in the implementation above you get a wasted first pass (j starts at -1, so the while guard fails immediately); in variants that check numbers[j] before checking j >= 0, you read index -1 and get an ArrayIndexOutOfBoundsException — Fix: always start at i = 1, because the element at index 0 is the initial 'sorted section' and needs no comparison.
- ✕ Mistake 2: Using >= instead of > in the inner while condition — Symptom: the sort still works but is no longer stable — equal elements swap positions unnecessarily, and you do extra work for no reason — Fix: use strictly greater than (>) so that equal elements are never displaced, preserving their original relative order.
- ✕ Mistake 3: Forgetting to save currentElement before the shifting loop — Symptom: by the time the while loop finishes shifting elements right, the value at numbers[i] has been overwritten and is lost forever, corrupting the array — Fix: always store currentElement = numbers[i] BEFORE the while loop starts, then write it back with numbers[j + 1] = currentElement after the loop.
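Mistake 3 is worth seeing in action. The broken variant below reads numbers[i] inside the loop instead of saving it first; as soon as a shift reaches index i, that value has been overwritten. The class and method are written for this demo, not taken from the listings above:

```java
import java.util.Arrays;

public class SaveBeforeShiftDemo {
    // BROKEN on purpose: compares against numbers[i], which the first shift overwrites
    static void sortBroken(int[] numbers) {
        for (int i = 1; i < numbers.length; i++) {
            int j = i - 1;
            while (j >= 0 && numbers[j] > numbers[i]) { // numbers[i] mutates below!
                numbers[j + 1] = numbers[j];
                j--;
            }
            numbers[j + 1] = numbers[i]; // writes back the wrong (shifted) value
        }
    }

    public static void main(String[] args) {
        int[] data = {2, 1};
        sortBroken(data);
        // The 1 is lost: the shift copied 2 into index 1 before numbers[i] was read
        System.out.println(Arrays.toString(data)); // prints [2, 2]
    }
}
```

One saved variable before the loop is all that separates this corrupted result from a correct sort.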
Interview Questions on This Topic
- Q: What is the best-case time complexity of insertion sort and what input triggers it? Why doesn't selection sort share this property?
- Q: Insertion sort is described as 'online' — what does that mean and can you give a real-world scenario where this property is valuable?
- Q: If insertion sort and merge sort both produce a stable sort, but merge sort is O(n log n), why would anyone ever choose insertion sort for production code?
Frequently Asked Questions
Is insertion sort faster than bubble sort?
Yes, in practice. Both are O(n²) in the worst case, but insertion sort does fewer comparisons on average because it stops comparing as soon as it finds the correct position. Bubble sort always scans the full unsorted portion. On nearly-sorted data, insertion sort is dramatically faster while bubble sort still does excessive work.
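To back that up with numbers, this sketch counts element comparisons for both algorithms on a nearly-sorted input. Both counting helpers are written for this demo (the bubble sort even gets the common early-exit flag, and it still loses):

```java
public class InsertionVsBubble {
    // Counts comparisons made by insertion sort (sorts a copy, leaves input alone)
    static int insertionComparisons(int[] input) {
        int[] a = input.clone();
        int count = 0;
        for (int i = 1; i < a.length; i++) {
            int current = a[i];
            int j = i - 1;
            while (j >= 0) {
                count++;                    // one element comparison per loop test
                if (a[j] <= current) break; // found the slot: stop comparing early
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = current;
        }
        return count;
    }

    // Counts comparisons made by bubble sort with an early-exit flag
    static int bubbleComparisons(int[] input) {
        int[] a = input.clone();
        int count = 0;
        for (int end = a.length - 1; end > 0; end--) {
            boolean swapped = false;
            for (int i = 0; i < end; i++) {
                count++;
                if (a[i] > a[i + 1]) {
                    int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // a pass with no swaps means we're done
        }
        return count;
    }

    public static void main(String[] args) {
        // Nearly sorted: only the last element is out of place
        int[] nearlySorted = {1, 2, 3, 4, 5, 6, 7, 0};
        System.out.println("Insertion sort comparisons: " + insertionComparisons(nearlySorted));
        System.out.println("Bubble sort comparisons:    " + bubbleComparisons(nearlySorted));
    }
}
```

On this input, insertion sort makes 13 comparisons while bubble sort makes 28: the out-of-place 0 forces a swap on every bubble pass, so the early-exit flag never fires.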
When should I actually use insertion sort instead of a built-in sort?
Rarely as a standalone choice for large data. But you'd reach for it when your array has fewer than 20-30 elements, when data is arriving in a stream and you need to keep a sorted list in real-time, or when memory is extremely limited (it uses O(1) space). For everything else, trust your language's built-in sort — which likely already uses insertion sort internally for small chunks anyway.
What does 'in-place' sorting mean and why does insertion sort qualify?
An in-place algorithm sorts the data by rearranging elements within the original array, using only a tiny fixed amount of extra memory regardless of input size. Insertion sort qualifies because it only ever allocates one extra variable (to hold the current element being placed). It never creates a second array. This contrasts with merge sort, which needs O(n) extra space to merge sorted halves.