
Bubble Sort Time Complexity: Best, Average and Worst Case

πŸ“ Part of: Complexity Analysis β†’ Topic 6 of 6
Understand bubble sort time complexity: O(n) best case with early exit, O(nΒ²) average and worst case.
πŸ§‘β€πŸ’» Beginner-friendly β€” no prior DSA experience needed
In this tutorial, you'll learn:
  • Why bubble sort is O(n²) in the worst and average case, with O(1) space (it sorts in-place)
  • How the early-exit optimisation (the swapped flag) turns the best case into O(n)
  • Why stability alone doesn't justify bubble sort when Timsort and merge sort are stable and O(n log n)
⚡ Quick Answer
Bubble sort compares adjacent pairs and swaps them if they're out of order, making repeated passes until nothing swaps. Heavy elements sink, light ones rise. The time complexity is O(n²): up to n passes, each with up to n - 1 comparisons. The smart optimisation: if a complete pass produces no swaps, the array is already sorted, so stop. This makes the best case (already sorted) O(n) instead of O(n²).

Bubble sort appears in every algorithms course and almost no production codebase. Its value is pedagogical: it's the first algorithm that makes O(n²) complexity intuitive. The optimised variant with early exit is worth implementing cleanly because it demonstrates the most fundamental algorithm optimisation: detect the done condition early and stop.

Bubble Sort: Naive and Optimised Implementations

bubble_sort.py Β· PYTHON
from typing import List

def bubble_sort_naive(arr: List[int]) -> List[int]:
    """
    Naive bubble sort: always does ~n² comparisons.
    Time:  O(n²) all cases
    Space: O(1)
    """
    arr = arr.copy()
    n = len(arr)
    for i in range(n):
        for j in range(n - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr


def bubble_sort_optimised(arr: List[int]) -> List[int]:
    """
    Optimised bubble sort with two improvements:
    1. After pass i, the last i elements are in place, so skip them.
    2. If no swap occurs in a full pass, the array is sorted: exit early.

    Time:  O(n) best case (already sorted)
           O(n²) average and worst case (random / reverse sorted)
    Space: O(1)
    """
    arr = arr.copy()
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):   # Each pass is shorter
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:   # No swaps → already sorted → done
            break
    return arr


# Already sorted - O(n): only 1 pass, exits immediately
print(bubble_sort_optimised([1, 2, 3, 4, 5]))
# [1, 2, 3, 4, 5]  (1 pass)

# Random - O(n²): multiple passes needed
print(bubble_sort_optimised([64, 34, 25, 12, 22, 11, 90]))
# [11, 12, 22, 25, 34, 64, 90]

# Reverse sorted - worst case O(n²): maximum swaps every pass
print(bubble_sort_optimised([5, 4, 3, 2, 1]))
# [1, 2, 3, 4, 5]  (all n passes required)
▶ Output
[1, 2, 3, 4, 5]
[11, 12, 22, 25, 34, 64, 90]
[1, 2, 3, 4, 5]

# Comparison of approaches:
# Input: [1, 2, 3, 4, 5] (already sorted, n=5)
# Naive (full inner loop every pass): n*(n-1) = 20 comparisons
# Early-exit optimised: n-1 = 4 comparisons (stops after 1 pass)
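To check these comparison counts yourself, here's a sketch that instruments the optimised algorithm with a counter. `bubble_sort_counted` is a helper introduced for this illustration, not part of the implementations above:

```python
from typing import List, Tuple

def bubble_sort_counted(arr: List[int]) -> Tuple[List[int], int]:
    """Early-exit bubble sort that also counts comparisons."""
    arr = arr.copy()
    n = len(arr)
    comparisons = 0
    for i in range(n):
        swapped = False
        for j in range(n - i - 1):
            comparisons += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr, comparisons

print(bubble_sort_counted([1, 2, 3, 4, 5]))  # sorted input: n-1 = 4 comparisons
print(bubble_sort_counted([5, 4, 3, 2, 1]))  # reverse input: n(n-1)/2 = 10 comparisons
```

With the shrinking inner loop, even the worst case does n(n-1)/2 comparisons rather than n(n-1); the early exit is what collapses the sorted case to a single pass.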
| Case                  | Time  | When It Occurs                                   |
|-----------------------|-------|--------------------------------------------------|
| Best case (optimised) | O(n)  | Array already sorted: early exit after 1 pass    |
| Best case (naive)     | O(n²) | No early exit regardless of input                |
| Average case          | O(n²) | Random order                                     |
| Worst case            | O(n²) | Reverse sorted: maximum swaps every pass         |
| Space complexity      | O(1)  | Sorts in-place, no auxiliary array               |
| Stable?               | Yes   | Equal elements are never swapped                 |

🎯 Key Takeaways

  • Bubble sort: O(n²) worst and average, O(n) best case (optimised version only). O(1) space: sorts in-place.
  • The early-exit optimisation (the swapped flag) is what enables the O(n) best case. Without it, the best case is also O(n²).
  • Bubble sort is stable (equal elements maintain their relative order), but so are Timsort and merge sort, which are also O(n log n).
  • Never use bubble sort on more than a few hundred elements in production. Language built-ins are always faster.

⚠ Common Mistakes to Avoid

  • ✕ Using the naive O(n²)-best-case implementation when the optimised version is the same code complexity: the swapped flag costs nothing and enables the O(n) best case.
  • ✕ Applying bubble sort to large inputs: at n = 100,000, O(n²) means roughly 10 billion operations. Python's sorted() (Timsort) handles this in O(n log n).
  • ✕ Thinking bubble sort's stability makes it production-viable: Timsort and merge sort are also stable and O(n log n). Stability alone doesn't justify O(n²).
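As a rough illustration of the performance gap, this sketch times the optimised bubble sort against Python's built-in sorted() on a modest random input. Absolute timings vary by machine, so treat the numbers as indicative only; the gap widens quadratically as n grows:

```python
import random
import time

def bubble_sort(arr):
    """Optimised bubble sort (same algorithm as in this article)."""
    arr = arr.copy()
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr

data = [random.randint(0, 10_000) for _ in range(2_000)]

t0 = time.perf_counter()
bubble_result = bubble_sort(data)
t_bubble = time.perf_counter() - t0

t0 = time.perf_counter()
builtin_result = sorted(data)
t_builtin = time.perf_counter() - t0

# Both produce the same sorted output; only the cost differs.
assert bubble_result == builtin_result
print(f"bubble sort: {t_bubble:.4f}s, sorted(): {t_builtin:.6f}s")
```

Even at n = 2,000 the built-in wins by orders of magnitude; at n = 100,000 bubble sort becomes unusable while sorted() still finishes in well under a second.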

Interview Questions on This Topic

  • Q: What is bubble sort's time complexity in best, average, and worst case?
  • Q: How does the early-exit optimisation change bubble sort's best-case complexity?
  • Q: Compare bubble sort to insertion sort: when would you prefer one over the other?

Frequently Asked Questions

What is the time complexity of bubble sort?

O(n²) for average and worst case. The best case is O(n) only with the optimised version that includes an early-exit flag: when no swaps occur in a full pass, the array is already sorted and the algorithm exits. The naive version without this optimisation is O(n²) even in the best case.

Is bubble sort stable?

Yes. Bubble sort only swaps adjacent elements when they're strictly out of order. Equal elements are never swapped, so their relative order is preserved. This makes it stable, but Timsort and merge sort are also stable and O(n log n), so stability is not a reason to choose bubble sort.
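A quick way to see stability in action: sort records by a single field using bubble sort's strict > comparison and check that ties keep their original order. `bubble_sort_by` is a hypothetical helper written for this demo:

```python
def bubble_sort_by(records, key):
    """Stable bubble sort comparing only key(record); strict > never swaps equals."""
    records = records.copy()
    n = len(records)
    for i in range(n):
        swapped = False
        for j in range(n - i - 1):
            if key(records[j]) > key(records[j + 1]):
                records[j], records[j + 1] = records[j + 1], records[j]
                swapped = True
        if not swapped:
            break
    return records

# (name, score) pairs; two ties at 85 and two at 90
students = [("Bea", 90), ("Al", 85), ("Cy", 90), ("Di", 85)]
print(bubble_sort_by(students, key=lambda s: s[1]))
# [('Al', 85), ('Di', 85), ('Bea', 90), ('Cy', 90)]
# Within each score, the original order survives: Al before Di, Bea before Cy.
```

If the comparison were >= instead of >, equal keys would swap on every pass and stability would be lost, which is why the strict comparison matters.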

🔥 Naren, Founder & Author

Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.

← Previous: Space Complexity: How to Analyze Memory Usage in Algorithms
Forged with 🔥 at TheCodeForge.io · Where Developers Are Forged