
Stack and Queue in Python Using Lists — How They Work and When to Use Each

Where developers are forged. · Structured learning · Free forever.
📍 Part of: Collections → Topic 7 of 21
Stack and Queue in Python using lists explained with real-world analogies, runnable code, gotchas, and interview questions.
⚙️ Intermediate — basic Python knowledge assumed
In this tutorial, you'll learn
  • A Stack uses LIFO order — list.append() to push and list.pop() to pop, both O(1). The right end of the list is the top. Never touch the left end.
  • A plain Python list-based Queue is correct but slow — list.pop(0) is O(n). For any production Queue, use collections.deque with append()/popleft() (or appendleft()/pop()) for true O(1) performance.
  • Reach for a Stack when your problem involves backtracking, unwinding, or reversing (undo systems, DFS, bracket matching). Reach for a Queue when order of arrival matters (task scheduling, BFS, rate limiting).
✦ Plain-English analogy ✦ Real code with output ✦ Interview questions
Quick Answer
  • Stack: last in, first out. push/pop both at the same end. O(1) with a Python list.
  • Queue: first in, first out. add at back, remove from front. O(1) enqueue but O(n) dequeue with a plain list.
  • Use Stack for backtracking, undo, DFS, expression parsing. Use Queue for scheduling, BFS, rate limiting.
  • The O(n) pop(0) cost on lists is the single biggest gotcha — use collections.deque for production queues.
  • Both structures are about enforcing discipline: restricting where you add/remove to prevent ordering bugs.
🚨 START HERE
Stack and Queue Triage Cheat Sheet
Fast commands to diagnose common production issues with ordered data structures.
🟡Queue processing freezes — single core at 100%.
Immediate Action: Check if the queue uses list.pop(0). If yes, it is O(n) shifting — replace with deque.popleft().
Commands
py-spy top --pid <pid> (Python profiler — shows hot function)
grep -rn 'pop(0)\|insert(0,' src/ (find all O(n) queue operations)
Fix Now: Replace list with collections.deque. Change pop(0) to popleft(). Change insert(0, x) to appendleft(x).
🟡Stack/queue returns items in wrong order — logic bug.
Immediate Action: Print the data structure after each push/pop to verify LIFO or FIFO behavior matches expectations.
Commands
Add debug logging: print(f'After {op}: {list(queue)}')
grep -rn 'append.*pop(0)\|insert(0.*pop()' src/ (find mixed-end operations)
Fix Now: For Stack, use append() + pop() only. For Queue, use append() + popleft() only. Never mix ends.
🔴OOM kill — queue consuming all available memory.
Immediate Action: Check if the queue has a bounded size. If not, set maxlen on the deque.
Commands
tracemalloc (Python stdlib; snapshot allocations to find the growing structure)
grep -rn 'deque()' src/ (find unbounded deque instances)
Fix Now: Set a maxlen, e.g. deque(maxlen=10000). When full, append() automatically drops the oldest item from the front.
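The maxlen behavior is easy to verify in a REPL; a minimal sketch (variable names are illustrative):

```python
from collections import deque

# Bounded queue: enqueue at the right, dequeue from the left.
recent = deque(maxlen=3)
for item in ["a", "b", "c", "d"]:
    recent.append(item)  # once full, append() silently drops the leftmost (oldest) item

print(list(recent))  # ['b', 'c', 'd']: 'a' was dropped when 'd' arrived
```

Note the drop is silent: no exception, no return value. If losing items must be an error rather than a policy, queue.Queue(maxsize=N) blocks or raises queue.Full instead.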
Production Incident: The Print Queue That Froze the Billing System
A hospital billing system used a Python list as a queue to process invoice print jobs. During peak hours with 80,000 queued invoices, batch processing stalled for 45 seconds, causing a cascade of timeouts across downstream services.
Symptom: Invoice processing latency spiked from 2ms to 45 seconds during peak hours. The billing service CPU hit 100% on a single core. Downstream payment reconciliation services timed out waiting for invoice confirmations. No exceptions were thrown — the code was correct, just catastrophically slow.
Assumption: The team initially blamed the printer driver or network latency, because the code worked fine in staging with 200 test invoices. The slowness only appeared in production with real volume.
Root cause: The print queue was implemented as a plain Python list with list.pop(0) for dequeue. Each pop(0) operation shifts every remaining element one position left in memory — O(n). With 80,000 invoices, each dequeue cost 80,000 memory operations. Processing 10,000 invoices in a loop meant 10,000 * 80,000 = 800 million shift operations. The code was correct in logic but had an O(n^2) effective time complexity for batch processing.
Fix:
1. Replaced the plain list with collections.deque. Dequeue changed from list.pop(0) at O(n) to deque.popleft() at O(1). Processing 80,000 invoices dropped from 45 seconds to 12 milliseconds.
2. Added a performance regression test that enqueues 100,000 items and verifies dequeue completes in under 100ms.
3. Added a lint rule that flags list.pop(0) and list.insert(0, ...) as potential performance issues.
4. Documented the deque requirement in the team's Python performance guidelines.
Key Lesson
  • A plain list as a Queue is correct but has O(n) dequeue. This is invisible at small scale and catastrophic at large scale.
  • Performance bugs that only appear at production volume are the hardest to catch — staging with 200 items will never reveal an O(n) vs O(1) difference.
  • collections.deque is the correct Python Queue implementation. Use it from the start, not as a fix after a production incident.
  • Lint rules that flag list.pop(0) and list.insert(0, ...) prevent this class of bug from entering production.
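The regression test described in the fix can be sketched like this (the function name and 100ms budget are illustrative, not the team's actual code):

```python
import time
from collections import deque

def test_dequeue_performance():
    """Regression guard: draining a large backlog must stay O(1) per item."""
    q = deque()
    for i in range(100_000):
        q.append(i)  # build a 100k-item backlog

    start = time.perf_counter()
    while q:
        q.popleft()  # O(1); list.pop(0) here would take seconds, not milliseconds
    elapsed = time.perf_counter() - start

    # Generous budget: any O(1) dequeue passes easily, any O(n) dequeue fails hard.
    assert elapsed < 0.1, f"dequeue of 100k items took {elapsed:.3f}s (budget: 100ms)"

test_dequeue_performance()
print("performance regression test passed")
```

A test like this catches the bug class, not just this one bug: swap the deque back to a list and it fails immediately.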
Production Debug Guide
Symptom-driven investigation paths for ordering and performance failures.
  • Queue processing latency grows linearly with queue size: check if dequeue uses list.pop(0). If yes, replace with collections.deque and popleft(). The O(n) shift cost is the cause.
  • Stack returns items in the wrong order (FIFO instead of LIFO): check if the code uses append() with pop(0) instead of append() with pop(). Mixing ends breaks the LIFO contract.
  • IndexError on empty stack/queue with a confusing traceback: add is_empty() guards before pop/dequeue/peek operations. Raise descriptive exceptions instead of letting bare list.pop() fail.
  • Queue processes items out of order in a multi-threaded system: check if the queue is shared across threads without locking. A plain list is not thread-safe. Use queue.Queue (stdlib) or multiprocessing.Queue for concurrent access.
  • Memory grows unboundedly — queue never shrinks: check if deque maxlen is set. Without maxlen, a deque grows until memory is exhausted. Set maxlen to bound the queue size and let old items drop automatically.
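For the multi-threaded case, here is a minimal producer-consumer sketch using the stdlib's thread-safe queue.Queue (the worker logic and None-sentinel shutdown are illustrative conventions, not the only way to do it):

```python
import threading
import queue

task_queue = queue.Queue()  # thread-safe: put()/get() use internal locking

def worker(results):
    while True:
        item = task_queue.get()  # blocks until an item is available
        if item is None:         # sentinel value tells the worker to shut down
            break
        results.append(item * 2)
        task_queue.task_done()

results = []
t = threading.Thread(target=worker, args=(results,))
t.start()

for n in [1, 2, 3]:
    task_queue.put(n)
task_queue.put(None)  # send the shutdown sentinel
t.join()

print(results)  # items processed in FIFO order: [2, 4, 6]
```

The same code with a plain shared list would race on the check-then-remove step; queue.Queue makes get() atomic and blocking, which is why it is the right tool here.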

Stack and Queue are the two simplest ordered data structures, yet they underpin nearly every system that processes work in a defined sequence. Browser back-buttons, print spoolers, task schedulers, compiler parsers, BFS/DFS traversals — all rely on one of these two structures.

The key insight: both are wrappers around a plain list that impose access restrictions. A Stack only touches the right end. A Queue adds to the right and removes from the left. These restrictions are the feature — they prevent accidental ordering bugs that a free-form list would allow.

A common misconception is that a Python list works equally well for both. It does not. Stack operations (append/pop) are both O(1). Queue operations require removing from the front (pop(0)), which is O(n) because Python shifts every element left in memory. For production queues, collections.deque is the correct choice.

The Stack — Last In, First Out Using a Python List

A Stack enforces one golden rule: the last item you put in is always the first item you take out. Computer scientists call this LIFO — Last In, First Out. Think of it like the undo history in a text editor. Every change you make gets pushed onto the stack. When you hit Ctrl+Z, the most recent change is popped off and reversed. You can never undo something from three steps ago without undoing the two steps in front of it first.

Python's list is a natural fit for a Stack because appending to the end is O(1) — it's blindingly fast. Removing from the end with pop() is also O(1). So both the core Stack operations — push and pop — cost basically nothing in time.

The key discipline is that you only ever touch one end of the list: the right end (the top of the stack). The moment you start inserting or removing from the middle or the left, you've broken the Stack contract and introduced bugs that will be very hard to trace.

browser_history_stack.py · PYTHON
# A Stack implemented with a Python list.
# Real-world scenario: browser back-button history.

class BrowserHistoryStack:
    def __init__(self):
        # The list acts as our stack storage.
        # The RIGHT end of the list is the TOP of the stack.
        self._history = []

    def push(self, url: str) -> None:
        """Visit a new page — push the URL onto the top of the stack."""
        self._history.append(url)  # append() is O(1) — always adds to the right end
        print(f"  Visited: {url}")

    def pop(self) -> str:
        """Go back — remove and return the most recently visited page."""
        if self.is_empty():
            raise IndexError("No history to go back to — stack is empty")
        previous_page = self._history.pop()  # pop() with no argument removes from the RIGHT end — O(1)
        print(f"  Going back to: {self._history[-1] if self._history else 'Start page'}")
        return previous_page

    def peek(self) -> str:
        """See the current page without removing it."""
        if self.is_empty():
            raise IndexError("Stack is empty — no current page")
        return self._history[-1]  # -1 index always gives us the top of the stack

    def is_empty(self) -> bool:
        return len(self._history) == 0

    def size(self) -> int:
        return len(self._history)

    def __repr__(self) -> str:
        # Display the stack so the top is on the RIGHT (most intuitive for lists)
        return f"BrowserHistoryStack({self._history}) <- TOP"


# --- Let's simulate a browsing session ---
history = BrowserHistoryStack()

history.push("https://google.com")
history.push("https://thecodeforge.io")
history.push("https://thecodeforge.io/python-stacks")

print(f"\nCurrent stack: {history}")
print(f"Currently on: {history.peek()}")
print(f"Stack size: {history.size()}")

print("\n-- Pressing back twice --")
history.pop()
history.pop()

print(f"\nCurrent stack: {history}")
print(f"Currently on: {history.peek()}")
▶ Output
Visited: https://google.com
Visited: https://thecodeforge.io
Visited: https://thecodeforge.io/python-stacks

Current stack: BrowserHistoryStack(['https://google.com', 'https://thecodeforge.io', 'https://thecodeforge.io/python-stacks']) <- TOP
Currently on: https://thecodeforge.io/python-stacks
Stack size: 3

-- Pressing back twice --
Going back to: https://thecodeforge.io
Going back to: https://google.com

Current stack: BrowserHistoryStack(['https://google.com']) <- TOP
Currently on: https://google.com
💡Pro Tip: Guard Every Pop and Peek
  • list.pop() on an empty list raises IndexError with a generic message.
  • A wrapped class can raise a domain-specific error: 'Cannot undo — no history'
  • The wrapper also prevents accidental middle-element access via del list[i].
  • Production code should always wrap raw data structures in domain classes.
📊 Production Insight
A Stack that only exposes push, pop, peek, and is_empty is a disciplined wrapper around a list. The restriction is the feature — it prevents bugs like accidentally removing the wrong element or inserting in the middle. In production, always wrap the list in a class that enforces the Stack contract. A raw list is too permissive.
🎯 Key Takeaway
A Stack is a list with a discipline: only touch the right end. append() is push, pop() is pop — both O(1). The restriction prevents ordering bugs. Always wrap in a domain class.
When to Use a Stack
If: Problem requires processing the most recent item first (undo, redo, DFS, backtracking)
Use: A Stack. LIFO order matches the access pattern.
If: Problem requires processing items in arrival order (task scheduling, BFS)
Use: A Queue. FIFO order matches the access pattern.
If: Problem requires accessing items in arbitrary order
Use: Neither Stack nor Queue. Use a list, set, or map depending on access pattern.
If: Problem involves matching nested structures (brackets, HTML tags, expression evaluation)
Use: A Stack. The LIFO property naturally matches the nesting order.

The Queue — First In, First Out Using a Python List (and Why Naive Lists Are Slow)

A Queue enforces the opposite rule: the first item in is the first item out — FIFO. Think of tickets in a support system. The customer who raised a ticket first should get helped first. Nobody skips the line.

Here's where Python beginners hit a wall. You might assume you can just use list.insert(0, item) to add to the front and list.pop() to remove from the back — or append() to add to the back and pop(0) to remove from the front. Both approaches work correctly but the pop(0) or insert(0, ...) operations are O(n). Every time you remove from the front of a Python list, Python has to shift every remaining element one position to the left in memory. On a list with 100,000 items, that's 100,000 memory operations for a single dequeue. This kills performance.

For a true production Queue, Python's standard library gives you collections.deque (double-ended queue) which solves this in O(1). But understanding the list-based version first is essential — it's the foundation, and it's what interviewers test you on to see if you understand the underlying cost.

support_ticket_queue.py · PYTHON
# A Queue implemented with a Python list.
# Real-world scenario: customer support ticket processing.
# We'll also benchmark the naive approach to show WHY deque exists.

import time

class SupportTicketQueue:
    def __init__(self):
        # The list acts as our queue storage.
        # RIGHT end = back of queue (where new tickets are added).
        # LEFT end  = front of queue (where tickets are processed next).
        self._tickets = []

    def enqueue(self, ticket_id: str) -> None:
        """Add a new support ticket to the back of the queue."""
        self._tickets.append(ticket_id)  # append() is O(1) — fast
        print(f"  Ticket {ticket_id} added to queue")

    def dequeue(self) -> str:
        """Process the next ticket — remove from the front of the queue."""
        if self.is_empty():
            raise IndexError("No tickets in queue — nothing to process")
        # pop(0) removes the FIRST element — but this is O(n) on a plain list!
        # Every element shifts left by one position in memory.
        # This is fine for small queues; use collections.deque for large ones.
        next_ticket = self._tickets.pop(0)
        print(f"  Processing ticket: {next_ticket}")
        return next_ticket

    def peek(self) -> str:
        """See which ticket is next without processing it."""
        if self.is_empty():
            raise IndexError("Queue is empty")
        return self._tickets[0]  # Front of the queue is always index 0

    def is_empty(self) -> bool:
        return len(self._tickets) == 0

    def size(self) -> int:
        return len(self._tickets)

    def __repr__(self) -> str:
        return f"FRONT -> {self._tickets} <- BACK"


# --- Simulate a support queue ---
ticket_queue = SupportTicketQueue()

ticket_queue.enqueue("TKT-001")  # First customer — should be helped first
ticket_queue.enqueue("TKT-002")
ticket_queue.enqueue("TKT-003")

print(f"\nQueue state: {ticket_queue}")
print(f"Next up: {ticket_queue.peek()}")
print(f"Tickets waiting: {ticket_queue.size()}")

print("\n-- Processing tickets in order --")
ticket_queue.dequeue()  # TKT-001 goes first — FIFO in action
ticket_queue.dequeue()  # TKT-002 goes second

print(f"\nQueue state: {ticket_queue}")

# --- Now let's see the O(n) cost of pop(0) on a large list ---
print("\n-- Performance comparison: pop(0) vs pop() --")

large_list_front = list(range(500_000))  # 500,000 items
large_list_back  = list(range(500_000))

start = time.perf_counter()
for _ in range(10_000):
    large_list_front.pop(0)  # Removing from the FRONT — O(n) each time
elapsed_front = time.perf_counter() - start

start = time.perf_counter()
for _ in range(10_000):
    large_list_back.pop()    # Removing from the BACK — O(1) each time
elapsed_back = time.perf_counter() - start

print(f"  pop(0) — removing from front: {elapsed_front:.4f}s")
print(f"  pop()  — removing from back:  {elapsed_back:.4f}s")
print(f"  pop(0) is roughly {elapsed_front / elapsed_back:.1f}x slower")
▶ Output
Ticket TKT-001 added to queue
Ticket TKT-002 added to queue
Ticket TKT-003 added to queue

Queue state: FRONT -> ['TKT-001', 'TKT-002', 'TKT-003'] <- BACK
Next up: TKT-001
Tickets waiting: 3

-- Processing tickets in order --
Processing ticket: TKT-001
Processing ticket: TKT-002

Queue state: FRONT -> ['TKT-003'] <- BACK

-- Performance comparison: pop(0) vs pop() --
pop(0) — removing from front: 0.3821s
pop() — removing from back: 0.0008s
pop(0) is roughly 477.6x slower
⚠ Watch Out: pop(0) Is a Performance Trap
Never use a plain list as a Queue in performance-sensitive code. The benchmark above shows pop(0) can be 400-500x slower than pop() on large lists. Switch to collections.deque — it's designed for O(1) appends and pops from both ends, making it the correct Queue implementation in Python.
📊 Production Insight
The O(n) pop(0) cost is invisible at small scale and catastrophic at large scale. A queue with 1,000 items feels instant. The same code with 100,000 items freezes. This is the most common Python performance bug in queue-based systems — the code is correct, the logic is sound, but the data structure choice is wrong. Always use deque for queues. If you see list.pop(0) in a loop, it is a bug.
🎯 Key Takeaway
A plain list as a Queue is correct but has O(n) dequeue. The pop(0) shift cost is invisible at small scale and catastrophic at production scale. Use collections.deque for any real queue.
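To make the fix concrete, here is the ticket queue rebuilt on collections.deque. Same interface, but dequeue drops from O(n) to O(1). The class name mirrors the earlier example; the bodies are a sketch, not production code:

```python
from collections import deque

class SupportTicketQueue:
    """Same interface as the list-based version, but dequeue is O(1)."""
    def __init__(self):
        self._tickets = deque()  # deque gives O(1) appends and pops at BOTH ends

    def enqueue(self, ticket_id: str) -> None:
        self._tickets.append(ticket_id)   # O(1), same as the list version

    def dequeue(self) -> str:
        if self.is_empty():
            raise IndexError("No tickets in queue — nothing to process")
        return self._tickets.popleft()    # O(1): no element shifting

    def peek(self) -> str:
        if self.is_empty():
            raise IndexError("Queue is empty")
        return self._tickets[0]           # front of the queue, still O(1)

    def is_empty(self) -> bool:
        return len(self._tickets) == 0

q = SupportTicketQueue()
for tid in ("TKT-001", "TKT-002", "TKT-003"):
    q.enqueue(tid)
print(q.dequeue())  # TKT-001: FIFO preserved, now at O(1)
```

The only lines that change from the list version are the constructor and popleft() in dequeue; the calling code does not change at all, which is what makes this a safe production fix.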
Choosing the Right Queue Implementation
If: Queue size < 1,000 items, single-threaded
Use: A plain list with pop(0) works. Correct but not optimal.
If: Queue size > 1,000 items, single-threaded
Use: collections.deque. popleft() is O(1) vs pop(0) at O(n).
If: Multi-threaded producer-consumer
Use: queue.Queue (stdlib). It provides thread-safe put/get with blocking.
If: Bounded queue (must not exceed N items)
Use: deque(maxlen=N). When full, append() drops the oldest item automatically.
If: Priority-based processing (not FIFO)
Use: Neither Stack nor Queue. Use heapq (min-heap) for priority queue semantics.
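The heapq row deserves a quick illustration. A minimal priority-queue sketch (the ticket names are made up):

```python
import heapq

# Min-heap as a priority queue: the lowest priority number is served first.
# heapq compares the (priority, ticket) tuples element by element.
pending = []
heapq.heappush(pending, (3, "TKT-low"))
heapq.heappush(pending, (1, "TKT-critical"))
heapq.heappush(pending, (2, "TKT-normal"))

served = []
while pending:
    priority, ticket = heapq.heappop(pending)  # always the smallest priority left
    served.append(ticket)

print(served)  # ['TKT-critical', 'TKT-normal', 'TKT-low']
```

Note this is not FIFO: arrival order is ignored entirely, which is exactly why the table routes priority-based processing away from both Stack and Queue.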

When to Use a Stack vs Queue — Real Patterns You'll Actually Encounter

Knowing the mechanics is only half the battle. The real skill is recognising which structure fits the problem in front of you. Here's a reliable mental model: if your problem is about reversing, unwinding, or backtracking — use a Stack. If your problem is about maintaining order of arrival and processing things fairly — use a Queue.

Stacks show up in: undo/redo systems, function call management (the call stack is literally a stack), balanced bracket validation in parsers, depth-first graph traversal, and expression evaluation in calculators.

Queues show up in: task scheduling, print spoolers, breadth-first graph traversal, request handling in web servers, rate limiters, and any producer-consumer pipeline where you want to process work in arrival order.
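The BFS/Queue and DFS/Stack pairing can be seen directly by swapping only the frontier structure. A sketch on a small hypothetical graph:

```python
from collections import deque

# A tiny made-up graph: A branches to B and C, which lead to D and E.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": []}

def bfs(start):
    """Queue frontier: visit nodes in arrival order, level by level."""
    order, seen, frontier = [], {start}, deque([start])
    while frontier:
        node = frontier.popleft()  # FIFO: oldest discovered node first
        order.append(node)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return order

def dfs(start):
    """Stack frontier: dive deep before backtracking."""
    order, seen, frontier = [], {start}, [start]
    while frontier:
        node = frontier.pop()      # LIFO: newest discovered node first
        order.append(node)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return order

print(bfs("A"))  # ['A', 'B', 'C', 'D', 'E']  (level by level)
print(dfs("A"))  # ['A', 'C', 'E', 'B', 'D']  (one branch fully, then the next)
```

The two functions are character-for-character identical except for popleft() vs pop(). The data structure is the algorithm.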

The example below shows bracket validation — a Stack-based algorithm that appears constantly in coding interviews and real compilers. It's a perfect illustration because the stack's LIFO property is exactly what lets you match the most recently opened bracket first.

bracket_validator_stack.py · PYTHON
# Real-world Stack use case: validating balanced brackets.
# This exact logic is used in code editors, compilers, and JSON parsers.

def is_balanced(expression: str) -> bool:
    """
    Returns True if all brackets in the expression are correctly matched.
    Uses a Stack to track open brackets as we scan left to right.
    """
    # Map each closing bracket to its expected opening bracket
    matching_open = {')': '(', ']': '[', '}': '{'}
    closing_brackets = set(matching_open.keys())
    opening_brackets = set(matching_open.values())

    bracket_stack = []  # Our stack — stores unmatched opening brackets

    for character in expression:
        if character in opening_brackets:
            # We found an opener — push it onto the stack and move on
            bracket_stack.append(character)

        elif character in closing_brackets:
            # We found a closer — the stack top MUST be its matching opener
            if not bracket_stack:
                # Closer with nothing on the stack — unmatched closer
                return False

            top_of_stack = bracket_stack.pop()  # Pop the most recent opener

            if top_of_stack != matching_open[character]:
                # Top of stack doesn't match this closer — mismatched pair
                return False

    # If the stack is empty, every opener was matched and closed
    # If not empty, some openers were never closed
    return len(bracket_stack) == 0


# --- Test the validator ---
test_cases = [
    ("({[]})",          True),   # Perfectly nested — all matched
    ("([)]",            False),  # Wrong order — square closed before round
    ("{[()()]}",        True),   # Multiple levels of nesting — all matched
    ("(((",             False),  # All openers, no closers
    (")))",             False),  # All closers, no openers
    ("def func(a[0]):", True),   # Real code-like expression
    ("{name: [1,2,3]}", True),   # JSON-like structure
]

print("Bracket Validation Results:")
print("-" * 45)
for expression, expected in test_cases:
    result = is_balanced(expression)
    status = "PASS" if result == expected else "FAIL"
    print(f"  [{status}] '{expression}' -> {result}")
▶ Output
Bracket Validation Results:
---------------------------------------------
[PASS] '({[]})' -> True
[PASS] '([)]' -> False
[PASS] '{[()()]}' -> True
[PASS] '(((' -> False
[PASS] ')))' -> False
[PASS] 'def func(a[0]):' -> True
[PASS] '{name: [1,2,3]}' -> True
🔥Interview Gold: Bracket Validation
  • When you see a closing bracket, the matching opener is always the most recently opened one.
  • A Stack naturally gives you the most recently added item — that is exactly what you need.
  • A Queue would give you the earliest opener, which is wrong for nested structures.
  • This is why LIFO is not just a preference — it is the algorithmic requirement.
📊 Production Insight
Bracket validation is used in real compilers, JSON parsers, HTML validators, and code editors. The Stack is not a toy data structure here — it is the core algorithm. In production, a parser that fails on mismatched brackets can cause silent data corruption (malformed JSON accepted) or security vulnerabilities (injection through unclosed tags). The Stack's LIFO property is what guarantees correct nesting detection.
🎯 Key Takeaway
Stack = backtracking, unwinding, matching (LIFO). Queue = scheduling, ordering, fair processing (FIFO). Bracket validation is the canonical Stack interview problem — the LIFO property is the algorithm, not just the implementation.
Stack vs Queue: Decision Flowchart
If: Need to process the most recent item first (undo, backtracking, matching)
Use: Stack. LIFO matches the access pattern.
If: Need to process items in arrival order (scheduling, BFS, fair processing)
Use: Queue. FIFO matches the access pattern.
If: Need to match nested structures (brackets, tags, expressions)
Use: Stack. LIFO naturally matches the nesting depth.
If: Need to explore all neighbors at the current depth before going deeper
Use: Queue for BFS. Process all items at level N before level N+1.
If: Need to explore as deep as possible before backtracking
Use: Stack for DFS. Process the most recently discovered node first.
🗂 Stack vs Queue: Complete Comparison
Trade-offs, performance, and use cases for both ordered data structures.
Feature / Aspect | Stack (LIFO) | Queue (FIFO)
Order principle | Last In, First Out | First In, First Out
Add operation | push: append(), O(1) | enqueue: append(), O(1)
Remove operation | pop: list.pop(), O(1) | dequeue: list.pop(0), O(n) ⚠
Active end | Only the right/top end | Add at right, remove from left
Best Python implementation | list (built-in) | collections.deque (standard lib)
Typical use cases | Undo, call stack, DFS, parsers | Task queues, BFS, scheduling
Peek operation cost | O(1): list[-1] | O(1): list[0]
Risk with plain list | None: both ops are O(1) | pop(0) is O(n): use deque instead
Real-world analogy | Stack of plates | Coffee shop line
Thread safety | Not thread-safe (raw list) | Not thread-safe (raw list or deque)
Thread-safe alternative | N/A (rarely shared across threads) | queue.Queue (stdlib) with blocking put/get
Bounded size support | Not built-in (check manually) | deque(maxlen=N): auto-drops oldest

🎯 Key Takeaways

  • A Stack uses LIFO order — list.append() to push and list.pop() to pop, both O(1). The right end of the list is the top. Never touch the left end.
  • A plain Python list-based Queue is correct but slow — list.pop(0) is O(n). For any production Queue, use collections.deque with append()/popleft() (or appendleft()/pop()) for true O(1) performance.
  • Reach for a Stack when your problem involves backtracking, unwinding, or reversing (undo systems, DFS, bracket matching). Reach for a Queue when order of arrival matters (task scheduling, BFS, rate limiting).
  • The bracket validation algorithm is a must-know Stack interview pattern — practise explaining WHY the LIFO property is what makes it work, not just how to code it.

⚠ Common Mistakes to Avoid

    Using pop(0) for a Queue in a loop on large datasets
    Symptom

    Code works correctly but becomes noticeably slow as the list grows, eventually freezing on datasets above ~50,000 items.

    Fix

    Replace list.pop(0) with collections.deque and use popleft() instead, which is O(1) and designed exactly for this purpose.

    Confusing which end is the 'top' of the Stack
    Symptom

    The Stack works but the items come out in the wrong order because you're mixing append() with pop(0), or insert(0, ...) with pop().

    Fix

    Commit to one convention: always use list.append() to push and list.pop() (no argument) to pop. The right end of the list is always the top of the stack.

    Not guarding pop() and peek() on empty structures
    Symptom

    IndexError with a confusing traceback pointing inside your class rather than to the calling code.

    Fix

    Always check is_empty() before any removal or peek operation and raise a descriptive exception yourself, e.g. raise IndexError('Cannot pop from an empty stack'), so the error message tells you exactly what went wrong.

    Using a plain list as a shared queue across threads
    Symptom

    Items processed out of order, duplicate processing, or silent data loss.

    Fix

    A plain list is not thread-safe. Use queue.Queue (stdlib) for thread-safe producer-consumer patterns. It provides put() and get() with optional blocking and timeouts.

    Not setting maxlen on deque for bounded queues
    Symptom

    Queue grows unboundedly under load, consuming all available memory.

    Fix

    Set deque(maxlen=N). When the deque is full, append() automatically drops the oldest item from the front (left end). This bounds memory usage.

    Treating Stack and Queue as interchangeable
    Symptom

    Algorithm produces wrong results because the processing order is wrong.

    Fix

    Stack and Queue are not interchangeable. LIFO and FIFO produce fundamentally different traversal orders. BFS with a Stack gives DFS. Undo with a Queue gives redo of the oldest change, not the most recent.

Interview Questions on This Topic

  • Q: What is the time complexity of enqueue and dequeue when you implement a Queue using a plain Python list, and how would you fix any performance issue you find? (Tests whether you know pop(0) is O(n) and that collections.deque is the correct solution.)
  • Q: Can you implement a Stack that supports push, pop, peek, and a get_minimum() operation — all in O(1) time? (Classic interview problem — the trick is maintaining a second 'min stack' in parallel that tracks the minimum at every level.)
  • Q: You have a Stack. Using only push and pop operations on that Stack (no extra arrays), how would you reverse the order of all its elements? (Tricky follow-up — the answer requires a recursive approach or a second temporary stack, and tests whether you truly understand LIFO.)
  • Q: How would you implement a Queue using two Stacks? Walk me through the amortized O(1) dequeue approach. (Tests understanding of both structures and amortized analysis.)
  • Q: What is the difference between collections.deque and queue.Queue in Python? When would you use each? (Tests knowledge of thread safety — deque is not thread-safe; Queue provides put/get with blocking.)
  • Q: Explain why BFS uses a Queue and DFS uses a Stack. What happens if you swap them? (Tests fundamental understanding — swapping gives the wrong traversal order.)
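The get_minimum() question above has a classic answer: keep a second stack that records the running minimum at every level. A minimal sketch (class and method names are illustrative):

```python
class MinStack:
    """Stack with O(1) get_minimum(): a parallel stack tracks the running minimum."""
    def __init__(self):
        self._items = []
        self._mins = []   # invariant: _mins[-1] is the minimum of _items

    def push(self, value):
        self._items.append(value)
        # Push the smaller of (value, current minimum) onto the min stack.
        self._mins.append(value if not self._mins else min(value, self._mins[-1]))

    def pop(self):
        self._mins.pop()           # the two stacks always shrink together
        return self._items.pop()

    def get_minimum(self):
        return self._mins[-1]      # O(1): no scan of the items needed

s = MinStack()
for v in (5, 2, 8, 1):
    s.push(v)
print(s.get_minimum())  # 1
s.pop()
print(s.get_minimum())  # 2: the min 'rewinds' automatically as items pop off
```

The trade-off is O(n) extra memory for the min stack in exchange for O(1) queries, and the LIFO discipline is what keeps the invariant true.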

Frequently Asked Questions

Should I use a Python list or collections.deque to implement a Queue?

Use collections.deque for any real Queue. A plain list works correctly but list.pop(0) — the dequeue operation — is O(n) because Python shifts every remaining element in memory. deque.popleft() is O(1). For learning or tiny datasets the list is fine; for anything in production, use deque.

What is the difference between a Stack and a Queue in Python?

A Stack is LIFO — the last item you add is the first one you remove, like a stack of plates. A Queue is FIFO — the first item you add is the first one you remove, like a waiting line. Both can be built on a Python list, but the direction you add and remove items is opposite.

Why does Python not have a built-in Stack class?

Because a plain Python list already behaves as a perfect Stack out of the box. list.append() is push and list.pop() is pop — both are O(1). There's no need for a separate class. If you want a formal interface with named methods and safety guards, you wrap the list in your own class, which is exactly what the examples in this article do.

How do you implement a Queue with two Stacks?

Use one stack for enqueue (push) and another for dequeue. When the dequeue stack is empty, pop all items from the enqueue stack and push them onto the dequeue stack — this reverses the order, giving FIFO. Dequeue is amortized O(1) because each item is moved at most once.
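That two-stack approach in code, as a minimal sketch (the class and attribute names are illustrative):

```python
class QueueFromTwoStacks:
    """FIFO queue built from two LIFO stacks; dequeue is amortized O(1)."""
    def __init__(self):
        self._inbox = []   # enqueue pushes here
        self._outbox = []  # dequeue pops from here

    def enqueue(self, item):
        self._inbox.append(item)  # O(1)

    def dequeue(self):
        if not self._outbox:
            # Refill: reversing the inbox turns LIFO order into FIFO order.
            while self._inbox:
                self._outbox.append(self._inbox.pop())
        if not self._outbox:
            raise IndexError("dequeue from empty queue")
        return self._outbox.pop()  # amortized O(1): each item moves at most once

q = QueueFromTwoStacks()
q.enqueue(1); q.enqueue(2); q.enqueue(3)
print(q.dequeue())  # 1
q.enqueue(4)
print(q.dequeue())  # 2
print(q.dequeue())  # 3
print(q.dequeue())  # 4: FIFO order holds even across refills
```

This is the standard interview answer: any single dequeue may cost O(n) during a refill, but across the life of the queue each element is pushed and popped exactly twice, hence amortized O(1).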

Is collections.deque thread-safe?

No. collections.deque provides atomic appends and pops (the GIL protects single operations), but compound operations like 'if deque: deque.popleft()' are not atomic. For thread-safe queues, use queue.Queue from the standard library, which provides blocking put() and get() methods with proper synchronization.

When should I use deque(maxlen=N)?

Use maxlen when you need a bounded queue or a rolling window. When the deque is full, appending a new item automatically drops the oldest item from the opposite end. This is useful for rate limiters (track last N requests), sliding windows, and bounded task queues.

Naren · Founder & Author

Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.
