JavaScript Event Loop Explained: Internals, Call Stack, and Task Queues
JavaScript runs in a single thread. That sentence alone has confused, burned, and humbled more developers than almost any other fact in the language. If you've ever wondered why a long-running loop freezes your entire UI, why a setTimeout with 0ms delay doesn't run immediately, or why some async code resolves before other async code even though both were kicked off at the same time — the event loop is the answer to every single one of those questions. It's not optional knowledge. It's the engine underneath everything you write.
The event loop exists to solve a fundamental paradox: the browser needs to do I/O, render frames, handle user clicks, and run your code — all seemingly at once — but it only has one JavaScript thread to work with. Without a structured mechanism for ordering tasks, you'd either block the UI while waiting for a network request, or you'd have no reliable way to reason about when your callbacks fire. The event loop is that mechanism — a precise, spec-defined algorithm that decides what runs next and in what order.
By the end of this article you'll be able to look at any async JavaScript code and predict the exact output order before running it. You'll understand why Promises resolve before setTimeout callbacks even when both are already settled, why queueMicrotask is a sharp tool that can starve your render pipeline, and how to write async-heavy production code without accidentally creating jank or hard-to-reproduce race conditions. Let's go deep.
The Call Stack, Web APIs, and the Two Task Queues You Must Know
The event loop isn't a single data structure — it's a coordination algorithm across four distinct components. Nail these four and everything else falls into place.
Call Stack: A LIFO stack where synchronous execution frames live. When you call a function, a frame is pushed. When it returns, it's popped. The engine can only execute what's on top of this stack. If it's not empty, nothing else runs. This is why an infinite synchronous loop freezes your tab — the stack never empties.
Web APIs (or Node.js C++ bindings): When you call setTimeout, fetch, or add an event listener, the actual waiting work is handed off to the browser's C++ runtime — completely outside the JS engine. Your JS thread is free immediately. These environments call back into JS when their work is done.
Macrotask Queue (Task Queue): This is where callbacks from setTimeout, setInterval, setImmediate (Node), I/O callbacks, and UI events land when they're ready. The event loop picks one macrotask per loop iteration.
Microtask Queue (Job Queue): This is where Promise .then/.catch/.finally callbacks and queueMicrotask() callbacks go. After every single task — and after every individual microtask — the engine drains the entire microtask queue before touching the macrotask queue again. This is the crucial difference most developers miss.
The event loop algorithm itself is: (1) Dequeue and run one macrotask. (2) Drain every microtask in the microtask queue, including any newly added during drainage. (3) Render if needed (browser). (4) Go back to step 1.
```javascript
// ─── Demonstrating the exact execution order of the event loop ───

console.log('1. Script start — synchronous, runs immediately on call stack');

setTimeout(() => {
  // This callback is a MACROTASK — handed to the Web API timer,
  // queued in the macrotask queue once the delay expires.
  // Even with 0ms, it CANNOT run until the call stack is clear
  // AND all current microtasks are drained.
  console.log('4. setTimeout callback — macrotask, runs last');
}, 0);

Promise.resolve('resolved value').then((value) => {
  // .then() schedules a MICROTASK.
  // Microtasks run after the current synchronous block finishes
  // but BEFORE the next macrotask is dequeued.
  console.log('3. Promise .then() — microtask, runs before setTimeout');
});

queueMicrotask(() => {
  // queueMicrotask is the raw microtask API — same queue as Promises.
  // Runs in FIFO order within the microtask queue.
  console.log('3a. queueMicrotask — also a microtask, after promise .then');
});

console.log('2. Script end — still synchronous, still on call stack');

// ─── What actually happens step by step ───
// Step 1: Call stack runs synchronously: logs 1, queues setTimeout
//         callback as macrotask, queues .then as microtask,
//         queues queueMicrotask callback, logs 2.
// Step 2: Call stack is now empty. Event loop checks microtask queue.
//         Drains it completely: logs 3, then 3a.
// Step 3: Microtask queue empty. Event loop dequeues one macrotask.
//         Runs setTimeout callback: logs 4.
```
```
1. Script start — synchronous, runs immediately on call stack
2. Script end — still synchronous, still on call stack
3. Promise .then() — microtask, runs before setTimeout
3a. queueMicrotask — also a microtask, after promise .then
4. setTimeout callback — macrotask, runs last
```
Async/Await Under the Hood — It's Just Promise Microtasks in Disguise
When developers learn async/await, they often mentally model it as 'synchronous-looking async code' and stop there. That's incomplete. Every await expression is syntactic sugar over a Promise .then() — which means every suspension point is a microtask boundary. Understanding this is the difference between writing predictable async code and shipping subtle ordering bugs.
When the JS engine hits an await, it pauses execution of that async function, wraps the rest of the function body as a microtask callback, and returns control to the caller. The caller continues running synchronously. Only after the call stack clears and the awaited Promise resolves does the rest of the async function resume — as a microtask.
There's an important historical note here. Prior to V8 v7.2 (Node 12), await introduced two extra microtask ticks compared to an equivalent hand-written Promise chain. This was a spec interpretation issue. The spec was updated and modern engines now resolve await in one tick for already-resolved Promises. This matters if you're debugging ordering issues on older Node versions.
The takeaway for production code: don't assume that just because two await expressions start at the 'same time' they'll finish interleaved. Each await is a checkpoint where other microtasks — and potentially macrotasks and renders — can slip in. This is especially relevant in high-frequency event handlers like mousemove or WebSocket onmessage.
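One practical defense against those interleaving bugs is a sequence token: every invocation grabs a fresh id before its first `await`, and only the most recent invocation is allowed to act on its result. A minimal sketch, assuming a hypothetical `fetchSuggestions` lookup and `render` callback (both are stand-ins, not real APIs):

```javascript
// Sketch: guarding a high-frequency handler against out-of-order
// async results. Each call claims an id synchronously, BEFORE the
// first await; after resuming, it checks whether it is still the
// newest call. fetchSuggestions/render are hypothetical.
let latestRequestId = 0;

async function onSearchInput(query, fetchSuggestions, render) {
  const requestId = ++latestRequestId;          // claimed synchronously
  const suggestions = await fetchSuggestions(query); // microtask boundary

  // Another call may have started while we were suspended.
  // Only the most recent request may touch the UI.
  if (requestId !== latestRequestId) return;
  render(suggestions);
}
```

The key design point is that the id is claimed before the microtask boundary, so the check after `await` reliably detects any newer call that started in the meantime.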
```javascript
// ─── Proving async/await is microtasks all the way down ───

async function fetchUserProfile() {
  console.log('B. fetchUserProfile started — before first await');

  // This await suspends fetchUserProfile and returns control to the
  // caller. The rest of the function body is scheduled as a microtask
  // once this Promise resolves.
  const userId = await Promise.resolve(42);

  // Everything after an await runs as a microtask callback.
  console.log(`D. fetchUserProfile resumed — userId is ${userId}`);

  const userRecord = await Promise.resolve({ id: userId, name: 'Amara' });

  // Each await is a new microtask boundary — control could theoretically
  // yield to other queued microtasks between each one.
  console.log(`F. fetchUserProfile resolved — user: ${userRecord.name}`);
  return userRecord;
}

console.log('A. Synchronous code before calling fetchUserProfile');

// Calling an async function runs it synchronously up to the first await,
// then returns a pending Promise. Execution continues here.
const profilePromise = fetchUserProfile();

console.log('C. Synchronous code AFTER calling fetchUserProfile');

// At this point the call stack finishes its synchronous work.
// Microtask queue drains: fetchUserProfile resumes at 'D',
// then hits the second await, then resumes at 'F'.
profilePromise.then((resolvedUser) => {
  // This .then fires AFTER all awaits inside fetchUserProfile complete.
  console.log(`G. profilePromise .then — final user: ${resolvedUser.name}`);
});

console.log('E. This runs before G because profilePromise is still pending here');

// ─── Key insight ───
// Notice 'E' logs before 'F' even though E is written after the .then.
// Why? Because when we reach console.log('E'), fetchUserProfile is still
// suspended at its FIRST await. 'E' runs synchronously now.
// Only when the call stack clears does the microtask queue drain,
// resuming fetchUserProfile ('D', then 'F'), then firing the .then ('G').
```
```
A. Synchronous code before calling fetchUserProfile
B. fetchUserProfile started — before first await
C. Synchronous code AFTER calling fetchUserProfile
E. This runs before G because profilePromise is still pending here
D. fetchUserProfile resumed — userId is 42
F. fetchUserProfile resolved — user: Amara
G. profilePromise .then — final user: Amara
```
The Browser Rendering Pipeline and Where It Fits in the Loop
The browser's event loop has one more player that Node.js doesn't: the rendering pipeline. Between macrotask iterations, the browser MAY run a render step — recalculate styles, do layout, paint, and composite. 'May' is important: the browser renders at roughly 60fps (one frame every ~16.6ms), so it only renders when a frame is due and when the call stack is clear.
This means: if your macrotask takes longer than 16.6ms, you drop a frame. The user sees jank. This is the root cause of most UI performance bugs, and understanding the event loop makes the fix obvious: break your long task into chunks using setTimeout(chunk, 0) or the newer scheduler.postTask() API to yield between chunks and let the browser render.
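That chunk-and-yield idea reads naturally with async/await. A sketch that prefers `scheduler.postTask` where the browser supports it and falls back to `setTimeout` elsewhere; the `sumInChunks` helper and the chunk size are illustrative choices, not a canonical implementation:

```javascript
// Sketch: yielding to the event loop between chunks of CPU work.
// scheduler.postTask is a newer browser API; the feature check lets
// this also run in environments without it (e.g. Node).
const yieldToEventLoop = () =>
  typeof scheduler !== 'undefined' && scheduler.postTask
    ? scheduler.postTask(() => {}, { priority: 'user-visible' })
    : new Promise((resolve) => setTimeout(resolve, 0));

async function sumInChunks(numbers, chunkSize = 1000) {
  let total = 0;
  for (let i = 0; i < numbers.length; i += chunkSize) {
    // One chunk runs synchronously...
    for (let j = i; j < Math.min(i + chunkSize, numbers.length); j++) {
      total += numbers[j];
    }
    // ...then we yield, so the browser can run a render step
    // (and other tasks) before the next chunk.
    await yieldToEventLoop();
  }
  return total;
}
```

Each `await yieldToEventLoop()` ends the current task, giving the browser a window to paint before the next chunk is scheduled.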
There's also requestAnimationFrame (rAF). Its callbacks are NOT regular macrotasks. They're scheduled to run right before the browser's next render step, after the macrotask and after microtasks drain but before paint. This makes rAF the correct tool for any JS that reads or writes to the DOM in sync with the display refresh rate. Using setTimeout for animations is a classic mistake — it doesn't align with the display refresh cycle and causes inconsistent frame timing.
The requestIdleCallback API sits at the other end: it fires during browser idle time — when there are no pending tasks and no frame is due. Use it for low-priority background work like analytics or prefetching.
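A minimal sketch of that idle-time pattern, assuming a made-up analytics queue. The `scheduleIdle` shim exists only so the example also runs outside a browser, where `requestIdleCallback` is unavailable:

```javascript
// Sketch: draining low-priority work during browser idle periods.
// In a real page you would call requestIdleCallback directly; the
// fallback below fakes a generous deadline for non-browser runtimes.
const scheduleIdle =
  typeof requestIdleCallback === 'function'
    ? requestIdleCallback
    : (cb) => setTimeout(() => cb({ timeRemaining: () => 50, didTimeout: false }), 0);

const pendingAnalyticsEvents = ['page_view', 'scroll_depth', 'cta_click']; // made-up data
const sentEvents = [];

function flushAnalyticsWhenIdle(deadline) {
  // Keep working only while the browser reports spare time in this frame.
  while (deadline.timeRemaining() > 0 && pendingAnalyticsEvents.length > 0) {
    sentEvents.push(pendingAnalyticsEvents.shift()); // stand-in for a real send
  }
  if (pendingAnalyticsEvents.length > 0) {
    scheduleIdle(flushAnalyticsWhenIdle); // more left: wait for the next idle period
  }
}

scheduleIdle(flushAnalyticsWhenIdle);
```

The important habit is checking `deadline.timeRemaining()` inside the loop: idle callbacks that ignore the deadline are just another way to block the frame.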
```javascript
// ─── Demonstrating render pipeline interaction with the event loop ───

// BAD PATTERN: Blocking the main thread — drops frames
function processLargeDatasetBlocking(dataItems) {
  // If dataItems has 100,000 entries this runs for potentially
  // hundreds of milliseconds — no render step can fire during this.
  // The UI is completely frozen.
  const processedResults = [];
  for (const item of dataItems) {
    processedResults.push(item * 2); // Simulating CPU work
  }
  return processedResults;
}

// GOOD PATTERN: Chunked processing — yields to the event loop
function processLargeDatasetInChunks(dataItems, chunkSize = 1000) {
  return new Promise((resolve) => {
    const processedResults = [];
    let currentIndex = 0;

    function processNextChunk() {
      const endIndex = Math.min(currentIndex + chunkSize, dataItems.length);
      // Process one chunk synchronously
      for (let i = currentIndex; i < endIndex; i++) {
        processedResults.push(dataItems[i] * 2);
      }
      currentIndex = endIndex;

      if (currentIndex < dataItems.length) {
        // setTimeout(fn, 0) enqueues a macrotask — the browser CAN
        // run a render step before picking it up. This is how we
        // voluntarily yield control without blocking the UI.
        setTimeout(processNextChunk, 0);
      } else {
        resolve(processedResults);
      }
    }

    processNextChunk();
  });
}

// CORRECT PATTERN: DOM animation — use rAF, not setTimeout
function animateProgressBar(progressBarElement, targetWidth) {
  let currentWidth = 0;

  function animationStep(timestamp) {
    // requestAnimationFrame aligns execution with the display
    // refresh cycle. The timestamp is a DOMHighResTimeStamp.
    currentWidth = Math.min(currentWidth + 2, targetWidth);
    // Batching DOM writes inside rAF avoids forced synchronous
    // layouts (layout thrashing).
    progressBarElement.style.width = `${currentWidth}px`;
    if (currentWidth < targetWidth) {
      // Re-register for the NEXT frame, not a fixed timeout.
      requestAnimationFrame(animationStep);
    }
  }

  requestAnimationFrame(animationStep);
}

// Usage demonstrating the difference
const largeDataset = Array.from({ length: 50000 }, (_, index) => index);

console.log('Starting chunked processing — UI stays responsive');

processLargeDatasetInChunks(largeDataset, 5000).then((results) => {
  console.log(`Processing complete — first result: ${results[0]}, last result: ${results[results.length - 1]}`);
  console.log(`Total items processed: ${results.length}`);
});

console.log('This logs immediately — chunked work is non-blocking');
```
```
Starting chunked processing — UI stays responsive
This logs immediately — chunked work is non-blocking
Processing complete — first result: 0, last result: 99998
Total items processed: 50000
```
Node.js Event Loop Phases — It's Not the Same Beast as the Browser
Node.js uses libuv under the hood, which implements its own event loop with distinct phases. Knowing these phases is critical for backend developers — the ordering rules are different from the browser, and process.nextTick behaves in a way that surprises almost everyone the first time.
The Node.js event loop has six phases executed in order: Timers (setTimeout/setInterval callbacks), Pending Callbacks (I/O callbacks deferred from the previous iteration), Idle/Prepare (internal use), Poll (new I/O events; blocks here if queue is empty and no timers are pending), Check (setImmediate callbacks), and Close Callbacks (socket close events).
Here's the critical production gotcha: process.nextTick() callbacks don't run in any of those six phases. They run in a 'nextTick queue' that's drained between every phase transition and after every C++ to JS boundary crossing. This makes process.nextTick even higher priority than Promises — it fires before .then microtasks. Abusing process.nextTick recursively will starve your I/O callbacks and can cause your server to stop responding to new requests.
setImmediate vs setTimeout(fn, 0) in Node: inside an I/O callback, setImmediate always fires before setTimeout. Outside an I/O callback, the order is non-deterministic and depends on process performance. Never rely on their relative ordering unless you're inside an I/O callback.
```javascript
// ─── Node.js event loop phase ordering demonstration ───
// Run this with: node NodeEventLoopPhases.js
const { readFile } = require('fs');

// process.nextTick fires between phase transitions —
// BEFORE Promise microtasks, BEFORE setImmediate, BEFORE setTimeout.
process.nextTick(() => {
  console.log('2. process.nextTick — highest priority async callback');
});

// Promise .then fires in the microtask queue —
// after the nextTick queue drains, before the event loop moves phases.
Promise.resolve().then(() => {
  console.log('3. Promise .then — microtask, after nextTick');
});

// setTimeout with 0ms delay — goes to the Timers phase.
// In Node, 0ms is clamped to 1ms minimum by libuv.
setTimeout(() => {
  console.log('5. setTimeout — Timers phase (non-deterministic vs setImmediate outside I/O)');
}, 0);

// setImmediate — runs in the Check phase, AFTER the Poll phase.
setImmediate(() => {
  console.log('5-alt. setImmediate — Check phase (non-deterministic vs setTimeout outside I/O)');
});

console.log('1. Synchronous code — runs first, always');

// ─── Inside an I/O callback, ordering is DETERMINISTIC ───
readFile(__filename, () => {
  // Inside an I/O callback we're in the Poll phase.
  // setImmediate (Check phase) ALWAYS fires before setTimeout
  // (next Timers phase) when scheduled from within I/O.
  setTimeout(() => {
    console.log('8. setTimeout inside I/O — Timers phase of NEXT loop iteration');
  }, 0);

  setImmediate(() => {
    console.log('7. setImmediate inside I/O — ALWAYS before setTimeout here');
  });

  process.nextTick(() => {
    console.log('6. process.nextTick inside I/O — fires before both, between phases');
  });

  console.log('4. Synchronous code inside I/O callback');
});

// ─── Dangerous: recursive nextTick starves I/O ───
// DON'T DO THIS in production — it prevents the event loop
// from ever advancing past the nextTick queue.
/*
function dangerousRecursiveNextTick() {
  process.nextTick(dangerousRecursiveNextTick); // Starves all I/O!
}
dangerousRecursiveNextTick();
*/
```
```
1. Synchronous code — runs first, always
2. process.nextTick — highest priority async callback
3. Promise .then — microtask, after nextTick
5. setTimeout — Timers phase (non-deterministic vs setImmediate outside I/O)
5-alt. setImmediate — Check phase (non-deterministic vs setTimeout outside I/O)
4. Synchronous code inside I/O callback
6. process.nextTick inside I/O — fires before both, between phases
7. setImmediate inside I/O — ALWAYS before setTimeout here
8. setTimeout inside I/O — Timers phase of NEXT loop iteration
```
| Feature / Aspect | Microtask Queue (Promises, queueMicrotask) | Macrotask Queue (setTimeout, setInterval, I/O) |
|---|---|---|
| Drain timing | After every task AND after every microtask — fully drained each time | One task dequeued per event loop iteration |
| Examples | Promise .then/.catch/.finally, queueMicrotask, async/await resumption | setTimeout, setInterval, setImmediate, I/O callbacks, UI events |
| Priority vs rendering | Runs BEFORE the browser render step | Browser MAY render between macrotasks |
| Starvation risk | HIGH — recursive microtasks block rendering and macrotasks indefinitely | LOW — one task per iteration, loop always advances |
| process.nextTick (Node) | Fires before Promise microtasks, between every phase | N/A — nextTick is its own separate queue in Node |
| Use case | Chaining async logic that must complete atomically before yielding | Deferring work to allow rendering, chunking CPU tasks, timers |
| Cancellable? | No — once queued, a microtask always runs | Yes — clearTimeout / clearInterval / AbortController |
| Spec location | WHATWG HTML spec — 'perform a microtask checkpoint' | WHATWG HTML spec — 'task queue' and 'event loop processing model' |
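The starvation row in the table above can be demonstrated directly. This sketch chains 100,000 microtasks; a macrotask scheduled before any of them still runs dead last, because the microtask queue must be empty before the loop advances:

```javascript
// Sketch: microtask starvation. Each microtask queues another one,
// and ALL of them run before the single pending macrotask.
// In a browser, no render step could fire during this stretch either.
let microtaskCount = 0;

setTimeout(() => {
  // This macrotask cannot run until the microtask queue is empty.
  console.log(`macrotask finally ran after ${microtaskCount} microtasks`);
}, 0);

function queueAnotherMicrotask() {
  microtaskCount++;
  if (microtaskCount < 100000) {
    queueMicrotask(queueAnotherMicrotask); // re-queues itself 100,000 times
  }
}

queueMicrotask(queueAnotherMicrotask);
```

Capping the recursion at a fixed count is what keeps this demo safe; an unconditional re-queue would hang the tab exactly as the table warns.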
🎯 Key Takeaways
- The microtask queue (Promises, `queueMicrotask`) is fully drained after every single task AND after every microtask — this means a queued macrotask can never 'sneak in' between chained `.then()` calls.
- Every `await` is a microtask boundary where JavaScript returns control to the caller — calling an async function always runs synchronously up to the first `await`, and the caller continues before the awaited value resolves.
- In the browser, long synchronous tasks and unchecked microtask chains both block the rendering pipeline — use `setTimeout(fn, 0)` or `requestAnimationFrame` to yield to rendering; use `requestIdleCallback` for truly background work.
- In Node.js, `process.nextTick` is not in the six libuv event loop phases — it has its own queue that drains between every phase transition and fires before Promise microtasks; abusing it recursively starves I/O and can bring a server to a halt.
⚠ Common Mistakes to Avoid
- ✕ Mistake 1: Assuming `setTimeout(fn, 0)` runs 'immediately after current code'. Symptom: developers use it to defer work and are surprised when Promise callbacks always jump ahead of it, causing unexpected ordering in UI updates or tests. Fix: understand that `setTimeout` creates a macrotask, and any pending microtasks (including all queued Promise callbacks) will fully drain before it fires. If you need to defer until after Promises, chain another `.then`. If you explicitly want to yield to rendering, `setTimeout(fn, 0)` is correct — that's actually its most legitimate use case.
- ✕ Mistake 2: Using async/await inside `Array.prototype.forEach` and expecting sequential execution. Symptom: code like `items.forEach(async (item) => { await processItem(item); })` runs all iterations concurrently (not sequentially), and the `forEach` call itself returns before any item finishes processing. Fix: use `for...of` with `await` for sequential processing, or `Promise.all(items.map((item) => processItem(item)))` for controlled concurrency. `forEach` does not await the Promise returned by its callback.
- ✕ Mistake 3: Blocking the event loop with a synchronous CPU-heavy operation in a Node.js server. Symptom: one expensive request (e.g., parsing a large JSON payload, running a crypto operation in pure JS, or a deep recursive algorithm) causes all other concurrent requests to hang and time out. Fix: offload CPU-bound work to Worker Threads (the `worker_threads` module, Node 10.5+), use the `crypto` module's async APIs instead of the sync variants (e.g., `crypto.pbkdf2`, not `crypto.pbkdf2Sync`), or break work into chunks with `setImmediate` between chunks. Never do blocking work on the main thread in a server.
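The `forEach` pitfall from Mistake 2 is easy to show concretely. A sketch with a made-up `processItem` whose delays are deliberately reversed (later items finish faster), so a concurrent version would complete out of order while `for...of` stays sequential:

```javascript
// Sketch: sequential async iteration with for...of.
// processItem is a stand-in async step; the reversed delays mean
// items.forEach(async ...) would record [3, 2, 1] instead.
const processedOrder = [];

const processItem = (item) =>
  new Promise((resolve) =>
    setTimeout(() => {
      processedOrder.push(item);
      resolve(item);
    }, 10 - item) // item 1 waits 9ms, item 3 only 7ms
  );

async function processSequentially(items) {
  for (const item of items) {
    await processItem(item); // each iteration genuinely waits
  }
}

processSequentially([1, 2, 3]).then(() => {
  console.log(processedOrder); // sequential despite the reversed delays
});
```

Swap the `for...of` loop for `items.forEach(async (item) => { await processItem(item); })` and all three timers start at once, which is exactly the bug described above.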
Interview Questions on This Topic
- Q: Can you explain why `console.log` inside a Promise `.then()` fires before `console.log` inside a `setTimeout(fn, 0)` callback, even when the Promise was created after the setTimeout? Walk me through the event loop mechanics step by step.
- Q: In a Node.js HTTP server, a request handler does `await someHeavyComputation()` where `someHeavyComputation` is a CPU-bound function that takes 2 seconds and returns a resolved Promise. Will other incoming requests be handled during those 2 seconds? Why or why not?
- Q: What is the difference between `process.nextTick`, `Promise.resolve().then`, and `setImmediate` in Node.js, and in what order do they execute? Can you construct a code example where `setImmediate` fires before `setTimeout(fn, 0)` and explain exactly why?
Frequently Asked Questions
Why does my Promise callback run before my setTimeout even though I set the timeout to 0 milliseconds?
Because they use different queues with different priority levels. Your Promise callback lands in the microtask queue, which is fully drained after the current synchronous task completes. Your setTimeout callback lands in the macrotask queue, which is only picked up one-at-a-time on each new event loop iteration. So no matter how short your timeout is, all pending microtasks always run first.
Can the JavaScript event loop really handle thousands of concurrent requests in Node.js if it's single-threaded?
Yes, because 'concurrent' and 'parallel' are different things. Node.js handles concurrency by never blocking the thread — I/O operations (database queries, file reads, HTTP calls) are handed to libuv's thread pool or the OS's async I/O facilities. The JS thread is freed immediately and only resumes when the data is ready. As long as your JS code itself is non-blocking, one thread can efficiently juggle thousands of in-flight I/O operations. The problem only arises with CPU-bound work that keeps the thread busy synchronously.
What's the difference between the event loop in a browser and in Node.js?
The browser event loop is defined by the WHATWG HTML specification and includes a render pipeline step — the browser can repaint between macrotask iterations. Node.js uses libuv, which defines six named phases (Timers, Pending Callbacks, Idle/Prepare, Poll, Check, Close Callbacks) and adds process.nextTick as a special between-phases queue that doesn't exist in browsers. setImmediate is Node.js-only (Check phase), and browser-specific APIs like requestAnimationFrame and requestIdleCallback don't exist in Node.
Written and reviewed by senior developers with real-world experience across enterprise, startup and open-source projects. Every article on TheCodeForge is written to be clear, accurate and genuinely useful — not just SEO filler.