Node.js Interview Questions: Event Loop, Streams & Async Patterns Explained
Imagine a single incredibly fast waiter at a restaurant. Instead of standing next to one table waiting for food to cook, they take the order, drop the ticket in the kitchen, then go serve other tables. When the kitchen rings the bell, they come back and deliver. That's Node.js — one thread, never idle, always handling the next task while async work finishes in the background. The 'bell' system is the event loop, and understanding it is what separates candidates who get hired from candidates who get 'we'll be in touch'.
Node.js powers Uber's real-time dispatch, Netflix's streaming APIs, and LinkedIn's mobile backend — not because it's the fastest language on the planet, but because it handles tens of thousands of simultaneous connections without spawning a new thread for each one. That's a fundamentally different mental model from Java or PHP, and interviewers test whether you truly understand it or just read the docs the night before.
The core problem Node.js solves is the C10K problem — handling 10,000 concurrent connections cheaply. Traditional servers block a thread per connection. Node's non-blocking I/O model means one process can juggle thousands of network requests because it never sits around waiting — it delegates I/O to the OS and moves on. Understanding this isn't trivia; it changes how you architect every feature you build.
By the end of this article you'll be able to explain the event loop's phase sequence under pressure, describe when streams beat buffering, write cluster code that actually uses all CPU cores, and sidestep the async/await traps that trip up even experienced developers. These aren't memory-game answers — they're patterns you'll use on day one of the job.
The Heart of Node: Mastering the Event Loop
To master Node.js, you must stop thinking linearly. The Event Loop isn't just a 'loop'; it's a multi-phase cycle managed by libuv. When an interviewer asks 'What is the order of execution?', they are looking for specific phases: Timers, Pending Callbacks, Idle/Prepare, Poll, Check, and Close Callbacks.
A common senior-level trick question is the difference between setImmediate() and process.nextTick(). While setImmediate() is designed to execute in the 'Check' phase after the poll phase completes, process.nextTick() isn't technically part of the event loop at all—it fires immediately after the current operation completes, before the loop moves to the next phase. If you abuse nextTick, you can actually starve the event loop, preventing I/O from ever happening.
```javascript
// TheCodeForge — Event Loop Execution Order Demonstration
const fs = require('fs');

console.log('1. Script Start');

setTimeout(() => {
  console.log('2. setTimeout (Timer Phase)');
}, 0);

setImmediate(() => {
  console.log('3. setImmediate (Check Phase)');
});

fs.readFile(__filename, () => {
  console.log('4. File Read (Poll Phase Callback)');
  setTimeout(() => console.log('5. Nested setTimeout'), 0);
  setImmediate(() => console.log('6. Nested setImmediate'));
});

process.nextTick(() => {
  console.log('7. nextTick (Microtask - executes before next phase)');
});

console.log('8. Script End');
```
Typical output (note: the relative order of the top-level setTimeout and setImmediate is not guaranteed on the first loop iteration, but inside an I/O callback setImmediate always fires before setTimeout):

```
1. Script Start
8. Script End
7. nextTick (Microtask - executes before next phase)
2. setTimeout (Timer Phase)
3. setImmediate (Check Phase)
4. File Read (Poll Phase Callback)
6. Nested setImmediate
5. Nested setTimeout
```
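To see starvation in action, here is a minimal sketch (the counter limit is arbitrary): a process.nextTick() callback re-queues itself, and a 10ms timer cannot fire until the entire microtask queue has drained.

```javascript
// Illustrative sketch: recursive nextTick delays the Timer phase.
// The microtask queue is drained completely before the event loop
// advances, so the timer waits behind every queued nextTick.
let spins = 0;

function spin() {
  if (spins++ < 1e5) process.nextTick(spin); // re-queues itself
}

setTimeout(() => {
  console.log(`Timer fired only after ${spins} nextTick calls`);
}, 10);

spin();
```

With unbounded recursion, the timer (and all I/O) would never run at all; that is the starvation interviewers want you to name.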
Data on the Move: Why Streams are Non-Negotiable
Imagine trying to read a 4GB log file into a 2GB RAM instance. If you use fs.readFile(), your app crashes with an 'Out of Memory' error because it tries to buffer the entire file. This is where Streams come in.
Streams allow you to process data piece by piece (chunks). In a production environment, you should 'pipe' data directly from the source to the destination. This keeps the memory footprint tiny and constant, regardless of the file size. At TheCodeForge, we use streams for everything from file uploads to processing large SQL result sets.
```javascript
// TheCodeForge — Memory Efficient Stream Processing
const fs = require('fs');
const zlib = require('zlib');

const source = fs.createReadStream('./massive_log.txt');
const destination = fs.createWriteStream('./massive_log.txt.gz');

// Piping: Read chunk -> Compress chunk -> Write chunk
// This handles backpressure automatically!
source
  .pipe(zlib.createGzip())
  .pipe(destination)
  .on('finish', () => {
    console.log('Successfully compressed log without breaking the RAM bank. 🚀');
  });
```
Backpressure is the mechanism that slows a fast source down when a slow destination can't keep up. .pipe() manages this for you, but in custom implementations, you must check the return value of .write().

For quick revision, here's how the three scheduling primitives compare:

| Feature | process.nextTick() | setImmediate() | setTimeout(0) |
|---|---|---|---|
| Phase | Microtask (Immediately after current op) | Check Phase (After Poll) | Timer Phase (First phase of loop) |
| Priority | Highest - executes before loop continues | Executes after I/O callbacks | Executes after timer threshold expires |
| Risk | Can starve I/O if used in a loop | Safe for I/O bound operations | Unreliable precision (minimum ~1-4ms) |
🎯 Key Takeaways
- The Event Loop is what makes Node.js scalable; understand its phases to predict execution order.
- Streams prevent memory exhaustion by processing data in chunks rather than buffering.
- Node is single-threaded, but not single-process. Use the Cluster module for horizontal scaling on a single machine.
- Always use the 'io.thecodeforge' package mindset: clean, asynchronous, and memory-conscious.
⚠ Common Mistakes to Avoid
- Recursively calling process.nextTick(), which starves the event loop and blocks all I/O.
- Loading large files with fs.readFile() instead of streaming them, risking 'Out of Memory' crashes.
- Ignoring backpressure in custom stream code by never checking the return value of .write().
- Assuming setTimeout(fn, 0) fires exactly on time; its precision is roughly 1-4ms at best.
Interview Questions on This Topic
- Explain the 'Starvation' problem in the context of process.nextTick(). How would you diagnose it?
- What is the 'Error-First Callback' pattern, and why did it become the standard before Promises arrived?
- Given two files, how would you merge them into a third file using streams while ensuring you don't exceed 50MB of RSS memory?
- How does the V8 Garbage Collector interact with the Event Loop? What happens during a 'Stop-the-world' event?
- Implement a simple 'Rate Limiter' using only the native Node.js HTTP module and an object-based cache.
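For the error-first callback question above, a minimal illustration is usually enough (the divide function is hypothetical):

```javascript
// Error-first callback convention: callback(err, result).
// The first argument is reserved for an Error (or null on success),
// so every caller is forced to decide how failure is handled.
function divide(a, b, callback) {
  if (b === 0) {
    // Fail: pass an Error as the first argument, nothing else.
    return callback(new Error('Division by zero'));
  }
  // Succeed: first argument is null, the result follows.
  callback(null, a / b);
}

divide(10, 2, (err, result) => {
  if (err) return console.error('Failed:', err.message);
  console.log('Result:', result); // Result: 5
});
```

The convention became the standard because, before Promises, it was the only uniform way to propagate errors through asynchronous call chains.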
Frequently Asked Questions
Is Node.js truly single-threaded?
Node.js is 'event-loop' single-threaded, but it's not strictly single-threaded. libuv maintains a thread pool (four threads by default, configurable via the UV_THREADPOOL_SIZE environment variable) to handle heavy tasks like file I/O, DNS lookups, and crypto functions that would otherwise block the main loop.
When should you use Worker Threads instead of the Cluster module?
Use Cluster to scale web servers (share the same port, handle more connections). Use Worker Threads for CPU-intensive tasks (data processing, image manipulation) where you need to share memory between threads.
Why is 'require()' synchronous in Node.js?
Modules are typically loaded at startup. Making require() asynchronous would add unnecessary complexity to the dependency graph. Since it caches the result, subsequent calls are near-instant and don't block the loop during request handling.
Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.