Medium · Backend Engineer · Technology
How does the Node.js event loop handle concurrent connections, and what are the common patterns to avoid blocking it?
Posted 18/04/2026
by Mehedy Hasan Ador
Question Details
At a high-traffic startup interview:
> "Our Express API sometimes has 5-second response times under load. We discovered a route that does synchronous file reading for 10K users. How does this affect all other requests, and how would you fix it?"
Suggested Solution
Node.js Single-Threaded Event Loop
Node.js uses a single thread for JavaScript execution but handles I/O asynchronously via libuv (a C library with a thread pool for I/O operations).

┌──────────────────────────┐
│ Event Loop (1 thread) │
│ ┌─────────────────────┐ │
│ │ Timers │ │
│ │ Pending callbacks │ │
│ │ Idle, prepare │ │
│ │ Poll (I/O) │◄──── libuv thread pool (4 threads)
│ │ Check (setImmediate)│ │
│ │ Close callbacks │ │
│ └─────────────────────┘ │
└──────────────────────────┘
The Blocking Problem
// ❌ BLOCKS the event loop for ALL users
app.get("/users", (req, res) => {
const data = fs.readFileSync("users.json"); // Sync = blocks thread
const processed = heavySyncComputation(data); // CPU-bound = blocks thread
res.json(processed);
});
// While this runs, ALL other requests wait in the queue
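The effect is easy to reproduce without Express. A minimal sketch: a zero-delay timer cannot fire until a synchronous loop hands the thread back, just as every queued request must wait behind the sync route above:

```javascript
const start = Date.now();
let delay;

setTimeout(() => {
  delay = Date.now() - start;
  console.log(`0 ms timer actually fired after ${delay} ms`);
}, 0);

// synchronous busy-wait: stands in for readFileSync + heavy computation
while (Date.now() - start < 100) {}
// only now can the event loop reach the timers phase and run the callback
```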
Fixes
1. Use Async I/O
// ✅ Non-blocking I/O
app.get("/users", async (req, res) => {
const data = await fs.promises.readFile("users.json", "utf-8");
res.json(JSON.parse(data));
});
2. Offload CPU-heavy Work
// ✅ Worker threads for CPU-bound tasks
const { Worker } = require("worker_threads");
app.get("/process", (req, res) => {
const worker = new Worker("./heavy-task.js", {
workerData: { input: req.body }
});
worker.on("message", (result) => res.json(result));
worker.on("error", (err) => res.status(500).json({ error: err.message }));
});
3. Use Streams for Large Data
// ✅ Stream processing — constant memory
app.get("/export", (req, res) => {
const stream = fs.createReadStream("large-file.csv");
stream.pipe(csvParser()).pipe(res);
});
Performance Comparison
- readFileSync — blocks the event loop; loads the whole file into memory
- readFile (async) — non-blocking, but still loads the whole file into memory
- createReadStream — non-blocking; constant memory regardless of file size
The libuv Thread Pool
Default: 4 threads for file I/O, DNS lookups, etc. If all 4 are busy, further I/O queues up. You can increase the thread pool size (max 128) via an environment variable:
UV_THREADPOOL_SIZE=8 node server.js
Rule: Never use synchronous operations in request handlers. The event loop serves ALL users — blocking it blocks everyone.