
Synchronous File I/O in Node.js: How AI Code Blocks Your Event Loop

AI generates readFileSync and writeFileSync in async contexts. Learn why this freezes your server and how to switch to async I/O.

PERF-002

Quick Answer

Node.js runs on a single-threaded event loop. When AI-generated code calls fs.readFileSync() or fs.writeFileSync(), the entire server freezes - every other request waits until the file operation finishes. Replacing sync I/O with async equivalents (fs.promises.readFile() / fs.promises.writeFile()) allows the event loop to handle other requests while I/O is in progress.

Why Synchronous I/O Kills Node.js Performance

Node.js is fundamentally different from thread-per-request runtimes like Java. It uses a single event loop to handle all incoming requests. While that loop is executing JavaScript, it cannot process any other events - HTTP requests, database callbacks, timers, or anything else. This is called blocking the event loop.

Asynchronous I/O operations like fs.readFile() are non-blocking: Node.js sends the read request to the operating system, registers a callback, and immediately returns to the event loop to handle other work. The OS notifies Node.js when the read completes, and the callback runs.

Synchronous I/O like fs.readFileSync() does the opposite: it tells Node.js to stop everything, wait for the file read to complete, then continue. During this wait - which might be 1ms for a small config file or 500ms for a large log file - your server handles zero other requests.

Under concurrent load, even a single 100ms synchronous file read per request can collapse throughput: while one request blocks the loop, every other connection queues behind it. The impact is non-linear - a brief block cascades into a backlog of waiting connections, each holding open a socket and memory.
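The blocking effect is easy to observe directly. The sketch below (file path and size are arbitrary assumptions; exact timings vary by machine and disk cache) schedules a 0ms timer and then performs a synchronous read - the timer cannot fire until the read returns:

```typescript
// Minimal, self-contained demo of event-loop blocking.
import { writeFileSync, readFileSync } from 'fs';

const demoPath = '/tmp/blocking-demo.bin';
writeFileSync(demoPath, Buffer.alloc(20 * 1024 * 1024)); // ~20MB test file

let timerDelay = -1;
const scheduled = Date.now();
// A 0ms timer normally fires almost immediately...
setTimeout(() => {
  timerDelay = Date.now() - scheduled;
  console.log(`0ms timer actually fired after ${timerDelay}ms`);
}, 0);

// ...but a sync read blocks the loop, so the timer (and every pending
// request) must wait until the read returns.
const t0 = Date.now();
readFileSync(demoPath);
const syncMs = Date.now() - t0;
```

In a server, that timer stands in for every other queued request: none of them make progress until the synchronous call completes.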

How AI Tools Generate Sync I/O

AI code generators produce synchronous file operations in several common contexts. The simplest is reading a configuration file. Ask Cursor or Bolt to "read a JSON config file at startup" and you often get:

// ❌ BAD - Blocks the event loop on every request (PERF-002)
import fs from 'fs';
import path from 'path';

export function getAppConfig() {
  // readFileSync blocks the entire server while reading
  const configPath = path.join(process.cwd(), 'config.json');
  const raw = fs.readFileSync(configPath, 'utf-8');
  return JSON.parse(raw);
}

// In an Express route or Next.js API handler:
export async function GET(req: Request) {
  const config = getAppConfig(); // Called on every request - blocks every time
  return Response.json(config);
}

There are two problems here. First, readFileSync blocks the event loop. Second, the config is read from disk on every request rather than being cached at startup. Even if you fix the sync issue, re-reading a file on every request is wasteful.

// ✅ GOOD - Async I/O + startup caching (PERF-002 resolved)
import { readFile } from 'fs/promises';
import path from 'path';

interface AppConfig {
  [key: string]: unknown; // shape depends on your config.json
}

let cachedConfig: AppConfig | null = null;

async function loadConfig(): Promise<AppConfig> {
  if (cachedConfig) return cachedConfig;
  const configPath = path.join(process.cwd(), 'config.json');
  const raw = await readFile(configPath, 'utf-8'); // non-blocking
  cachedConfig = JSON.parse(raw);
  return cachedConfig;
}

export async function GET(req: Request) {
  const config = await loadConfig(); // async - event loop stays free
  return Response.json(config);
}

The async version imports from fs/promises (the fs.promises API shipped in Node.js 10; the 'fs/promises' import specifier was added in Node.js 14) and caches the result after the first load. The event loop is free to handle other requests while the file is being read.

Common Sync I/O Patterns in AI-Generated Code

  - fs.readFileSync(path) → await fs.promises.readFile(path) (most common: config files, templates)
  - fs.writeFileSync(path, data) → await fs.promises.writeFile(path, data) (log files, generated output, uploads)
  - fs.existsSync(path) → await fs.promises.access(path) (file existence check before read)
  - fs.mkdirSync(path) → await fs.promises.mkdir(path, { recursive: true }) (creating upload or temp directories)
  - fs.readdirSync(path) → await fs.promises.readdir(path) (listing directory contents)
  - fs.statSync(path) → await fs.promises.stat(path) (file size / metadata checks)

Every synchronous fs.*Sync() method has a direct async equivalent in fs.promises.*. The migration is mechanical: add await, use the fs.promises namespace, and make the containing function async.

When Sync I/O Is Acceptable

There are a small number of legitimate use cases for synchronous file I/O in Node.js:

  1. Module initialization at startup: Code that runs once when the process starts (not per-request) can use sync I/O. require('./config.json') itself is synchronous. Reading a file synchronously in your top-level module initialization is generally fine.
  2. CLI tools: If your code is a command-line tool that processes one task and exits (not a long-running server), synchronous I/O is simpler and the blocking is not a problem.
  3. Test setup: Reading fixtures or test data synchronously in test setup code is acceptable since tests typically run sequentially.
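For case 1, a minimal sketch of the acceptable pattern - one synchronous read at module load, before any traffic. (The demo file write at the top is only there to make the sketch self-contained; in a real app the config file would already exist on disk.)

```typescript
import { readFileSync, writeFileSync } from 'fs';

// Demo setup so this sketch is runnable as-is
const configPath = '/tmp/startup-config-demo.json';
writeFileSync(configPath, '{"port":3000}');

// Acceptable sync read: executes exactly once, when the module is first
// imported, before the server starts accepting requests.
const config = JSON.parse(readFileSync(configPath, 'utf-8'));

export function getConfig() {
  return config; // per-request access is a plain memory read
}
```

The blocking cost is paid once at startup; request handlers only ever touch the in-memory object.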

The critical rule is: never use synchronous I/O in request handlers, route functions, middleware, or any code that runs while the server is serving traffic. If the sync call is inside an Express handler, a Next.js API route, a Supabase Edge Function, or a Vercel serverless function, it will block the event loop under load.

How to Find Sync I/O in Your Codebase

Sync I/O is easy to audit with a text search. Look for these function calls outside of module-level initialization code: readFileSync, writeFileSync, existsSync, mkdirSync, readdirSync, statSync, and any other fs.*Sync() method.

For each match, determine whether it is inside a request handler. If it is, replace it with the async equivalent. Tools like VibeDoctor (vibedoctor.io) automatically scan your codebase for synchronous file I/O in async contexts and flag specific file paths and line numbers. Free to sign up.

Sync I/O in request handlers shows up regularly in Express and Next.js applications, and it is a common cause of "mystery slowdowns" - servers whose response times degrade under moderate concurrent load but appear fine during single-user testing.

Observing the Impact with a Benchmark

The performance difference between sync and async I/O is measurable under concurrency. A simple benchmark reading a 50KB JSON file:

  - readFileSync: ~5ms at concurrency 1 → ~50ms at concurrency 10 (queued) → ~250ms at concurrency 50 (queued)
  - fs.promises.readFile: ~5ms at concurrency 1 → ~6ms at concurrency 10 (parallel) → ~8ms at concurrency 50 (parallel)
  - Cached (async, read once): <0.1ms at every concurrency level

At concurrency 1 (single user), you cannot tell the difference. At concurrency 50 (modest production load), synchronous reads are 30x slower because each request queues behind all others. This explains why vibe-coded apps often feel fast during development but slow down immediately when real users arrive simultaneously.
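The shape of that comparison (though not the exact numbers, which depend heavily on hardware and the OS page cache) can be reproduced with a small script. File path and size are arbitrary assumptions:

```typescript
import { writeFileSync, readFileSync } from 'fs';
import { readFile } from 'fs/promises';

const benchFile = '/tmp/bench-50kb.json';
writeFileSync(benchFile, JSON.stringify({ data: 'x'.repeat(50 * 1024) }));

async function bench(n = 50) {
  // Sync: "concurrent" reads serialize, because each one blocks the loop
  let t = Date.now();
  for (let i = 0; i < n; i++) readFileSync(benchFile, 'utf-8');
  const syncMs = Date.now() - t;

  // Async: all n reads are in flight together on libuv's thread pool
  t = Date.now();
  await Promise.all(Array.from({ length: n }, () => readFile(benchFile, 'utf-8')));
  const asyncMs = Date.now() - t;

  return { syncMs, asyncMs };
}

bench().then((r) => console.log(r)); // numbers vary; the gap grows with n and file size
```

The loop of sync reads is the serialized column of the table; the Promise.all batch is the parallel column.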

FAQ

Does this apply to Vercel and serverless functions?

Yes and no. A serverless instance handles one request at a time, so sync I/O does not cause the same cascading queue as in a long-running Node.js server. However, sync I/O still increases your function's execution time (and therefore cost), and concurrent invocations force the platform to spin up additional instances - each paying a cold start - because the in-flight instances are busy blocking. Async I/O is still the correct pattern.

Is require() synchronous? Should I use import() instead?

require() is synchronous, but it is only called at module initialization time (when the process starts), not per-request. This is acceptable. Dynamic require() inside a request handler is a problem - use dynamic import() (which returns a Promise) if you need to load a module conditionally at request time.
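A sketch of conditional loading at request time with dynamic import(). Here node:crypto stands in for whatever optional dependency you might load, and the function name is illustrative:

```typescript
// Loading a module conditionally at request time without blocking:
// dynamic import() returns a promise, so the event loop stays free
// while the module is resolved and evaluated.
export async function handleExport(format: string): Promise<string> {
  if (format === 'hash') {
    const { createHash } = await import('node:crypto');
    return createHash('sha256').update('payload').digest('hex');
  }
  return 'plain payload';
}
```

The import cost is paid only on the code path that needs the module, and only on its first use (Node.js caches the loaded module afterwards).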

How does the Node.js event loop relate to worker threads?

Worker threads run inside the same process, but each has its own event loop and JavaScript heap. CPU-intensive tasks can be offloaded to worker threads so they do not block the main event loop. Asynchronous file I/O already runs on libuv's thread pool under the hood - you do not need to manually move file operations to worker threads.

What about reading environment variables with process.env?

process.env reads from memory, not from disk - it is synchronous but does not involve I/O. Accessing process.env.MY_VAR is safe in any context. The issue is only with actual disk reads and writes via the fs module.

Will ESLint catch readFileSync in route handlers?

Not by default. There is no built-in ESLint rule that detects sync fs operations. Some custom rule sets and plugins address this, but it is not covered by standard configurations. Static analysis tools like VibeDoctor's PERF-002 check specifically look for *Sync calls inside async function bodies and flag them as event loop blockers.

Scan your codebase for this issue - free

VibeDoctor checks for PERF-002 and 128 other issues across 15 diagnostic areas.

SCAN MY APP →