Make Your CLI Tool 10x Faster: Performance Patterns for Node.js
Users notice slow CLI tools. If --help takes a second to appear, if a simple check takes 3 seconds when it should take 300ms, if startup feels sluggish — users will find alternatives. Speed is a feature.
This article covers the performance patterns that make the biggest difference in Node.js CLI tools: lazy imports, parallel I/O, streaming over buffering, caching, and startup optimization.
1. Lazy Imports: The Biggest Win
The #1 performance killer in CLI tools is importing everything at startup. If your tool has 10 commands but only runs one at a time, why load all 10?
```ts
// SLOW: imports everything on startup (~500ms)
import lighthouse from 'lighthouse'; // Heavy module
import * as chromeLauncher from 'chrome-launcher';
import Table from 'cli-table3';
import { createCanvas } from 'canvas'; // C++ addon, slow to load
```

```ts
// FAST: only import what's needed for the specific command (~50ms)
program
  .command('audit <url>')
  .action(async (url) => {
    const { default: lighthouse } = await import('lighthouse');
    const chromeLauncher = await import('chrome-launcher');
    // These heavy modules now load only when the audit command actually runs
  });

program
  .command('report')
  .action(async () => {
    const Table = (await import('cli-table3')).default;
    // cli-table3 only loads for the report command
  });
```
To measure, wrap your startup in performance.now():

```ts
const start = performance.now();
// ... imports and setup ...
const startupMs = Math.round(performance.now() - start);

if (process.argv.includes('--debug-timing')) {
  process.stderr.write(`Startup: ${startupMs}ms\n`);
}
```
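For finer-grained timing, Node's built-in performance.mark/performance.measure from node:perf_hooks can label individual startup phases. A sketch (the phase names are illustrative):

```ts
import { performance } from 'node:perf_hooks';

performance.mark('boot-start');
// ... imports, arg parsing, config loading ...
performance.mark('boot-end');

// measure() returns an entry whose duration is the gap between the two marks
const { duration } = performance.measure('boot', 'boot-start', 'boot-end');
if (process.argv.includes('--debug-timing')) {
  process.stderr.write(`boot: ${duration.toFixed(1)}ms\n`);
}
```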
2. Parallel I/O
Don't await things sequentially when they're independent:
```ts
// SLOW: sequential (~1500ms for 3 API calls)
const user = await fetchUser(id);
const repos = await fetchRepos(id);
const stats = await fetchStats(id);

// FAST: parallel (~500ms — only as slow as the slowest call)
const [user, repos, stats] = await Promise.all([
  fetchUser(id),
  fetchRepos(id),
  fetchStats(id),
]);
```
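One caveat: Promise.all rejects as soon as any call fails, discarding the other results. When partial results are still useful, Promise.allSettled collects every outcome. A sketch (the helper name is made up for illustration):

```ts
// Run independent tasks in parallel but keep whatever succeeded,
// instead of failing the whole batch on the first rejection.
async function settleAll<T>(tasks: (() => Promise<T>)[]) {
  const settled = await Promise.allSettled(tasks.map(t => t()));
  return {
    ok: settled
      .filter(s => s.status === 'fulfilled')
      .map(s => (s as PromiseFulfilledResult<T>).value),
    failed: settled
      .filter(s => s.status === 'rejected')
      .map(s => String((s as PromiseRejectedResult).reason)),
  };
}
```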
For file operations:
```ts
// SLOW: read files one by one
const results = [];
for (const file of files) {
  const content = await readFile(file, 'utf-8');
  results.push(processFile(content));
}
```

```ts
// FAST: read all files in parallel (with a concurrency limit)
async function parallelMap<T, R>(
  items: T[],
  fn: (item: T) => Promise<R>,
  concurrency = 10,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  const executing = new Set<Promise<void>>();
  for (const [i, item] of items.entries()) {
    // Write by index so results keep input order, and have each
    // promise remove itself from the in-flight set when it settles
    const p: Promise<void> = fn(item)
      .then(r => { results[i] = r; })
      .finally(() => executing.delete(p));
    executing.add(p);
    if (executing.size >= concurrency) {
      // Wait for any in-flight task to finish before starting another
      await Promise.race(executing);
    }
  }
  await Promise.all(executing);
  return results;
}

// Usage
const results = await parallelMap(files, async (file) => {
  const content = await readFile(file, 'utf-8');
  return processFile(content);
}, 20); // 20 concurrent file reads
```
3. Stream Large Data
Buffering a multi-gigabyte log into a single string exhausts memory; a stream processes it in constant space:
```ts
// SLOW: loads the entire file into memory
const data = await readFile('huge.log', 'utf-8');
const lines = data.split('\n').filter(l => l.includes('ERROR'));
```

```ts
// FAST: constant memory, processes line by line
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

const stream = createReadStream('huge.log');
const rl = createInterface({ input: stream });

const errors: string[] = [];
for await (const line of rl) {
  if (line.includes('ERROR')) errors.push(line);
}
```
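The same principle applies when writing: node:stream's pipeline composes streams with backpressure handled for you. A sketch that compresses a log without ever buffering it:

```ts
import { createReadStream, createWriteStream } from 'node:fs';
import { createGzip } from 'node:zlib';
import { pipeline } from 'node:stream/promises';

// Stream-compress a file: chunks flow read -> gzip -> write,
// so memory use stays constant regardless of file size.
async function compressFile(src: string, dest: string): Promise<void> {
  await pipeline(
    createReadStream(src),
    createGzip(),
    createWriteStream(dest),
  );
}
```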
4. Cache Expensive Operations
Registry lookups and other slow, repeatable calls can be cached on disk with a TTL:
```ts
import { readFile, writeFile, mkdir, stat } from 'node:fs/promises';
import { join } from 'node:path';
import { homedir } from 'node:os';
import { createHash } from 'node:crypto';

const CACHE_DIR = join(homedir(), '.mytool', 'cache');

async function cachedFetch<T>(
  key: string,
  fetcher: () => Promise<T>,
  ttlMs = 3600000, // 1 hour default
): Promise<T> {
  const hash = createHash('md5').update(key).digest('hex');
  const cachePath = join(CACHE_DIR, `${hash}.json`);

  try {
    const stats = await stat(cachePath);
    if (Date.now() - stats.mtimeMs < ttlMs) {
      return JSON.parse(await readFile(cachePath, 'utf-8'));
    }
  } catch {
    // Cache miss or unreadable entry: fall through to fetch
  }

  const result = await fetcher();
  await mkdir(CACHE_DIR, { recursive: true });
  await writeFile(cachePath, JSON.stringify(result));
  return result;
}

// Usage
const packageInfo = await cachedFetch(
  `npm:${packageName}`,
  () => fetch(`https://registry.npmjs.org/${packageName}`).then(r => r.json()),
  300000, // Cache for 5 minutes
);
```
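Any disk cache needs an escape hatch when entries go stale early. A minimal sketch of a hypothetical `cache clear` subcommand that just removes the directory (it gets recreated lazily on the next cached call):

```ts
import { rm } from 'node:fs/promises';

// Delete the whole cache directory; `force: true` makes this
// a no-op if the directory does not exist yet.
async function clearCache(cacheDir: string): Promise<void> {
  await rm(cacheDir, { recursive: true, force: true });
}
```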
5. Worker Threads for CPU-Heavy Tasks
Hashing, parsing, or compressing large inputs blocks the event loop; move that work onto a worker thread:
```ts
import { Worker, isMainThread, parentPort, workerData } from 'node:worker_threads';

function runInWorker<T>(workerPath: string, data: unknown): Promise<T> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerPath, { workerData: data });
    worker.on('message', resolve);
    worker.on('error', reject);
    worker.on('exit', (code) => {
      // Reject if the worker dies before posting a result,
      // so the promise can never hang forever
      if (code !== 0) reject(new Error(`Worker exited with code ${code}`));
    });
  });
}
```

```ts
// In the worker file:
if (!isMainThread) {
  const result = heavyComputation(workerData);
  parentPort!.postMessage(result);
}
```
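Before dispatching to workers, the work has to be divided. A sketch of a round-robin split sized to the machine's core count (adjust if task costs are very uneven):

```ts
import { cpus } from 'node:os';

// Split items into roughly equal chunks, one per CPU core,
// so each worker gets a comparable share of the work.
function chunkForWorkers<T>(items: T[], workers = cpus().length): T[][] {
  const count = Math.min(workers, items.length);
  const chunks: T[][] = Array.from({ length: count }, () => []);
  items.forEach((item, i) => chunks[i % count].push(item));
  return chunks;
}
```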
6. Minimize Startup Dependencies
Audit what loads at startup:
```sh
node -e "
const Module = require('module');
const orig = Module._load;
const loaded = [];
Module._load = function (request, parent) {
  loaded.push(request);
  return orig.apply(this, arguments);
};
require('./bin/mytool.js');
console.log('Modules loaded:', loaded.length);
console.log(loaded.filter(m => !m.startsWith('node:')).join('\n'));
"
```
7. Use --json to Skip Formatting
Formatting tables and colors takes time. When output goes to a pipe, skip it:
```ts
if (!process.stdout.isTTY || options.json) {
  // Fast path: raw JSON, no formatting overhead
  console.log(JSON.stringify(results));
} else {
  // Human path: tables, colors, formatting
  printFormattedReport(results);
}
```
Performance Benchmarks to Target
| Operation | Good | Acceptable | Slow |
|---|---|---|---|
| `--help` | < 100ms | < 300ms | > 500ms |
| `--version` | < 50ms | < 200ms | > 300ms |
| Simple check | < 500ms | < 1s | > 2s |
| Network operation | < 2s | < 5s | > 10s |
| File scan (1000 files) | < 1s | < 3s | > 5s |
Conclusion
CLI performance comes from three principles: don't load what you don't need (lazy imports), don't wait for things that can run simultaneously (parallel I/O), and don't hold in memory what you can stream. Apply these patterns and your tool will feel instant.
Wilson Xu optimizes developer tools for speed. Find his 12+ npm packages at npmjs.com/~chengyixu.