5 Node.js CLI Tools You Can Build in Under an Hour
Building command-line tools is one of the most satisfying things you can do as a developer. There's something deeply rewarding about typing a short command and watching it solve a real problem instantly. And with Node.js, the barrier to entry is remarkably low.
In this tutorial, we'll build five genuinely useful CLI tools from scratch. Each one is under 30 lines of code, solves a real problem you'll encounter daily, and can be published to npm in minutes. By the end, you'll have five tools in your belt and a repeatable workflow for shipping more.
Let's get started.
The Setup: How Every CLI Tool Begins
Every Node.js CLI tool follows the same skeleton. Create a directory, initialize it, and point the bin field at your script:
mkdir my-tool && cd my-tool
npm init -y
Then edit package.json to add a bin entry:
{
  "name": "my-tool",
  "version": "1.0.0",
  "bin": {
    "my-tool": "./index.js"
  }
}
Your index.js file needs exactly one special line at the top:
#!/usr/bin/env node
That shebang tells the operating system to run the file with Node.js. Without it, the file is treated as plain text rather than an executable script.
With that foundation in place, let's build something real.
Tool 1: wordcount — Count Words, Lines, and Characters in Files
What it does: A lightweight alternative to wc that gives you clean, human-readable output for any file.
The classic wc command works fine, but its output is cryptic and its flags are easy to forget. Let's build something friendlier.
The Code
Create a new directory called wordcount, run npm init -y, and create index.js:
#!/usr/bin/env node
const fs = require('fs');
const path = require('path');

const file = process.argv[2];

if (!file) {
  console.error('Usage: wordcount <file>');
  process.exit(1);
}

try {
  const content = fs.readFileSync(path.resolve(file), 'utf-8');
  const lines = content.split('\n').length;
  const words = content.split(/\s+/).filter(Boolean).length;
  const chars = content.length;

  console.log(`  Lines:      ${lines.toLocaleString()}`);
  console.log(`  Words:      ${words.toLocaleString()}`);
  console.log(`  Characters: ${chars.toLocaleString()}`);
  console.log(`  File:       ${path.basename(file)}`);
} catch (err) {
  console.error(`Error: Cannot read file "${file}"`);
  process.exit(1);
}
How It Works
We read the file synchronously (perfectly fine for a CLI tool), split on newlines to count lines, split on whitespace to count words, and use .length for characters. The toLocaleString() call adds commas to large numbers, which is a small touch that makes output much easier to scan.
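The counting logic is easy to exercise on its own. A quick sketch — `countWords` and `countLines` are illustrative helper names, not part of the tool itself:

```javascript
// Split on any run of whitespace; filter(Boolean) drops the empty strings
// produced by leading or trailing whitespace.
function countWords(text) {
  return text.split(/\s+/).filter(Boolean).length;
}

// Note: like the tool above, a file ending in '\n' reports one extra
// "line", because the trailing newline produces an empty final element.
function countLines(text) {
  return text.split('\n').length;
}

console.log(countWords('  hello   world\n')); // 2
console.log(countLines('a\nb\nc'));           // 3
```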
Use Case
You're writing a blog post and the publication requires 2,000-2,500 words. Instead of pasting into an online word counter, just run:
wordcount article.md
Instant answer, no browser required.
Tool 2: jsonformat — Pretty-Print and Validate JSON From Stdin
What it does: Reads JSON from standard input, validates it, and outputs it with clean indentation.
Working with APIs means working with JSON, and raw JSON responses are virtually unreadable. This tool takes messy, compressed JSON and makes it beautiful.
The Code
Create a jsonformat directory, run npm init -y, and create index.js:
#!/usr/bin/env node
let input = '';

process.stdin.setEncoding('utf-8');
process.stdin.on('data', (chunk) => { input += chunk; });
process.stdin.on('end', () => {
  if (!input.trim()) {
    console.error('Error: No input received. Pipe JSON into this command.');
    console.error('Usage: echo \'{"key":"value"}\' | jsonformat');
    process.exit(1);
  }
  try {
    const parsed = JSON.parse(input);
    console.log(JSON.stringify(parsed, null, 2));
  } catch (err) {
    console.error(`Invalid JSON: ${err.message}`);
    const match = err.message.match(/position (\d+)/);
    if (match) {
      const pos = parseInt(match[1], 10);
      const snippet = input.substring(Math.max(0, pos - 20), pos + 20);
      console.error(`Near: ...${snippet}...`);
    }
    process.exit(1);
  }
});

if (process.stdin.isTTY) {
  console.error('Usage: echo \'{"key":"value"}\' | jsonformat');
  process.exit(1);
}
How It Works
The tool reads from standard input using Node's stream interface. When the stream ends, it tries to parse the input as JSON. If parsing succeeds, it pretty-prints with two-space indentation. If it fails, it shows the error message and highlights the area around the problematic character position -- a much more helpful error than what you'd get from a raw JSON.parse crash.
The isTTY check at the bottom catches cases where someone runs jsonformat without piping anything in, and shows usage instead of hanging.
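The error-snippet trick is worth isolating. One caveat: the `position N` text is an implementation detail of V8's `JSON.parse` error messages, not a standard, which is why the match is treated as optional. A sketch (`extractSnippet` is an illustrative helper name):

```javascript
// Pull the character offset out of a JSON.parse error message, if present,
// and return the surrounding slice of input; null when no position is found.
function extractSnippet(input, message) {
  const match = message.match(/position (\d+)/);
  if (!match) return null;
  const pos = parseInt(match[1], 10);
  return input.substring(Math.max(0, pos - 20), pos + 20);
}

const bad = '{"a": 1,}'; // trailing comma: invalid JSON
try {
  JSON.parse(bad);
} catch (err) {
  console.error(`Invalid JSON: ${err.message}`);
  const snippet = extractSnippet(bad, err.message);
  if (snippet) console.error(`Near: ...${snippet}...`);
}
```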
Use Case
You just hit an API with curl and the response is a 500-character single line of JSON:
curl -s https://api.github.com/users/octocat | jsonformat
Instantly readable.
Tool 3: base64cli — Encode and Decode Base64 From the Terminal
What it does: Encodes or decodes Base64 strings directly from your terminal, handling both text and file input.
Base64 encoding comes up constantly -- embedding images, reading JWT tokens, working with authentication headers. Most developers Google "base64 encode online" every time. Let's fix that.
The Code
Create a base64cli directory, run npm init -y, and create index.js:
#!/usr/bin/env node
const fs = require('fs');

const args = process.argv.slice(2);
const mode = args[0];
const input = args.slice(1).join(' ');

function showUsage() {
  console.log('Usage:');
  console.log('  base64cli encode <text>');
  console.log('  base64cli decode <base64string>');
  console.log('  base64cli encode-file <filepath>');
  process.exit(1);
}

if (!mode || !input) showUsage();

switch (mode) {
  case 'encode':
    console.log(Buffer.from(input).toString('base64'));
    break;
  case 'decode':
    // Buffer.from(str, 'base64') never throws -- it silently skips
    // characters it can't decode -- so validate up front. The - and _
    // characters allow base64url input (e.g. JWT segments).
    if (!/^[A-Za-z0-9+/_-]*={0,2}$/.test(input)) {
      console.error('Error: Invalid base64 input');
      process.exit(1);
    }
    console.log(Buffer.from(input, 'base64').toString('utf-8'));
    break;
  case 'encode-file':
    try {
      const data = fs.readFileSync(input);
      console.log(data.toString('base64'));
    } catch {
      console.error(`Error: Cannot read file "${input}"`);
      process.exit(1);
    }
    break;
  default:
    showUsage();
}
How It Works
Node's built-in Buffer class handles all the heavy lifting. Buffer.from(string).toString('base64') encodes, and Buffer.from(string, 'base64').toString('utf-8') decodes. We add a third mode, encode-file, that reads a binary file and outputs its Base64 representation -- handy for embedding small images in CSS or HTML.
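The core of the tool is a round trip through Buffer, which you can verify in a few lines. (JWT segments use the base64url alphabet, with - and _ in place of + and /; Node's base64 decoder accepts both, and newer Node versions also expose 'base64url' as an explicit encoding name.)

```javascript
// Encode: text -> base64
const encoded = Buffer.from('username:password').toString('base64');
console.log(encoded); // dXNlcm5hbWU6cGFzc3dvcmQ=

// Decode: base64 -> text
const decoded = Buffer.from(encoded, 'base64').toString('utf-8');
console.log(decoded); // username:password
```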
Use Case
You need to create a Basic Auth header for an API request:
base64cli encode "username:password"
# Output: dXNlcm5hbWU6cGFzc3dvcmQ=
Or decode a JWT payload:
base64cli decode "eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIn0"
# Output: {"sub":"1234567890","name":"John Doe"}
Tool 4: randomgen — Generate Random Strings, UUIDs, and Passwords
What it does: Generates cryptographically secure random strings, UUIDs, and passwords with a single command.
Every developer needs random strings constantly: API keys for testing, temporary passwords, unique identifiers. This tool makes it instant.
The Code
Create a randomgen directory, run npm init -y, and create index.js:
#!/usr/bin/env node
const crypto = require('crypto');

const type = process.argv[2];
const length = parseInt(process.argv[3], 10) || 16;

function uuid() {
  return crypto.randomUUID();
}

function randomString(len, chars) {
  let result = '';
  const bytes = crypto.randomBytes(len);
  for (let i = 0; i < len; i++) {
    result += chars[bytes[i] % chars.length];
  }
  return result;
}

switch (type) {
  case 'string':
    console.log(randomString(length, 'abcdefghijklmnopqrstuvwxyz0123456789'));
    break;
  case 'password':
    console.log(randomString(length, 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'));
    break;
  case 'hex':
    console.log(crypto.randomBytes(length).toString('hex'));
    break;
  case 'uuid':
    console.log(uuid());
    break;
  case 'number':
    console.log(randomString(length, '0123456789'));
    break;
  default:
    console.log('Usage: randomgen <type> [length]');
    console.log('Types: string, password, hex, uuid, number');
    process.exit(1);
}
How It Works
We use Node's crypto module, which provides cryptographically secure random number generation. Unlike Math.random(), crypto.randomBytes() produces values suitable for security-sensitive applications like password generation. The randomString helper maps random bytes onto a character set, and we provide five distinct modes for different situations.
Note that crypto.randomUUID() has been available since Node 14.17 and generates RFC 4122 version 4 UUIDs natively, with no dependencies needed.
Use Case
Need a secure password for a new service account?
randomgen password 24
# Output: kQ7#mR2&xP9!nL4@vB8$wT3f
Need a UUID for a database seed?
randomgen uuid
# Output: f47ac10b-58cc-4372-a567-0e02b2c3d479
Need a hex token for an API key?
randomgen hex 32
# Output: a3f2b8c1d4e5f6789012345678abcdef...
Tool 5: urlencode — URL Encode and Decode From the Terminal
What it does: Encodes or decodes URL components directly from the command line, handling special characters correctly.
If you've ever manually replaced spaces with %20 or tried to decode a query string full of percent-encoded characters, you know this pain. Let's kill it.
The Code
Create a urlencode directory, run npm init -y, and create index.js:
#!/usr/bin/env node
const mode = process.argv[2];
const input = process.argv.slice(3).join(' ');

if (!mode || !input) {
  console.log('Usage:');
  console.log('  urlencode encode <text>');
  console.log('  urlencode decode <encoded-text>');
  console.log('  urlencode params <key=value&key2=value2>');
  process.exit(1);
}

switch (mode) {
  case 'encode':
    console.log(encodeURIComponent(input));
    break;
  case 'decode':
    try {
      console.log(decodeURIComponent(input));
    } catch {
      console.error('Error: Invalid URL-encoded input');
      process.exit(1);
    }
    break;
  case 'params': {
    const params = new URLSearchParams(input);
    const maxKey = Math.max(...[...params.keys()].map((k) => k.length));
    for (const [key, value] of params) {
      console.log(`  ${key.padEnd(maxKey)} = ${value}`);
    }
    break;
  }
  default:
    console.error(`Unknown mode: "${mode}". Use encode, decode, or params.`);
    process.exit(1);
}
How It Works
JavaScript's built-in encodeURIComponent and decodeURIComponent handle the encoding correctly according to the RFC 3986 standard. The bonus params mode parses a full query string using the URLSearchParams API and displays each key-value pair in a clean, aligned table. This is incredibly useful when debugging OAuth callbacks or API redirects where the query string is a wall of encoded text.
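The distinction between encodeURIComponent and its sibling encodeURI is the usual stumbling block: the former escapes everything with special meaning in a URL, while the latter leaves URL structure (/, ?, &, =) intact. That's why the tool uses the former. A quick demonstration, including the URLSearchParams parsing:

```javascript
// encodeURIComponent escapes &, =, spaces, etc. -- safe for a single value.
console.log(encodeURIComponent('read write')); // read%20write
console.log(encodeURIComponent('a=1&b=2'));    // a%3D1%26b%3D2

// encodeURI preserves URL structure -- = and & pass through unchanged.
console.log(encodeURI('a=1&b=2'));             // a=1&b=2

// URLSearchParams decodes percent-escapes as it parses.
const params = new URLSearchParams('scope=read%20write&state=abc123');
console.log(params.get('scope'));              // read write
```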
Use Case
You're debugging an OAuth redirect and the URL contains:
?redirect_uri=https%3A%2F%2Fapp.example.com%2Fcallback%3Fstate%3Dabc123&scope=read%20write
Just decode it:
urlencode decode "https%3A%2F%2Fapp.example.com%2Fcallback%3Fstate%3Dabc123"
# Output: https://app.example.com/callback?state=abc123
Or parse the full query string:
urlencode params "redirect_uri=https://app.example.com&scope=read write&state=abc123"
# Output:
# redirect_uri = https://app.example.com
# scope = read write
# state = abc123
Publishing to npm: The Five-Minute Process
You've built your tool. Now let's put it on npm so anyone can install it with a single command. The process is identical for all five tools.
Step 1: Polish Your package.json
Make sure these fields are filled in:
{
  "name": "your-tool-name",
  "version": "1.0.0",
  "description": "A one-sentence description of what it does",
  "bin": {
    "your-tool-name": "./index.js"
  },
  "keywords": ["cli", "tool", "terminal"],
  "author": "Your Name",
  "license": "MIT"
}
The name field must be unique on npm. Check availability at https://www.npmjs.com/package/your-tool-name before publishing. If the name is taken, prefix it with your npm username: @yourusername/your-tool-name. Note that scoped packages are private by default, so the first publish needs npm publish --access public.
Step 2: Make Your Script Executable
chmod +x index.js
Step 3: Test Locally
npm link
your-tool-name --help
npm link creates a global symlink so you can test the command as if it were installed globally. Fix any issues you find.
Step 4: Publish
npm login
npm publish
That's it. Your tool is now available to every developer in the world. Anyone can install it with:
npm install -g your-tool-name
Step 5: Iterate
After publishing, listen to feedback. Add a --help flag. Add a --version flag. Handle edge cases. Each improvement is just another npm publish away (after bumping the version in package.json).
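Flag handling doesn't need a library at this scale. A minimal sketch of --help and --version support — parseFlags is an illustrative helper name, and in a real tool you'd read the version from package.json rather than hardcoding it:

```javascript
// Scan the argument list for the standard help/version flags.
function parseFlags(argv) {
  return {
    help: argv.includes('--help') || argv.includes('-h'),
    version: argv.includes('--version') || argv.includes('-v'),
  };
}

const flags = parseFlags(process.argv.slice(2));
if (flags.version) {
  // Better in practice: console.log(require('./package.json').version);
  // so the flag can never drift out of sync with the published version.
  console.log('1.0.0');
  process.exit(0);
}
if (flags.help) {
  console.log('Usage: my-tool <args>');
  process.exit(0);
}
```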
Lessons From Building Small Tools
Building these five tools teaches you several important principles that scale to larger projects:
Standard input is powerful. The jsonformat tool reads from stdin, which means it composes with every other command-line tool via pipes. Design your tools to work with pipes whenever possible. A tool that reads from stdin and writes to stdout fits into the Unix ecosystem like a Lego brick.
Node's standard library is enough. None of these tools required a single dependency. The fs, crypto, path, and Buffer modules covered everything. Fewer dependencies mean faster installs, fewer security risks, and less maintenance.
Error messages matter. Every tool above provides clear, actionable error messages. When jsonformat encounters invalid JSON, it shows you where the error is. When wordcount can't find a file, it tells you which file. Good error messages are the difference between a tool people use once and a tool people rely on.
Small tools compose into workflows. Chain these tools together:
curl -s https://api.example.com/data | jsonformat | wordcount
Each tool does one thing well, and together they form a pipeline that's more powerful than any single monolithic tool.
What To Build Next
Once you've got the workflow down, ideas are everywhere. A few suggestions:
- hashfile -- compute MD5/SHA256 hashes of files
- csv2json -- convert CSV files to JSON
- colorpick -- convert between hex, RGB, and HSL color formats
- portcheck -- check if a port is in use on localhost
- envdiff -- compare two .env files and show differences
The pattern is always the same: identify a small annoyance, write 20-30 lines of code, publish, and move on. Over time, you'll build a personal toolkit that saves you hours every week.
Happy building.