From 0 to 60 npm Packages: What I Learned Building CLI Tools at Scale
An honest retrospective on mass-producing open source tools — the portfolio effect, the download reality, and why you should probably build 5 excellent tools instead.
There is a particular flavor of madness that strikes developers at 2 AM. It whispers: "You know what the world needs? Another CLI tool." Most people hear this voice once, build a todo app, and move on. I heard it sixty times in two sessions and actually shipped every single one.
Over the course of roughly 48 hours, I went from zero published npm packages to sixty. Not prototypes. Not half-finished repos. Sixty published, installable, documented CLI tools — each with a README.md, a package.json, TypeScript source, and a real entry on the npm registry.
This is the story of what that looked like, what I learned, and why I'd tell you not to do it the way I did.
The Journey: Zero to Sixty
It started innocently. I wanted a tool to snapshot web pages as readable text for LLM context windows. That became websnap-reader. Then I needed something to scan GitHub for bounties. That became ghbounty. Then a pitch generator for developer articles. Then a price monitor. Then a README generator.
Five tools in, I had a system. A template. A rhythm. And once you have a rhythm for publishing npm packages, the marginal cost of the next one drops to almost nothing. Tool six took twenty minutes. Tool fifteen took twelve. By tool forty, I could scaffold, write, test, and publish a new CLI tool in under eight minutes.
The first session produced 19 tools. The second session — fueled by the momentum of the first — produced 41 more. By the end, I had 60 packages on npm, covering everything from git workflow automation to security scanners to file manipulation utilities.
Was this a good idea? Let me get back to you on that.
The Categories
When you build sixty tools without a master plan, you end up with an accidental taxonomy. Here's roughly how mine broke down:
Git Tools (10 packages)
These were the most natural starting point. Every developer has git friction, and every git friction point is a potential CLI tool. I built tools for interactive branch cleanup, commit message linting, repo statistics, changelog generation, merge conflict helpers, git hook managers, and more. The git category had the highest density of "actually useful" tools because the problem space is so well-defined.
npm Ecosystem Tools (8 packages)
Meta, I know — using npm to publish tools about npm. These included dependency auditors, package.json validators, version bump helpers, publish checklist enforcers, and download trackers. Building tools for the ecosystem you're publishing to creates a satisfying recursive loop, even if nobody else appreciates the irony.
API & Network Tools (7 packages)
REST API testers, endpoint monitors, webhook debuggers, rate limit trackers. These tools taught me the most about error handling because network operations fail in creative and unpredictable ways. A CLI tool that calls an API needs to handle timeouts, auth failures, rate limits, DNS resolution errors, malformed responses, and the case where the API just... doesn't exist anymore.
File & Text Manipulation (6 packages)
Bulk renamers, format converters, template engines, text transformers. These are the bread-and-butter Unix philosophy tools — read from stdin, do one thing, write to stdout. They were the fastest to build and the most satisfying to use because the feedback loop is instant.
Developer Environment (5 packages)
Dotfile managers, environment variable validators, project scaffolders, development server helpers, and workspace organizers. These tools solve real problems but face the toughest competition because every developer has strong opinions about their environment setup.
Security & Analysis (4 packages)
Dependency vulnerability scanners, secret detectors, license compliance checkers, and code complexity analyzers. These were the most technically challenging because they required parsing and understanding code structure rather than just manipulating text.
Everything Else (20 packages)
Price monitors. README generators. Pitch drafters. Bounty scanners. Article formatters. Markdown linters. Color palette generators. Placeholder image tools. Lorem ipsum variants. A CLI tool that generates names for CLI tools (yes, really). The "misc" category is where ambition meets reality — these tools exist because I had momentum and a template, not because the world was crying out for another lorem ipsum generator.
The 5 Tools I'm Most Proud Of
1. websnap-reader
This was the first tool and remains the most genuinely useful. It takes a URL, renders the page, strips the chrome, and outputs clean readable text optimized for LLM context windows. In a world where everyone is building AI wrappers, a tool that cleanly bridges the web-to-LLM gap actually fills a real niche. It handles JavaScript-rendered pages, strips navigation and ads, preserves document structure, and outputs token-efficient text.
$ websnap-reader https://example.com/blog-post
# Blog Post Title
By Author Name | March 2026
The actual content of the article, cleanly extracted
and formatted for readability...
[2,847 tokens | 3.2s render time]
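Under the hood, the hard part is deciding what counts as chrome. As a rough illustration (a simplified stand-in, not websnap-reader's actual implementation, which renders JavaScript pages and preserves document structure), the stripping step looks something like:

```typescript
// Naive "strip the chrome" pass: drop elements that are never content,
// flatten remaining tags, collapse whitespace. The real tool renders
// JavaScript pages and preserves headings/structure; this sketch does not.
function htmlToText(html: string): string {
  return html
    .replace(/<(script|style|nav|header|footer)[\s\S]*?<\/\1>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

const page = `<html><body>
  <nav>Home | About</nav>
  <article><h1>Blog Post Title</h1><p>The actual content.</p></article>
  <footer>(c) 2026</footer>
</body></html>`;

console.log(htmlToText(page)); // "Blog Post Title The actual content."
```

Regex-based stripping breaks on adversarial markup, which is exactly why the real pipeline needs a proper renderer.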
2. ghbounty
A GitHub bounty scanner that monitors repositories with active bounty programs — Algora, Expensify, cal.com, and others. It filters by language, bounty amount, and freshness. The tool taught me that in the bounty world, speed is everything. By the time you've analyzed an issue, someone has already submitted a PR. The 30-minute window for Expensify bounties is particularly ruthless.
$ ghbounty scan --min-value 100 --lang typescript
$500 expensify/app #45231 - Fix PDF rendering in chat
$250 calcom/cal.com #12847 - Timezone edge case
$150 asyncapi/studio #891 - Schema validation bug
Found 3 bounties matching criteria (scanned 847 issues)
3. repo-readme-gen
This tool analyzes a repository's structure, dependencies, and code patterns, then generates a comprehensive README. It's not perfect — no generated documentation ever is — but it gets you 70% of the way there in seconds instead of the hour it takes to write a good README from scratch. The output includes badges, installation instructions, API documentation pulled from JSDoc comments, and even a contributing guide.
4. pricemon
A price monitoring CLI that tracks products across multiple retailers and alerts on drops. The interesting technical challenge was handling the variety of ways different sites structure their pricing — some use microdata, some use specific CSS classes, some render prices client-side. Building a robust price extractor is essentially building a mini web scraper that has to handle dozens of edge cases.
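A hedged sketch of that multi-strategy extraction, with illustrative patterns rather than pricemon's real ones:

```typescript
// Hypothetical multi-strategy price extraction: try structured microdata
// first, then common price-class markup, then a bare-text fallback. Real
// sites need many more cases (currencies, locales, thousands separators).
function extractPrice(html: string): number | null {
  const strategies: RegExp[] = [
    /itemprop="price"\s+content="([\d.]+)"/,                // schema.org microdata
    /class="[^"]*price[^"]*"[^>]*>\s*\$?([\d,]+\.?\d*)/,    // price CSS class
    /\$\s*([\d,]+\.\d{2})/,                                 // bare "$1,299.00" in text
  ];
  for (const re of strategies) {
    const m = html.match(re);
    if (m) return parseFloat(m[1].replace(/,/g, ""));
  }
  return null; // client-side rendered prices need a real browser
}

console.log(extractPrice('<meta itemprop="price" content="49.99">'));      // 49.99
console.log(extractPrice('<span class="product-price">$1,299.00</span>')); // 1299
```

The ordering matters: structured data is the most reliable signal, so it wins when present.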
5. devpitch
A developer article pitch generator that takes a topic, researches existing coverage, identifies angles that haven't been covered, and drafts a pitch email. This tool is useful because the hardest part of writing paid articles isn't the writing — it's convincing an editor that your angle is fresh. Having a tool that does competitive analysis on existing articles saves real time.
The 5 Tools Nobody Will Ever Use
1. cli-name-gen
A CLI tool that generates names for CLI tools. I built it because I was running out of creative names around tool thirty. It combines action verbs with tech nouns and checks npm for availability. It's the most self-referential thing I've ever built, and its total lifetime downloads will probably be in the single digits — all from me.
2. Lorem Ipsum Variant #4
I built four different lorem ipsum generators with different themes. Developer jargon lorem ipsum. Startup pitch lorem ipsum. Error message lorem ipsum. The fourth one generates lorem ipsum text that is itself valid JavaScript. Nobody needs this. I built it because the template was right there and the idea made me laugh.
3. color-name-cli
Give it a hex code, it tells you the nearest named CSS color. That's it. The entire useful surface area of this tool can be replicated by googling "hex to color name." But it works offline, so there's that.
4. dep-age
Tells you how old each of your npm dependencies is. Not whether they're outdated — npm outdated does that. Just how old they are in human-readable time. "express: 2 years, 4 months, 12 days old." It's technically interesting and practically useless. But building it taught me about the npm registry API, which fed into three other more useful tools.
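The age formatting itself is simple calendar arithmetic. A sketch of the idea (not dep-age's exact code; borrowing 30 days for a month underflow is an approximation):

```typescript
// Sketch of dep-age's output formatting: publish date to
// "X years, Y months, Z days old". Calendar math is approximate
// (a borrowed month counts as 30 days), which is fine for a label.
function humanAge(published: Date, now: Date): string {
  let years = now.getUTCFullYear() - published.getUTCFullYear();
  let months = now.getUTCMonth() - published.getUTCMonth();
  let days = now.getUTCDate() - published.getUTCDate();
  if (days < 0) { days += 30; months -= 1; }
  if (months < 0) { months += 12; years -= 1; }
  return `${years} years, ${months} months, ${days} days old`;
}

console.log(humanAge(new Date("2023-11-03"), new Date("2026-03-15")));
// "2 years, 4 months, 12 days old"
```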
5. git-yolo
Stages everything, commits with "yolo" as the message, and force-pushes to main. I built it as a joke. It has a --are-you-sure flag that defaults to false. I'm still terrified someone will actually use it in production. The README is 80% warnings.
Why did I build tools I knew nobody would use? Because each one took less than ten minutes, each one taught me something small, and collectively they padded out the portfolio in ways that — unexpectedly — mattered. More on that later.
Technical Evolution: Tool 1 to Tool 60
The code quality difference between my first and sixtieth tool is embarrassing. Here's what changed:
Error Handling. Tool 1 had try-catch blocks that logged "Something went wrong." Tool 60 has typed error hierarchies, user-friendly messages with suggested fixes, debug flags that show stack traces, and proper exit codes. The evolution happened gradually — every time a tool failed in a confusing way, I improved the error handling pattern and carried it forward.
CLI Argument Parsing. I started with manual process.argv parsing. Moved to yargs. Eventually standardized on commander with a wrapper that enforced consistent --help output, version flags, and configuration file support. By tool 40, adding a new CLI flag to any tool was a one-line change.
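The commander wrapper itself is too long to show, but the core idea, every tool getting --help and --version for free on top of its own flags, can be sketched dependency-free with Node's built-in util.parseArgs (the real wrapper used commander, so treat this as an analogy, not the actual code):

```typescript
import { parseArgs, type ParseArgsConfig } from "node:util";

type Flags = NonNullable<ParseArgsConfig["options"]>;

// Every tool passes only its own flags; --help/-h and --version/-v
// are injected by the wrapper so behavior is consistent across tools.
function runCli(name: string, version: string, flags: Flags, argv: string[]) {
  const options: Flags = {
    ...flags,
    help: { type: "boolean", short: "h" },
    version: { type: "boolean", short: "v" },
  };
  const { values, positionals } = parseArgs({ args: argv, options, allowPositionals: true });
  if (values.version) return `${name} v${version}`;
  if (values.help) {
    return `Usage: ${name} [options]\nFlags: ${Object.keys(options).map(f => `--${f}`).join(" ")}`;
  }
  return { values, positionals };
}

console.log(runCli("my-tool", "1.0.0", { json: { type: "boolean" } }, ["--version"]));
// "my-tool v1.0.0"
```

With a wrapper like this, adding a flag to any tool really is a one-line change to the `flags` object.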
TypeScript Strictness. Early tools had any scattered everywhere. By the end, I was running strict mode with noImplicitAny, noUnusedLocals, and exhaustive switch checking. The type safety caught real bugs — especially in tools that parsed external data.
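For reference, that strictness roughly corresponds to a tsconfig.json like the one below (a representative sketch, not my exact file; exhaustive switch checking is enforced in code with a never-typed default case rather than a compiler flag, with noFallthroughCasesInSwitch being the closest tsconfig option):

```json
{
  "compilerOptions": {
    "strict": true,
    "noImplicitAny": true,
    "noUnusedLocals": true,
    "noFallthroughCasesInSwitch": true,
    "target": "ES2020",
    "module": "commonjs",
    "outDir": "dist"
  },
  "include": ["src"]
}
```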
Testing. The first twenty tools had no tests. Then I had a bad publish where a tool crashed on Node 18 because I used a Node 20 API. After that, every tool got at least a smoke test that verified it could run --help and --version without crashing, and basic input/output tests for core functionality.
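The smoke test pattern is almost trivially small. A runnable sketch (it spawns node --version as a stand-in for the built CLI, which in a real tool would be something like dist/index.js):

```typescript
import { execFileSync } from "node:child_process";

// Smoke test: spawn the binary, verify it exits 0 with non-empty output.
// execFileSync throws on a non-zero exit, so surviving the call is the test.
function smoke(bin: string, args: string[]): string {
  const out = execFileSync(bin, args, { encoding: "utf8" }).trim();
  if (out.length === 0) throw new Error(`${bin} ${args.join(" ")} printed nothing`);
  return out;
}

// Real usage would be: smoke(process.execPath, ["dist/index.js", "--version"])
console.log("smoke ok:", smoke(process.execPath, ["--version"]));
```

This catches exactly the class of bug described above: a tool that crashes on startup because of an API mismatch between Node versions.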
Output Formatting. Early tools dumped raw text. Later tools used chalk for coloring, ora for spinners, and cli-table3 for structured output. Small touch, but it makes the difference between a tool that feels like a hack and one that feels like a product.
The Template: My Standardized Project Structure
By tool fifteen, I had a template that I could clone and modify:
my-tool/
  src/
    index.ts       # CLI entry point
    lib.ts         # Core logic (no CLI dependencies)
    types.ts       # TypeScript interfaces
    errors.ts      # Custom error classes
  tests/
    smoke.test.ts  # Can it run without crashing?
    core.test.ts   # Does the core logic work?
  package.json     # Standardized scripts and metadata
  tsconfig.json    # Strict mode, ES2020 target
  README.md        # Generated then hand-edited
  LICENSE          # MIT, always
  .npmignore       # Keep published package clean
The key insight was separating lib.ts from index.ts. The core logic should be importable as a library. The CLI is just one interface to that logic. This separation made testing dramatically easier and opened the door for programmatic usage.
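Here is the shape of that split, compressed into one runnable sketch around a hypothetical slugify tool (in the template, lib.ts and index.ts are separate files):

```typescript
// lib.ts — core logic with no CLI dependencies. In the real template this
// file exports the function; it is shown inline so the sketch is self-contained.
function slugify(input: string): string {
  return input
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

// index.ts — the CLI is just one thin interface over the library:
// parse argv, call the library, write to stdout. Nothing else.
function main(argv: string[]): void {
  process.stdout.write(slugify(argv.join(" ")) + "\n");
}

main(["Hello, World!"]); // prints "hello-world"
```

Because slugify never touches process.argv or stdout, the core tests can call it directly, and anyone can import the library without dragging in CLI machinery.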
The package.json had standardized scripts: build, test, lint, prepublish. The prepublish script ran the linter and tests automatically, which saved me from publishing broken tools at least a dozen times.
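A representative package.json for the template might look like the following (a sketch, not my literal file; note that modern npm has deprecated the prepublish lifecycle name, and prepublishOnly is what reliably runs before npm publish, so that is what the sketch uses):

```json
{
  "name": "my-tool",
  "version": "1.0.0",
  "bin": { "my-tool": "dist/index.js" },
  "scripts": {
    "build": "tsc",
    "test": "node --test",
    "lint": "eslint src",
    "prepublishOnly": "npm run lint && npm test && npm run build"
  },
  "files": ["dist"]
}
```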
The Publishing Pipeline
Speed was everything. Here's the pipeline that let me publish a new tool in under ten minutes:
- Clone template (30 seconds). Copy the scaffold, update the name and description.
- Write core logic (3-7 minutes). This is the actual creative work. Everything else is mechanical.
- Smoke test locally (30 seconds). Run it. Does it work? Does --help look right?
- Publish (1 minute). npm publish --access public. The prepublish hook runs the build and tests automatically.
- Verify (30 seconds). npx my-tool --version from a clean directory. If it works, move on.
No CI/CD pipeline. No GitHub Actions. No staging environments. For tools that might have zero users, the overhead of a full DevOps setup isn't justified. I know this is heresy. I don't care. The tools work, and if one of them gains traction, I can add CI later.
Downloads Reality: The Honest Numbers
Let's talk about the elephant in the registry. Here are my actual download numbers after publishing sixty tools:
- Tools with 50+ weekly downloads: 2
- Tools with 10-50 weekly downloads: 5
- Tools with 1-9 weekly downloads: 15
- Tools with 0 weekly downloads: 38
That's right. The majority of my tools have literally zero downloads. Not "low downloads." Zero. Nobody has ever installed them. They exist on the npm registry like messages in bottles that washed up on an island with no inhabitants.
The two tools with meaningful downloads are websnap-reader and repo-readme-gen — both solving genuine, immediate problems that developers actually google for. Everything else is either too niche, too redundant with existing tools, or too weird to attract organic discovery.
Here's the thing about npm: discoverability is almost entirely driven by search and word-of-mouth. If your tool name doesn't match what people search for, and you don't actively promote it, it will have zero downloads forever. Publishing alone does nothing. The registry has over two million packages. Yours is noise.
The Portfolio Effect
So why do I not regret building sixty tools that nobody uses? Because the portfolio effect is real and powerful.
When a potential client or employer visits my GitHub profile, they see sixty repositories with consistent code quality, clear documentation, and recent activity. They don't check download numbers. They see volume, consistency, and range. It signals that I can ship — quickly, repeatedly, and across different problem domains.
Three things happened after I hit fifty published packages:
Recruiter outreach increased noticeably. My GitHub profile started appearing in recruiter searches. The sheer volume of public TypeScript code triggered whatever algorithms they use to find candidates.
Bounty proposals got accepted more often. When I submit a proposal to fix an issue, maintainers check my profile. Sixty published packages says "this person ships" more convincingly than any cover letter.
Freelance credibility jumped. On platforms like Upwork, linking to sixty published npm packages differentiates you from other applicants instantly. Clients don't evaluate the tools — they evaluate the evidence of productivity.
Is this gaming the system? Maybe. But the tools are real. The code is real. The TypeScript is real. The documentation is real. I didn't generate garbage — I generated sixty small, functional, well-structured tools. The fact that most of them solve problems nobody has doesn't diminish the demonstrated ability to identify problems, architect solutions, and ship them.
The Advice I'd Give Myself
If I could go back to tool zero, here's what I'd tell myself:
Build 5 excellent tools, not 60 mediocre ones. The portfolio effect is real, but it has diminishing returns. The difference between 5 tools and 15 tools on your profile is significant. The difference between 40 and 60 is negligible. Those last twenty tools could have been twenty iterations on the five that actually had users.
Solve your own problems first. The tools that got downloads were the ones I built because I personally needed them. The tools with zero downloads were the ones I built because I had momentum and a template. Momentum is not product-market fit.
Promote one tool instead of publishing ten more. I spent zero time on marketing. Zero blog posts, zero tweets, zero Show HN posts. If I'd spent the time I used building tools 40-60 on promoting tools 1-5 instead, I'd have more users, more stars, and more career impact.
Tests aren't optional even for small tools. The three tools I published with bugs that I only found after someone actually tried to use them? Those bugs cost me more credibility than the other fifty-seven tools earned me. A broken tool is worse than no tool.
TypeScript from the start. My first few tools were plain JavaScript. Converting them to TypeScript later was painful. Starting with TypeScript is barely slower and saves enormous time when you inevitably need to refactor.
The Bigger Picture
Building sixty npm packages taught me that software development at speed is a skill distinct from software development at quality. Both matter. The best developers can do both. I got very good at one and adequate at the other.
It also taught me that the npm ecosystem is vast, welcoming, and almost entirely ignored. Two million packages, and yet there are still genuine gaps — especially in the CLI tool space, where developers increasingly need bridges between traditional development workflows and AI-assisted ones. The tools that got traction were exactly in that gap.
Would I do it again? Not like this. I'd pick ten tool ideas, build five of them well, promote them actively, iterate based on user feedback, and open-source the process. That's the boring answer, but it's the right one.
Sixty packages taught me how to ship. The next five will teach me how to build something people actually want.
Wilson Xu is a developer who builds CLI tools, writes about development workflows, and occasionally builds tools that generate names for other tools. Find all sixty packages on npm or follow the journey on GitHub.