DEV Community

Wilson Xu
I Published 50 npm Packages in 2 Days: Here's My Assembly Line

Last week, I published 50 npm packages in 48 hours. Not forks. Not empty placeholders. Fifty working CLI tools, each with a TypeScript codebase, compiled binaries, README documentation, and a listing on Gumroad.

Before you call me insane (or worse, a spammer), let me be completely transparent: most of these tools have barely any downloads. A handful have gotten mild traction. I've made exactly $0 in direct revenue from any of them.

So why am I writing this? Because the system I built to do it is genuinely interesting. And the lessons I learned about scale, quality, and the npm ecosystem are worth sharing with anyone who's ever thought about shipping more.

This is the honest story of what happens when you treat npm publishing like a factory floor.


The Origin: Why 50?

It started with a simple observation. I'd been building CLI tools for my own workflows — a git helper here, an API wrapper there — and noticed I was repeating the same setup every time:

  1. mkdir new-tool && cd new-tool
  2. npm init -y
  3. Copy over my tsconfig, install the same deps, write the same arg-parsing boilerplate
  4. Build, test, publish

Each tool took 2-3 hours. But the unique part — the actual logic that made each tool different — was maybe 30 minutes of work. The rest was ceremony.

So I asked: what if I eliminated the ceremony entirely?

What if I could go from "idea" to "published on npm" in 15 minutes?

That question led me down a rabbit hole that ended with 50 packages, an automated pipeline, and a lot of opinions about the npm registry.


The Assembly Line

Here's the system I built, broken into stages:

Stage 1: The Template

Every tool starts from the same TypeScript template. Not a fancy Yeoman generator — just a directory I copy with cp -r:

tool-template/
├── src/
│   └── index.ts          # Entry point with arg parsing
├── tsconfig.json          # Strict mode, ES2020 target
├── package.json           # Pre-configured scripts
├── .gitignore
├── LICENSE                # MIT
└── README.md              # Template with placeholders

The package.json is the real workhorse. It comes pre-loaded with:

{
  "type": "module",
  "bin": {
    "TOOL_NAME": "./dist/index.js"
  },
  "scripts": {
    "build": "tsc",
    "prepublishOnly": "npm run build"
  },
  "devDependencies": {
    "typescript": "^5.4.0"
  }
}

A find-and-replace on TOOL_NAME and TOOL_DESCRIPTION, and the scaffold is ready.
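
That find-and-replace step is easy to script. Here's a minimal sketch of what a scaffold script could look like — `scaffold`, `applySubs`, and the helper names are illustrative, not the exact script I use:

```typescript
import { cpSync, readFileSync, writeFileSync, readdirSync, statSync } from 'node:fs';
import { join } from 'node:path';

// Replace every occurrence of each placeholder in a string.
function applySubs(text: string, subs: Record<string, string>): string {
  return Object.entries(subs).reduce(
    (acc, [key, value]) => acc.split(key).join(value),
    text,
  );
}

// Walk the copied tree and rewrite placeholders in every file.
function replaceInTree(dir: string, subs: Record<string, string>): void {
  for (const entry of readdirSync(dir)) {
    const p = join(dir, entry);
    if (statSync(p).isDirectory()) replaceInTree(p, subs);
    else writeFileSync(p, applySubs(readFileSync(p, 'utf8'), subs));
  }
}

// Copy the template, then fill in the placeholders everywhere they appear.
function scaffold(templateDir: string, targetDir: string, name: string, description: string): void {
  cpSync(templateDir, targetDir, { recursive: true });
  replaceInTree(targetDir, { TOOL_NAME: name, TOOL_DESCRIPTION: description });
}
```

One call to `scaffold('tool-template', 'tools/my-tool', 'my-tool', '...')` and the ceremony is gone.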

Stage 2: The Logic Layer

This is where actual creativity happens. Each tool gets its core logic written in src/index.ts. I broke my tools into categories (more on that below), and within each category, I developed patterns that made writing new tools nearly mechanical.

For example, every "git tool" follows this pattern:

#!/usr/bin/env node
import { execSync } from 'child_process';

const args = process.argv.slice(2);
const flags = Object.fromEntries(
  args.filter(a => a.startsWith('--')).map(a => {
    // Split on the first '=' only, so values like --url=a=b survive intact
    const [k, ...rest] = a.slice(2).split('=');
    return [k, rest.length ? rest.join('=') : true];
  })
);

// Tool-specific logic here
function main() {
  // ...
}

main();

No framework. No commander.js. No yargs. Just raw process.argv parsing. This keeps the dependency count at zero for most tools, which means faster installs and fewer supply chain risks.

Stage 3: Build & Publish

npm run build && npm publish --access public

That's it. The prepublishOnly script ensures TypeScript is compiled before every publish. I run this in a loop across all tool directories:

for dir in tools/*/; do
  (
    cd "$dir" || exit 1
    npm publish --access public 2>&1 | tee -a ../../publish.log
  )
done

Stage 4: Documentation & Distribution

After publishing, each tool gets:

  • A Gumroad listing (free, with a "pay what you want" option)
  • A Dev.to article explaining what it does and why
  • A mention in my "CLI tools collection" README

This is the stage where the assembly line metaphor breaks down slightly, because writing good documentation is inherently un-automatable. More on that later.


The Tool Categories

Here's what 50 tools looks like when you break them into categories:

Git & GitHub Tools (8 tools)

  • ghbounty — Scan GitHub issues for bounty labels and dollar amounts
  • git-standup-report — Generate standup reports from git logs
  • repo-readme-gen — Auto-generate README files from code analysis
  • git-branch-cleaner — Delete merged local branches
  • git-log-viz — Visualize commit history in the terminal
  • pr-status-checker — Check PR statuses across multiple repos
  • commit-msg-lint — Validate commit messages against conventions
  • gh-issue-templater — Generate issue templates from past issues

API & Network Tools (6 tools)

  • websnap-reader — Snapshot web pages to readable text
  • api-health-checker — Monitor API endpoint health
  • pricemon — Track product prices across e-commerce sites
  • webhook-debugger — Debug incoming webhooks locally
  • dns-lookup-cli — Enhanced DNS lookup with formatting
  • curl-to-fetch — Convert curl commands to fetch() code

File & System Tools (5 tools)

  • dir-tree-cli — Print directory trees with filtering
  • file-dedup — Find and remove duplicate files
  • dotfile-sync — Sync dotfiles across machines
  • bulk-rename-cli — Batch rename files with patterns
  • disk-usage-viz — Visualize disk usage in the terminal

Developer Utilities (10 tools)

  • devpitch — Generate project pitch documents
  • json-transform-cli — Transform JSON with jq-like syntax
  • env-validator — Validate .env files against schemas
  • dep-checker — Check for outdated dependencies
  • port-finder-cli — Find available ports
  • regex-tester-cli — Test regex patterns interactively
  • color-converter-cli — Convert between color formats
  • base64-cli — Encode/decode base64 from terminal
  • uuid-gen-cli — Generate UUIDs with format options
  • timestamp-cli — Convert between timestamp formats

AI & Automation Tools (6 tools)

  • prompt-builder — Build structured LLM prompts
  • token-counter-cli — Count tokens for various LLM models
  • ai-commit-msg — Generate commit messages with AI
  • doc-summarizer — Summarize documents using AI
  • code-reviewer-cli — AI-powered code review
  • changelog-gen — Generate changelogs from git history

Data & Format Tools (7 tools)

  • csv-to-json-cli — Convert CSV to JSON (and back)
  • markdown-to-html — Render Markdown in terminal
  • yaml-lint-cli — Lint YAML files
  • toml-to-json — Convert TOML to JSON
  • xml-formatter-cli — Format and validate XML
  • schema-gen — Generate JSON schemas from data
  • data-faker-cli — Generate fake data for testing

Misc & Experimental (8 tools)

  • pomodoro-cli — Terminal Pomodoro timer
  • ascii-art-gen — Generate ASCII art from text
  • qr-gen-cli — Generate QR codes in terminal
  • password-gen-cli — Generate secure passwords
  • habit-tracker-cli — Track daily habits from terminal
  • bookmark-cli — Manage bookmarks from terminal
  • note-cli — Quick terminal notes
  • weather-cli — Check weather from terminal

The 10 Tools That Are Genuinely Useful

Out of 50, I'd say about 10 are tools I actually use myself. That's a 20% hit rate, which sounds bad until you realize most product experiments have worse odds.

Here are the ones I'd actually recommend:

1. websnap-reader

npx websnap-reader https://example.com

Captures a web page and converts it to clean, readable text. I use this constantly for feeding web content into LLM prompts.

2. ghbounty

npx ghbounty scan --min-amount 100

Scans GitHub for issues with bounty labels. Filters by amount, language, and recency. This one was born out of actual need — I hunt bounties regularly.

3. repo-readme-gen

npx repo-readme-gen ./my-project

Analyzes your codebase and generates a README. It's not perfect, but it's a solid starting point that saves 20 minutes per project.

4. git-standup-report

npx git-standup-report --days 7

Generates a formatted standup report from your git log. Surprisingly useful for weekly status updates.
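
Internally, a tool like this boils down to one `git log` invocation plus a grouping pass. A rough sketch — the format string and report layout here are my illustration, not the tool's exact code:

```typescript
import { execSync } from 'node:child_process';

// One commit per line: "YYYY-MM-DD<TAB>subject" (%x09 is a tab).
function gitLog(days: number): string {
  return execSync(
    `git log --since="${days} days ago" --format=%ad%x09%s --date=short`,
    { encoding: 'utf8' },
  );
}

// Group commit lines by day into a Markdown-style standup report.
function formatReport(log: string): string {
  const byDay = new Map<string, string[]>();
  for (const line of log.split('\n')) {
    if (!line.trim()) continue;
    const [date, subject] = line.split('\t');
    byDay.set(date, [...(byDay.get(date) ?? []), subject]);
  }
  return [...byDay]
    .map(([date, subjects]) => `## ${date}\n${subjects.map(s => `- ${s}`).join('\n')}`)
    .join('\n\n');
}

// Usage: console.log(formatReport(gitLog(7)));
```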

5. env-validator

npx env-validator --schema .env.schema --env .env

Validates your .env file against a schema. Catches missing variables before your app crashes in production.
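
The validation itself is simple. A sketch, assuming a schema format of one required key per line (the real tool's schema format may be richer):

```typescript
// Parse KEY=VALUE lines, ignoring comments and blanks.
function parseEnv(text: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue;
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;
    out[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return out;
}

// Report every schema key missing from the .env file.
function missingKeys(schemaText: string, envText: string): string[] {
  const env = parseEnv(envText);
  return schemaText
    .split('\n')
    .map(l => l.trim())
    .filter(l => l && !l.startsWith('#'))
    .filter(key => !(key in env));
}
```

A non-empty result from `missingKeys` means exit code 1 and a list of what's missing — before production finds out for you.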

6. json-transform-cli

echo '{"users": [{"name": "Alice"}]}' | npx json-transform-cli '.users[0].name'

Like jq but for people who can never remember jq syntax.
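
The core of a tool like this is a tiny path evaluator. A sketch supporting just `.key` and `[index]` — the real tool's grammar may cover more:

```typescript
// Evaluate a jq-style path like ".users[0].name" against parsed JSON.
function evalPath(data: unknown, path: string): unknown {
  const tokens = path.match(/\.[A-Za-z_]\w*|\[\d+\]/g) ?? [];
  let current: any = data;
  for (const token of tokens) {
    if (current == null) return undefined; // missing paths resolve to undefined
    current = token.startsWith('.')
      ? current[token.slice(1)]
      : current[Number(token.slice(1, -1))];
  }
  return current;
}
```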

7. token-counter-cli

npx token-counter-cli --model gpt-4 < prompt.txt

Counts tokens for different LLM models. Essential when you're working with context window limits.

8. bulk-rename-cli

npx bulk-rename-cli --pattern '*.jpeg' --replace '.jpg'

Batch rename files with glob patterns. I use this more than I'd like to admit.

9. csv-to-json-cli

npx csv-to-json-cli data.csv > data.json

Simple, zero-config CSV to JSON conversion. Handles edge cases like quoted commas.
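
Quoted commas are exactly where naive `split(',')` converters fall over. A sketch of the RFC 4180-style handling (single-line fields only; the real tool also deals with multiline quoted fields):

```typescript
// Split one CSV record, honoring double-quoted fields ("a, b")
// and escaped quotes ("") per RFC 4180.
function splitCsvLine(line: string): string[] {
  const fields: string[] = [];
  let field = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"' && line[i + 1] === '"') { field += '"'; i++; } // escaped quote
      else if (ch === '"') inQuotes = false;
      else field += ch;
    } else if (ch === '"') inQuotes = true;
    else if (ch === ',') { fields.push(field); field = ''; }
    else field += ch;
  }
  fields.push(field);
  return fields;
}

// Rows become objects keyed by the header row.
function csvToJson(text: string): Record<string, string>[] {
  const [header, ...rows] = text.trim().split('\n').map(splitCsvLine);
  return rows.map(row => Object.fromEntries(header.map((h, i) => [h, row[i] ?? ''])));
}
```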

10. devpitch

npx devpitch ./my-project --format markdown

Generates a pitch document for your project. Useful for writing README introductions and project proposals.


Technical Patterns: What I Reuse Across All 50 Tools

After building this many tools, certain patterns crystallize:

Pattern 1: Zero-Dependency Argument Parsing

I mentioned this earlier, but it deserves emphasis. Only 8 of my 50 tools use external dependencies. The rest rely on a ~20-line argument parser that handles --flag, --key=value, and positional arguments.

Why? Because every dependency is a liability. When you're publishing 50 packages, multiplying dependencies by 50 means 50 chances for a supply chain issue.
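
The fuller version of that parser, with positional arguments, looks roughly like this — a sketch of the pattern, not the exact shared code:

```typescript
interface ParsedArgs {
  flags: Record<string, string | boolean>;
  positional: string[];
}

// Handles --flag, --key=value, and positional arguments in one pass.
function parseArgs(argv: string[]): ParsedArgs {
  const flags: Record<string, string | boolean> = {};
  const positional: string[] = [];
  for (const arg of argv) {
    if (arg.startsWith('--')) {
      const eq = arg.indexOf('=');
      if (eq === -1) flags[arg.slice(2)] = true;        // --verbose
      else flags[arg.slice(2, eq)] = arg.slice(eq + 1); // --out=dist
    } else {
      positional.push(arg);
    }
  }
  return { flags, positional };
}

// Typical call site: const { flags, positional } = parseArgs(process.argv.slice(2));
```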

Pattern 2: The #!/usr/bin/env node Shebang

Every single tool starts with this line. Without it, the file installed as a `bin` entry isn't runnable as a script — on Unix the shell tries to interpret your JavaScript as a shell script, and `npx my-tool` fails with cryptic syntax errors instead of executing your code.

Pattern 3: Graceful Error Handling

process.on('uncaughtException', (err) => {
  console.error(`Error: ${err.message}`);
  process.exit(1);
});

process.on('unhandledRejection', (reason) => {
  // reason may be an Error or any thrown value
  const message = reason instanceof Error ? reason.message : String(reason);
  console.error(`Error: ${message}`);
  process.exit(1);
});

This is at the top of every tool. CLI tools that dump stack traces to users are hostile.

Pattern 4: stdin Piping Support

import { createInterface } from 'readline';

async function readStdin(): Promise<string> {
  if (process.stdin.isTTY) return '';
  const lines: string[] = [];
  const rl = createInterface({ input: process.stdin });
  for await (const line of rl) lines.push(line);
  return lines.join('\n');
}

Every tool that processes text supports piping. This is Unix philosophy 101, but it's easy to forget.

Pattern 5: Colored Output Without Dependencies

const colors = {
  red: (s: string) => `\x1b[31m${s}\x1b[0m`,
  green: (s: string) => `\x1b[32m${s}\x1b[0m`,
  yellow: (s: string) => `\x1b[33m${s}\x1b[0m`,
  blue: (s: string) => `\x1b[34m${s}\x1b[0m`,
  bold: (s: string) => `\x1b[1m${s}\x1b[0m`,
};

No chalk. No kleur. Just ANSI escape codes. Eight lines, zero dependencies.


npm Publishing at Scale: The Ugly Parts

Publishing one package to npm is straightforward. Publishing 50 in two days surfaces problems you never encounter at normal scale.

Rate Limits

npm doesn't officially document their publish rate limits, but I can tell you from experience: after about 15-20 publishes in quick succession, you'll start getting 429 responses. The solution is unglamorous — add a sleep 30 between publishes and go make coffee.
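
If you'd rather not hand-tune the sleep, a small retry wrapper with exponential backoff does the same job. A sketch — the delay schedule is my assumption, since npm doesn't publish the actual limits:

```typescript
// Delay before the nth retry: 30s, 60s, 120s, ... capped at 10 minutes.
function backoffMs(attempt: number): number {
  return Math.min(30_000 * 2 ** attempt, 600_000);
}

// Run a publish step, sleeping progressively longer after each failure.
async function publishWithRetry(run: () => void, maxAttempts = 5): Promise<void> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      run(); // e.g. () => execSync('npm publish --access public')
      return;
    } catch (err) {
      if (attempt === maxAttempts - 1) throw err;
      await new Promise(resolve => setTimeout(resolve, backoffMs(attempt)));
    }
  }
}
```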

Naming Conflicts

Coming up with 50 unique, available package names is harder than writing the actual code. My naming strategy evolved through three phases:

  1. Descriptive names: json-transform — taken. json-transformer — taken. json-transform-cli — available!
  2. Suffix convention: Adding -cli to everything. This actually works well because it signals that the package is a command-line tool.
  3. Scoped packages: When all else fails, @wilsonxu/json-transform is always available. But scoped packages get fewer organic discoveries.
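
Checking availability before you brainstorm too long is scriptable: the public npm registry returns 404 for names that have never been published. A sketch — note this doesn't catch names npm rejects for being too similar to existing packages:

```typescript
// 404 from the registry means the name has never been published.
function isAvailableStatus(status: number): boolean {
  return status === 404;
}

// Probe the public registry (fetch is global in Node 18+).
async function checkName(name: string): Promise<boolean> {
  const res = await fetch(`https://registry.npmjs.org/${encodeURIComponent(name)}`);
  return isAvailableStatus(res.status);
}

// Usage: if (await checkName('json-transform-cli')) console.log('available!');
```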

The prepublishOnly Trap

I had one embarrassing incident where I published a package with a broken build. The prepublishOnly script ran tsc, TypeScript threw errors, but npm published anyway because the dist folder still had stale artifacts from a previous build.

The fix: add rm -rf dist before tsc in your build script.

{
  "scripts": {
    "build": "rm -rf dist && tsc",
    "prepublishOnly": "npm run build"
  }
}
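
One caveat: `rm -rf` won't work in a Windows `cmd` shell. If that matters for your contributors, the same clean step can run through Node itself — a sketch, with the script filename (`clean.js`) being my assumption:

```typescript
import { existsSync, mkdirSync, rmSync } from 'node:fs';

// Portable equivalent of `rm -rf <dir>`; force ignores a missing dir.
function clean(dir: string): void {
  rmSync(dir, { recursive: true, force: true });
}

// In package.json the build line becomes:
//   "build": "node clean.js && tsc"
// (existsSync/mkdirSync are imported only for the demo below)
```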

Version Management

When you're iterating fast, you'll publish patch versions constantly. I wrote a small script to bump versions across all tools:

for dir in tools/*/; do
  (
    cd "$dir" || exit 1
    npm version patch --no-git-tag-version
  )
done

The --no-git-tag-version flag is crucial — without it, npm tries to create a git tag for each version bump, and your repo history becomes unnavigable.


The Honest Truth: Download Numbers

Let me share real numbers, because most "I published X packages" articles conveniently skip this part.

After one week:

| Category | Tools | Total Downloads | Avg per Tool |
| --- | --- | --- | --- |
| Git/GitHub | 8 | 47 | ~6 |
| API/Network | 6 | 31 | ~5 |
| File/System | 5 | 18 | ~4 |
| Dev Utilities | 10 | 89 | ~9 |
| AI/Automation | 6 | 156 | ~26 |
| Data/Format | 7 | 42 | ~6 |
| Misc | 8 | 23 | ~3 |
| **Total** | **50** | **406** | **~8** |

The AI tools got the most traction — token-counter-cli alone accounts for about 80 of those downloads. Everything in the "Misc" category is essentially dead on arrival.

The mean is 8 downloads per tool. The median is 4. The mode is probably 1 (which is just me testing the install).

This is the reality of npm. There are 2.5 million packages on the registry. Your brilliant new CLI tool is competing with 2,499,999 others for attention.


Revenue: The $0 Report

Let's talk money, since that's what everyone really wants to know.

Direct npm revenue: $0. npm doesn't have a monetization model for package authors.

Gumroad listings: $0. I listed 20 of the tools on Gumroad with "pay what you want" pricing. Zero purchases so far.

Article revenue: $0 directly, but the articles I've written about these tools have generated traffic to my GitHub profile, which has led to conversations about freelance work.

Bounty hunting: This is where the tools actually pay off indirectly. Tools like ghbounty help me find bounties faster. I haven't won one yet from this batch (Expensify moves fast and competition is fierce), but the tooling makes me faster.

Total revenue from 50 npm packages: $0.

I'm not bitter about this. The goal was never to make money from the tools themselves. It was to:

  1. Build a public portfolio of shipped work
  2. Develop a system for rapid tool creation
  3. Learn what it takes to ship at scale
  4. Generate content (like this article) from the experience

By those metrics, the experiment is a success.


What I'd Do Differently

If I could restart this experiment with what I know now, here's what I'd change:

1. Focus on 5 Great Tools Instead of 50 Okay Ones

The assembly line metaphor is seductive but flawed. A CLI tool isn't a physical product — it's software that needs maintenance, bug fixes, and user support. I now have 50 repositories that will slowly rot unless I actively maintain them.

Five polished, well-documented tools with proper test suites would have been more valuable than 50 MVPs.

2. Invest in Discovery, Not Just Publishing

Publishing is the easy part. Getting anyone to know your tool exists is the hard part. I should have spent less time on tools 30-50 and more time writing detailed blog posts, creating demo GIFs, and posting on relevant subreddits for tools 1-10.

3. Add Tests From the Start

None of my 50 tools have test suites. This is a confession, not a recommendation. When you're moving fast, tests feel like friction. But now I have bugs I can't confidently fix because I have no regression safety net.

4. Use a Monorepo

Managing 50 separate repositories is a nightmare. A monorepo with a tool like Turborepo or Nx would have made publishing, testing, and dependency management dramatically simpler.

5. Build What People Ask For

My best-performing tools (the AI/token ones) succeeded because people are actively searching for solutions to those problems. My worst-performing tools (ASCII art generator, Pomodoro timer) failed because nobody is searching for another one of those.

Build for demand, not for your imagination.


Lessons for Anyone Trying This

The meta-lesson: systems beat willpower

I couldn't have built 50 tools through pure determination. The template system, the publishing scripts, the category-based approach — these removed the cognitive overhead and let me focus on the creative part (the actual tool logic).

If you want to ship more, build systems that make shipping frictionless.

Ship imperfect things

Every single one of these tools could be better. Some of them are embarrassingly minimal. But they're published. They're usable. And publishing teaches you things that perfecting never will.

The npm ecosystem is both a blessing and a curse

Publishing to npm is ridiculously easy. npm publish and you're live. But that ease means the registry is flooded with packages, and discoverability is essentially zero unless you market externally.

Solo maintenance doesn't scale

I'm one person with 50 packages. When someone opens an issue on tool #37, I have to context-switch from whatever I'm working on, remember how that tool works, and fix it. This is unsustainable. If I were doing this again, I'd explicitly mark most tools as "unmaintained — PRs welcome" from day one.

The portfolio effect is real

Despite the $0 revenue, my GitHub profile now shows consistent activity, a range of technical skills, and the ability to ship. Multiple people have reached out about freelance work after seeing the tools. The ROI isn't direct, but it's real.


The Numbers, Summarized

| Metric | Value |
| --- | --- |
| Total packages published | 50 |
| Time to publish all | ~48 hours |
| Average time per tool | ~45 minutes |
| External dependencies (avg) | 0.3 per tool |
| Total downloads (week 1) | 406 |
| Revenue | $0 |
| Tools I actually use daily | 4 |
| Tools with >10 downloads | 7 |
| Tools I'm proud of | ~10 |
| Tools I'd recommend to others | ~5 |

What's Next

I'm not publishing another 50. I'm going to pick the 5 best-performing tools and invest real effort into them — proper documentation, test suites, demo videos, and community building.

The assembly line was a great experiment. It taught me how to ship fast, how npm works at scale, and what the market actually wants. But the next phase is about depth, not breadth.

If you want to try something similar, start with 5 tools, not 50. Build the system, but use it to make fewer, better things.

And if you do publish 50 packages in a weekend, at least you'll have a good story to tell.


All 50 tools are available on my npm profile and GitHub. The ones I'd actually recommend installing: websnap-reader, ghbounty, token-counter-cli, repo-readme-gen, and env-validator.


Wilson Xu is a developer who builds CLI tools, hunts bounties, and occasionally publishes 50 npm packages in a weekend. You can find him on GitHub or Dev.to.
