Last week, I challenged myself to build and publish five CLI tools on npm in seven days. Not toy projects or hello-world wrappers -- real tools that solve real problems I kept running into as a developer. The constraint forced me to ship fast, cut scope ruthlessly, and focus on the one thing each tool needed to do well.
Here they are: a web scraper that outputs clean Markdown, a GitHub repo health checker, a dependency auditor, a GitHub profile analyzer, and a URL metadata extractor. Each one is open source, installable with a single npm install -g, and designed to play nicely with pipes, jq, and CI/CD workflows.
Let me walk you through all five.
1. websnap-reader -- Reader Mode for Your Terminal
The problem: I read a lot of technical articles, and I wanted a way to grab article content from URLs without the ads, nav bars, cookie banners, and JavaScript junk. I also wanted to pipe article text into other tools -- LLMs, search indexes, note-taking scripts -- and needed clean Markdown, not raw HTML.
Install:
npm install -g websnap-reader
Usage examples:
# Convert any webpage to clean Markdown
websnap https://paulgraham.com/greatwork.html
# Get a structured JSON object with title, word count, reading time
websnap https://arxiv.org/abs/2301.00001 --json
# AI-powered 3-sentence summary (supports OpenAI, Anthropic, Ollama)
websnap https://news.ycombinator.com/item?id=12345 --summary
# Batch process a list of URLs
websnap batch urls.txt --outdir ./articles
Key features:
- Outputs clean Markdown by stripping ads, navigation, and clutter
- Chrome CDP integration for JavaScript-heavy SPAs and login-required pages
- AI-powered summaries with support for OpenAI, Anthropic, and local Ollama models
- Structured JSON output with title, author, date, word count, and reading time
- Batch processing with configurable delays between requests
- Pipe-friendly -- works great with jq, glow, pbcopy, or any LLM CLI
The Chrome CDP feature was the hardest part. If you already have Chrome open with remote debugging enabled, websnap automatically detects it and can scrape pages that require JavaScript rendering or even authentication. It falls back to plain HTTP when Chrome isn't available.
2. @chengyixu/gitpulse -- Instant Health Check for Any GitHub Repo
The problem: Before depending on an open source library, you should check if it's actually maintained. But going to GitHub and manually checking commit frequency, PR response times, and contributor diversity is tedious. I wanted one command that answers: "Should I depend on this repo?"
Install:
npm install -g @chengyixu/gitpulse
Usage examples:
# Full health report with activity score, bus factor, response times
gitpulse facebook/react
# Compact one-liner for quick comparisons
gitpulse vercel/next.js --compact
# => next.js Active & Healthy Score: 91/100 Bus: 12 Stars: 128k
# Audit your entire stack in one loop
for repo in facebook/react vercel/next.js prisma/prisma; do
gitpulse $repo --compact
done
# JSON output for CI/CD health gates
SCORE=$(gitpulse some/repo --json | jq '.activity.activityScore')
if [ "$SCORE" -lt 20 ]; then
echo "WARNING: dependency repo has low activity"
fi
Key features:
- Activity score (0-100) based on commits, PRs, and issues from the last 30 days
- Bus factor analysis -- how many contributors produce 80% of commits
- Median first-response time and PR merge time
- Clear verdict system: Active & Healthy, Slowing Down, Declining, Abandoned, etc.
- Works without authentication (but supports GITHUB_TOKEN for higher rate limits)
- Accepts both owner/repo format and full GitHub URLs
The verdict system is what makes this tool useful. Instead of throwing raw numbers at you, it synthesizes everything into a one-line assessment. An "Active but Understaffed" verdict on a repo with 500 stars and a bus factor of 1 tells you a lot more than just seeing the commit count.
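The bus-factor metric described above -- the smallest number of contributors who together account for 80% of commits -- is simple to compute. A minimal sketch, assuming you already have a per-author commit count (gitpulse's real implementation may differ):

```javascript
// Bus factor: the fewest contributors whose combined commits reach 80%
// of the total. `commitsByAuthor` maps author login -> commit count.
function busFactor(commitsByAuthor, threshold = 0.8) {
  const counts = Object.values(commitsByAuthor).sort((a, b) => b - a);
  const total = counts.reduce((sum, n) => sum + n, 0);
  let covered = 0;
  for (let i = 0; i < counts.length; i++) {
    covered += counts[i];
    if (covered / total >= threshold) return i + 1;
  }
  return counts.length;
}

console.log(busFactor({ alice: 70, bob: 20, carol: 10 })); // 2
```

A result of 1 means a single person carries the project -- exactly the "Active but Understaffed" situation the verdict system flags.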
3. @chengyixu/depcheck-ai -- Smart Dependency Auditing
The problem: npm audit only catches known CVEs. npm outdated only shows version diffs. Neither tells you which outdated packages are actually risky, which are deprecated and need replacing, or what the safest upgrade path is. I wanted a single command that combines vulnerability scanning, outdated detection, and deprecation warnings with intelligent risk scoring.
Install:
npm install -g @chengyixu/depcheck-ai
Usage examples:
# Scan the current project
depcheck-ai
# JSON output for CI/CD pipelines
depcheck-ai --json
# Generate an HTML report
depcheck-ai --html report.html
# Only check production dependencies
depcheck-ai --prod
# GitHub Actions integration
# - name: Dependency audit
# run: npx @chengyixu/depcheck-ai --json > audit.json
Key features:
- Combines vulnerability scanning, outdated detection, and deprecation alerts in one tool
- Risk scoring (safe / low / medium / high / critical) that weighs multiple factors
- Smart upgrade suggestions -- distinguishes safe patch updates from major version bumps needing review
- HTML report generation for sharing with your team
- CI/CD exit codes: 0 = healthy, 1 = warning, 2 = critical vulnerabilities
- Parallel scanning -- checks 8 packages concurrently for speed
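The parallel scanning mentioned above can be sketched as a small worker pool. This is one common way to cap concurrency in Node without extra dependencies; it is not necessarily how depcheck-ai implements it:

```javascript
// Run an async function over items with at most `limit` in flight.
// Workers pull the next index synchronously, so no index is processed twice.
async function mapLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}

// e.g. mapLimit(packageNames, 8, checkPackage) for 8 concurrent registry lookups
mapLimit([1, 2, 3, 4], 2, async (n) => n * n).then((r) => console.log(r)); // [ 1, 4, 9, 16 ]
```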
The risk scoring is what sets this apart. A package being three minor versions behind is not the same risk as a package with a known prototype pollution CVE. depcheck-ai weighs vulnerability severity, how outdated the package is, whether it's deprecated, and whether it's a direct or transitive dependency.
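A multi-factor score like that might be combined as follows. The weights, thresholds, and the findings shape here are all illustrative assumptions, not depcheck-ai's actual tuning:

```javascript
// Combine the factors described above into one of the five risk levels.
// All weights and cutoffs are invented for illustration.
function riskLevel({ cveSeverity = "none", majorsBehind = 0, deprecated = false, direct = true }) {
  const sevScore = { none: 0, low: 2, moderate: 4, high: 7, critical: 10 }[cveSeverity];
  let score = sevScore + Math.min(majorsBehind * 2, 6) + (deprecated ? 3 : 0);
  if (!direct) score = Math.round(score * 0.7); // transitive deps weigh less
  if (score >= 10) return "critical";
  if (score >= 7) return "high";
  if (score >= 4) return "medium";
  if (score >= 1) return "low";
  return "safe";
}

console.log(riskLevel({ cveSeverity: "high" }));                    // high
console.log(riskLevel({ majorsBehind: 1, direct: false }));         // low
console.log(riskLevel({ cveSeverity: "critical", deprecated: true })); // critical
```

The key idea is that no single factor decides the level -- a slightly outdated transitive dependency scores low even though it would trip a naive "outdated = bad" check.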
4. ghstats-cli -- GitHub Profile Analyzer and Comparison Tool
The problem: I wanted a quick way to see a GitHub user's stats -- total stars, language breakdown, activity patterns -- without opening a browser. I also wanted to compare profiles side by side, for team assessments or just to satisfy curiosity.
Install:
npm install -g ghstats-cli
Usage examples:
# Full profile dashboard with repos, languages, activity
ghstats chengyixu
# Inspect a specific repository
ghstats facebook/react
# Side-by-side comparison of multiple profiles
ghstats compare chengyixu torvalds sindresorhus
# JSON output for scripting
ghstats chengyixu --json > profile.json
ghstats repo facebook/react --json | jq '.stars'
Key features:
- Beautiful CLI dashboard with profile stats, top repos, and language breakdown
- Visual bar chart of language distribution across all repositories
- Repository inspector with stars, forks, watchers, license, and topics
- Side-by-side profile comparison in a single table
- Recent activity feed covering pushes, PRs, issues, and stars from the last 90 days
- Auto-detection -- pass a username, owner/repo, or full GitHub URL
The comparison feature is surprisingly useful. Running ghstats compare with two candidates' GitHub usernames gives you an instant visual comparison of their open source activity, language preferences, and engagement levels. Great for portfolio reviews.
5. urlmeta-cli -- URL Metadata Extractor and SEO Scorer
The problem: When building features that show link previews, I kept needing to check what Open Graph tags, Twitter Cards, and meta descriptions a page had. I also wanted a quick SEO sanity check. Instead of opening Chrome DevTools and digging through the <head> tag every time, I built a CLI for it.
Install:
npm install -g urlmeta-cli
Usage examples:
# Full metadata report -- basic, OG, Twitter Card, SEO score
urlmeta https://github.com
# JSON output for programmatic use
urlmeta https://github.com --json
# Batch analyze multiple URLs with summary table
urlmeta https://github.com https://npmjs.com https://dev.to --summary
# Quick SEO comparison of competitor pages
urlmeta https://mysite.com https://competitor.com --summary
Key features:
- Extracts basic metadata, Open Graph, Twitter Card, Schema.org JSON-LD, and technical headers
- SEO scoring (0-100) with letter grade and actionable recommendations
- Checks title length, description length, OG completeness, heading structure, viewport, favicon, and more
- Batch mode for analyzing multiple URLs with a comparison summary table
- Response time measurement for each URL
- Works great in CI/CD for automated SEO regression testing
The SEO scoring checks things like: is your title between 30-60 characters? Do you have og:image set? Is there exactly one H1? Do you have a canonical URL? It lists specific issues with recommendations, which makes it genuinely useful rather than just a number.
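Those checks are individually trivial; the value is bundling them with readable messages. A sketch of the rule style, with illustrative rules and wording rather than urlmeta-cli's actual implementation:

```javascript
// Run the kinds of checks listed above against already-extracted metadata.
// Rules and messages are illustrative, not urlmeta-cli's real rule set.
function seoIssues({ title = "", ogImage = false, h1Count = 0, canonical = false }) {
  const issues = [];
  if (title.length < 30 || title.length > 60)
    issues.push(`title is ${title.length} chars (aim for 30-60)`);
  if (!ogImage) issues.push("missing og:image");
  if (h1Count !== 1) issues.push(`found ${h1Count} H1 tags (want exactly 1)`);
  if (!canonical) issues.push("missing canonical URL");
  return issues;
}

console.log(seoIssues({})); // flags all four problems
console.log(seoIssues({
  title: "A well-sized page title for search results",
  ogImage: true,
  h1Count: 1,
  canonical: true,
})); // []
```

Each issue string doubles as the recommendation, which is what makes the report actionable rather than just a score.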
Common Patterns Across All 5 Tools
Building five CLI tools in quick succession forced me to converge on a shared toolkit and architecture. Here are the patterns I used in every one:
The Core Stack
- commander for argument parsing. It handles subcommands, options with defaults, help text generation, and version flags. Every tool starts with a program.option() chain.
- chalk for colored terminal output. Color makes a huge difference in readability -- green for healthy, red for critical, yellow for warnings.
- cli-table3 for formatted tables. When you have structured data (package versions, repo stats, URL metadata), tables make it scannable.
Design Principles
Always support --json. Every tool outputs structured JSON when asked. This makes them composable -- you can pipe to jq, feed into other scripts, or use in CI/CD pipelines.
Zero config by default. Every tool works with zero setup. Authentication tokens are optional (for higher rate limits), not required.
Meaningful exit codes. Exit code 0 means success/healthy. Non-zero means something needs attention. This makes CI/CD integration trivial.
Pipe-friendly. All tools write structured output to stdout and status/progress to stderr. This means tool --json | jq '.field' always works correctly.
npx-friendly. Every tool works with npx so people can try it without installing globally.
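The exit-code and stdout/stderr conventions fit in a few lines. A minimal sketch -- the findings shape is invented here, not shared by the five tools:

```javascript
// Exit-code convention: 0 healthy, 1 needs attention, 2 critical.
// The `findings` shape is illustrative.
function exitCodeFor(findings) {
  if (findings.some((f) => f.level === "critical")) return 2;
  if (findings.length > 0) return 1;
  return 0;
}

const findings = [];
// Progress chatter goes to stderr, structured output to stdout,
// so `tool --json | jq '.field'` sees only the JSON.
process.stderr.write("scan complete\n");
process.stdout.write(JSON.stringify({ findings }) + "\n");
process.exitCode = exitCodeFor(findings);
```

Using process.exitCode instead of process.exit() lets stdout flush fully, which matters when the JSON output is large and piped.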
Lessons Learned Publishing to npm
1. Naming is harder than coding
My first choice of package name was taken for three out of five tools. npm's namespace is crowded. Scoped packages (@username/package) give you a guaranteed namespace but are less discoverable. Unscoped names are prime real estate.
2. Your README is your landing page
On npm, the README is the product page. I spent almost as much time on each README as on the code itself. A good demo section with realistic output is worth more than pages of API documentation.
3. Publish early, iterate fast
I published v1.0.0 of each tool as soon as the core feature worked. Perfectionism is the enemy of shipping. Users give you better feedback than your imagination does.
4. Test the npx experience
Many people will first try your tool with npx. Make sure the first-run experience is fast and doesn't dump confusing output. The bin field in package.json needs to be correct, and your entry point needs the #!/usr/bin/env node shebang.
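For reference, the wiring is small. A minimal package.json sketch with placeholder names -- `mytool` and the file path are hypothetical, not any of the five tools' actual layouts:

```json
{
  "name": "mytool-cli",
  "version": "1.0.0",
  "bin": { "mytool": "./bin/cli.js" },
  "engines": { "node": ">=18" }
}
```

The file ./bin/cli.js must start with the #!/usr/bin/env node shebang and be committed with executable permissions, or npx will fail with a cryptic error on some platforms.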
5. Exit codes matter for adoption
The difference between a toy CLI and a production CLI is often just proper exit codes. Tools that return meaningful exit codes get used in CI/CD pipelines, which drives sustained usage.
Try Them Out
All five tools are open source under MIT and published on npm:
| Tool | Install | GitHub |
|---|---|---|
| websnap-reader | npm i -g websnap-reader | chengyixu/websnap |
| @chengyixu/gitpulse | npm i -g @chengyixu/gitpulse | chengyixu/gitpulse |
| @chengyixu/depcheck-ai | npm i -g @chengyixu/depcheck-ai | chengyixu/depcheck-ai |
| ghstats-cli | npm i -g ghstats-cli | chengyixu/ghstats-cli |
| urlmeta-cli | npm i -g urlmeta-cli | chengyixu/urlmeta-cli |
If any of these solve a problem for you, I'd love a star on GitHub. If you find a bug or have a feature request, open an issue -- I'm actively maintaining all five.
What CLI tools have you built recently? Drop a comment -- I'm always curious what problems other developers are solving from the terminal.