<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: phantomDev</title>
    <description>The latest articles on DEV Community by phantomDev (@varraphael).</description>
    <link>https://dev.to/varraphael</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3783312%2Fa3852d88-4ac4-4b0e-ab4e-25d77a2ded17.jpg</url>
      <title>DEV Community: phantomDev</title>
      <link>https://dev.to/varraphael</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/varraphael"/>
    <language>en</language>
    <item>
      <title>I Built a Web Crawler That Scraped Cloudflare Past Their Own Protection</title>
      <dc:creator>phantomDev</dc:creator>
      <pubDate>Sun, 15 Mar 2026 00:07:08 +0000</pubDate>
      <link>https://dev.to/varraphael/i-built-a-web-crawler-that-scraped-cloudflare-past-their-own-protection-3ld8</link>
      <guid>https://dev.to/varraphael/i-built-a-web-crawler-that-scraped-cloudflare-past-their-own-protection-3ld8</guid>
      <description>&lt;p&gt;Last week I pointed my crawler at &lt;code&gt;cloudflare.com&lt;/code&gt;, set the depth to 1, and walked away.&lt;/p&gt;

&lt;p&gt;When I came back, this is what I saw:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;✓ (layer1) https://cloudflare.com
✓ (layer1) https://cloudflare.com/en-gb
✓ (layer1) https://cloudflare.com/de-de
✓ (layer1) https://cloudflare.com/products
✓ (layer1) https://cloudflare.com/developer-platform
...

Total crawled : 100+
Failed        : 0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here is the proof:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzhdpqgm4n2dn1q7epzu.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzhdpqgm4n2dn1q7epzu.jpg" alt="CloudFlare Scraped"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A website protected by Cloudflare, scraped past Cloudflare. 100+ pages. Zero blocks. All at Layer 1, meaning not even a headless browser was involved.&lt;/p&gt;

&lt;p&gt;That's PhantomCrawl. And this is how it works.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/var-raphael/PhantomCrawl/tree/main/scraped_data_real_samples" rel="noopener noreferrer"&gt;The actual scraped data is in the repo if you want to see it.&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Scrapers Keep Breaking
&lt;/h2&gt;

&lt;p&gt;Most developers write a Python script with &lt;code&gt;requests&lt;/code&gt;, run it against a real site, and get a Cloudflare challenge page or a 403 back. So they switch to Puppeteer or Playwright. That works for a while, then sites start detecting headless Chrome too.&lt;/p&gt;

&lt;p&gt;The part nobody talks about is TLS fingerprinting.&lt;/p&gt;

&lt;p&gt;When your Go or Python HTTP client connects to a server it sends a TLS handshake. That handshake has a unique fingerprint - specific cipher suites, extensions, and ordering that identify the library you're using. A Go &lt;code&gt;net/http&lt;/code&gt; client looks nothing like Chrome at the TLS layer. Cloudflare checks this fingerprint before it even looks at your User-Agent or cookies.&lt;/p&gt;

&lt;p&gt;Changing your User-Agent header does nothing if your TLS fingerprint is screaming "I am a Python script."&lt;/p&gt;

&lt;p&gt;PhantomCrawl fixes this at the transport level using &lt;strong&gt;utls HelloChrome_120&lt;/strong&gt; - the exact same TLS fingerprint as a real Chrome 120 browser. To Cloudflare's infrastructure, every request looks like it came from a real person on Windows Chrome. Because at the cryptographic handshake level, it genuinely does.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Problem With Existing Tools
&lt;/h2&gt;

&lt;p&gt;Before I built PhantomCrawl I looked at everything that existed. Here's the honest picture.&lt;/p&gt;

&lt;h3&gt;
  
  
  Feature Comparison
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;Self-Hosted&lt;/th&gt;
&lt;th&gt;Anti-Bot&lt;/th&gt;
&lt;th&gt;TLS Fingerprint&lt;/th&gt;
&lt;th&gt;AI Cleaning&lt;/th&gt;
&lt;th&gt;Binary&lt;/th&gt;
&lt;th&gt;Free&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;PhantomCrawl&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅ HelloChrome_120&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Firecrawl&lt;/td&gt;
&lt;td&gt;❌ API only&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;500 pages/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Scrapy&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Apify&lt;/td&gt;
&lt;td&gt;❌ Cloud only&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;Limited&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;BeautifulSoup&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Puppeteer&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;⚠️ Detectable&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ScrapingBee&lt;/td&gt;
&lt;td&gt;❌ API only&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;1,000 req/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  What It Actually Costs
&lt;/h3&gt;

&lt;p&gt;This is where things get uncomfortable for the existing tools.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;Free Tier&lt;/th&gt;
&lt;th&gt;Paid Entry&lt;/th&gt;
&lt;th&gt;100K pages/mo&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;PhantomCrawl&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Unlimited&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;$0&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;$0&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Firecrawl&lt;/td&gt;
&lt;td&gt;500 pages&lt;/td&gt;
&lt;td&gt;$16/mo&lt;/td&gt;
&lt;td&gt;$83/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Apify&lt;/td&gt;
&lt;td&gt;~$5 credit&lt;/td&gt;
&lt;td&gt;$29/mo&lt;/td&gt;
&lt;td&gt;~$123/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ScrapingBee&lt;/td&gt;
&lt;td&gt;1,000 req&lt;/td&gt;
&lt;td&gt;$49/mo&lt;/td&gt;
&lt;td&gt;$249+/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Bright Data&lt;/td&gt;
&lt;td&gt;100 records&lt;/td&gt;
&lt;td&gt;$500+/mo&lt;/td&gt;
&lt;td&gt;$1,000+/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Browserless&lt;/td&gt;
&lt;td&gt;6hr/mo&lt;/td&gt;
&lt;td&gt;$29/mo&lt;/td&gt;
&lt;td&gt;Pay per hour&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;PhantomCrawl is $0 because it runs on your machine. You are not paying for someone else's servers. The only optional cost is Groq for AI cleaning - which gives you 100,000 tokens free per day, enough for hundreds of pages at zero cost.&lt;/p&gt;

&lt;p&gt;Scale to 1 million pages a month. Still $0. You just need a machine and an internet connection.&lt;/p&gt;




&lt;h2&gt;
  
  
  How PhantomCrawl Works
&lt;/h2&gt;

&lt;p&gt;PhantomCrawl uses a 4-layer escalation engine. Every URL starts at Layer 1 and only moves up if needed. Most sites never leave Layer 1.&lt;/p&gt;

&lt;h3&gt;
  
  
  Layer 1 - Direct HTTP + TLS Fingerprinting
&lt;/h3&gt;

&lt;p&gt;The fastest method. A direct HTTP request using &lt;strong&gt;utls HelloChrome_120&lt;/strong&gt; to disguise the TLS handshake as real Chrome. Includes human-like headers (&lt;code&gt;Sec-Fetch-*&lt;/code&gt;, &lt;code&gt;Sec-Ch-Ua-*&lt;/code&gt;), jittered timing, and user agent rotation.&lt;/p&gt;

&lt;p&gt;Covers roughly 90% of the web. SSR sites, static pages, Next.js, WordPress, and yes - Cloudflare-protected sites.&lt;/p&gt;

&lt;h3&gt;
  
  
  Layer 2 - Network Hijacking
&lt;/h3&gt;

&lt;p&gt;If Layer 1 gets HTML but the content is not useful, Layer 2 inspects the raw HTML for embedded data. It scans for &lt;code&gt;window.__NEXT_DATA__&lt;/code&gt;, &lt;code&gt;window.__INITIAL_STATE__&lt;/code&gt;, &lt;code&gt;window.__NUXT__&lt;/code&gt;, JSON-LD structured data, and API endpoint patterns.&lt;/p&gt;

&lt;p&gt;Many modern SPAs ship their data pre-embedded in the HTML before JavaScript even runs. Layer 2 extracts it directly without a browser.&lt;/p&gt;
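
&lt;p&gt;The extraction idea fits in a few lines. This is a toy sketch (not PhantomCrawl's Go implementation): find the marker, then capture the balanced JSON object that follows it:&lt;/p&gt;

```javascript
// Toy sketch: locate a marker such as __NEXT_DATA__ in raw HTML, then
// slice out the balanced JSON object that follows it. Note this naive
// brace counter ignores braces inside JSON strings; a real extractor
// has to handle those.
function extractEmbedded(html, marker) {
  const at = html.indexOf(marker);
  if (at === -1) return null;
  const start = html.indexOf('{', at);
  if (start === -1) return null;
  let depth = 0;
  for (let i = start; i !== html.length; i += 1) {
    if (html[i] === '{') depth += 1;
    if (html[i] === '}') depth -= 1;
    if (depth === 0) return JSON.parse(html.slice(start, i + 1));
  }
  return null; // unbalanced markup
}
```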

&lt;h3&gt;
  
  
  Layer 2.5 - XHR/Fetch Interception
&lt;/h3&gt;

&lt;p&gt;This is what makes PhantomCrawl different from anything else out there. Instead of scraping the rendered HTML from a headless browser, Layer 2.5 intercepts the actual API responses the browser receives during page load - the raw JSON from XHR and fetch calls.&lt;/p&gt;

&lt;p&gt;The result is clean structured data with zero boilerplate. No parsing noise. No nav menus or footers. Just the data the page was going to display anyway, captured before it becomes HTML.&lt;/p&gt;

&lt;h3&gt;
  
  
  Layer 3 - Full Headless Browser
&lt;/h3&gt;

&lt;p&gt;Last resort. Launches a real browser - go-rod if Chrome is installed locally, or Browserless via API - and fully renders the page with JavaScript execution. Handles the most complex SPAs.&lt;/p&gt;

&lt;p&gt;You never configure which layer to use. PhantomCrawl decides based on what each site actually returns.&lt;/p&gt;
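
&lt;p&gt;The escalation loop itself is simple. A hypothetical sketch with stub layers in place of the real ones (names are illustrative, not PhantomCrawl's API):&lt;/p&gt;

```javascript
// Each layer returns null when its result is unusable, and the
// crawler only then pays for the next, more expensive layer.
function crawlOnce(url, layers) {
  for (const layer of layers) {
    const data = layer.attempt(url);
    if (data !== null) {
      return { via: layer.name, data: data };
    }
  }
  return { via: 'failed', data: null };
}

// Stubs standing in for direct HTTP, embedded-data and browser layers.
const layers = [
  { name: 'layer1-http', attempt: function (u) { return u.includes('/static') ? 'html' : null; } },
  { name: 'layer2-embedded', attempt: function (u) { return u.includes('/spa') ? 'json' : null; } },
  { name: 'layer3-browser', attempt: function (u) { return 'rendered'; } },
];
console.log(crawlOnce('https://example.com/static/page', layers).via); // layer1-http
```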




&lt;h2&gt;
  
  
  Every Feature
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;4-layer escalation engine with automatic fallback&lt;/li&gt;
&lt;li&gt;TLS fingerprinting via utls HelloChrome_120&lt;/li&gt;
&lt;li&gt;AI content cleaning via Groq or OpenAI with chunked processing&lt;/li&gt;
&lt;li&gt;Rate limit retry with automatic backoff and resume&lt;/li&gt;
&lt;li&gt;Proxy rotation tunneled at TCP level through the TLS transport&lt;/li&gt;
&lt;li&gt;Depth crawling with per-parent link limits&lt;/li&gt;
&lt;li&gt;Fragment URL deduplication (&lt;code&gt;/#about&lt;/code&gt; and &lt;code&gt;/&lt;/code&gt; are the same page)&lt;/li&gt;
&lt;li&gt;Asset filtering - PDFs, images, and zips skipped during depth crawling&lt;/li&gt;
&lt;li&gt;SQLite state tracking with full resume if interrupted&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;.env&lt;/code&gt; key management with &lt;code&gt;$VAR_NAME&lt;/code&gt; references&lt;/li&gt;
&lt;li&gt;Config generator UI - no terminal needed to configure&lt;/li&gt;
&lt;li&gt;Cross-platform binaries for Linux, Mac (Intel + Apple Silicon), Windows, ARM, and Termux&lt;/li&gt;
&lt;li&gt;Structured JSON output - &lt;code&gt;raw.json&lt;/code&gt; and &lt;code&gt;cleaned.json&lt;/code&gt; per page&lt;/li&gt;
&lt;li&gt;Absolute URL resolution on all extracted links&lt;/li&gt;
&lt;li&gt;Single binary under 20MB, no runtime required&lt;/li&gt;
&lt;/ul&gt;
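
&lt;p&gt;As one example from that list, fragment deduplication comes down to normalizing URLs before comparing them. A minimal sketch (not the actual implementation): the fragment never reaches the server, so &lt;code&gt;/#about&lt;/code&gt; and &lt;code&gt;/&lt;/code&gt; are one document and should be crawled once.&lt;/p&gt;

```javascript
// Drop the fragment before comparing; Node's URL class normalizes.
function canonical(raw) {
  const u = new URL(raw);
  u.hash = '';
  return u.toString();
}

const seen = new Set();
const links = ['https://example.com/', 'https://example.com/#about', 'https://example.com/#team'];
for (const link of links) {
  seen.add(canonical(link));
}
console.log(seen.size); // 1 (three links, one page)
```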




&lt;h2&gt;
  
  
  The Sleep and Scrape Workflow 😴
&lt;/h2&gt;

&lt;p&gt;Here is something nobody talks about with web scraping at scale: it takes time. Sites throttle you, rate limits kick in, AI cleaning queues up. Trying to babysit all of this in real time is exhausting and pointless.&lt;/p&gt;

&lt;p&gt;So don't. This is literally the workflow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# 1. Put your URLs in urls.txt&lt;/span&gt;
&lt;span class="c"&gt;# 2. Run it&lt;/span&gt;
phantomcrawl start
&lt;span class="c"&gt;# 3. Go to sleep&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Wake up to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Total crawled : 847
Failed        : 0
AI cleaned    : 847
Clean pending : 0
Output        : ~/phantomcrawl/scraped
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;PhantomCrawl batches requests with randomized delays so your timing is never predictable. It retries failures with exponential backoff. If the AI token quota resets overnight, it resumes exactly where it left off. Nothing gets crawled twice.&lt;/p&gt;
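
&lt;p&gt;The retry timing follows the classic jittered exponential backoff pattern. A minimal sketch of the idea (the actual base delays and caps here are assumptions):&lt;/p&gt;

```javascript
// The delay doubles with each failed attempt, is capped, and carries
// random jitter so the retry timing is never predictable from outside.
function backoffMs(attempt, baseMs, capMs) {
  const exponential = baseMs * Math.pow(2, attempt);
  const capped = Math.min(exponential, capMs);
  return capped + Math.random() * baseMs; // up to one extra base step
}

for (let attempt = 0; attempt !== 5; attempt += 1) {
  console.log('retry', attempt, 'waits ~' + Math.round(backoffMs(attempt, 500, 30000)) + 'ms');
}
```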

&lt;p&gt;Put your URLs in. Go to sleep. Wake up to a folder full of clean JSON. ☕&lt;/p&gt;




&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;Download the binary for your platform from &lt;a href="https://github.com/var-raphael/PhantomCrawl/releases" rel="noopener noreferrer"&gt;GitHub Releases&lt;/a&gt;:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Platform&lt;/th&gt;
&lt;th&gt;Binary&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Linux 64-bit&lt;/td&gt;
&lt;td&gt;&lt;code&gt;phantomcrawl-linux-amd64&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Linux ARM / Termux&lt;/td&gt;
&lt;td&gt;&lt;code&gt;phantomcrawl-linux-arm64&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;macOS Apple Silicon&lt;/td&gt;
&lt;td&gt;&lt;code&gt;phantomcrawl-darwin-arm64&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;macOS Intel&lt;/td&gt;
&lt;td&gt;&lt;code&gt;phantomcrawl-darwin-amd64&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Windows&lt;/td&gt;
&lt;td&gt;&lt;code&gt;phantomcrawl-windows-amd64.exe&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Linux/Mac&lt;/span&gt;
&lt;span class="nb"&gt;chmod&lt;/span&gt; +x phantomcrawl-linux-amd64
&lt;span class="nb"&gt;sudo mv &lt;/span&gt;phantomcrawl-linux-amd64 /usr/local/bin/phantomcrawl

&lt;span class="c"&gt;# Three commands to your first crawl&lt;/span&gt;
phantomcrawl init
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"https://example.com"&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; urls.txt
phantomcrawl start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Want AI cleaning? Get a free Groq key at &lt;a href="https://console.groq.com" rel="noopener noreferrer"&gt;console.groq.com&lt;/a&gt;, add it to &lt;code&gt;.env&lt;/code&gt;, and set &lt;code&gt;"ai": { "enabled": true }&lt;/code&gt; in &lt;code&gt;crawl.json&lt;/code&gt;. That's it.&lt;/p&gt;
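
&lt;p&gt;For orientation, a minimal &lt;code&gt;crawl.json&lt;/code&gt; fragment might look like this. Only &lt;code&gt;ai.enabled&lt;/code&gt; and the &lt;code&gt;$VAR_NAME&lt;/code&gt; reference style appear in this post; the &lt;code&gt;api_key&lt;/code&gt; field name is an assumption, so check the docs for the real schema:&lt;/p&gt;

```json
{
  "ai": {
    "enabled": true,
    "api_key": "$GROQ_API_KEY"
  }
}
```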

&lt;p&gt;Full docs at &lt;a href="https://phantomcrawl.vercel.app" rel="noopener noreferrer"&gt;phantomcrawl.vercel.app&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Why It's Under 20MB
&lt;/h2&gt;

&lt;p&gt;PhantomCrawl is written in Go and compiled to a single static binary. No runtime, no package manager, no &lt;code&gt;node_modules&lt;/code&gt; folder that somehow weighs 300MB. Everything is included - the crawler, the AI pipeline, the API server, the SQLite driver, and the config system.&lt;/p&gt;

&lt;p&gt;The binary is about 14MB. A fresh Next.js project's &lt;code&gt;node_modules&lt;/code&gt; is 50MB before you've written a line of code. Go was the right choice for this.&lt;/p&gt;

&lt;p&gt;Cross-compiling from a phone running Termux on Android to Linux amd64, macOS arm64, and Windows amd64 is one command per platform. Try doing that with Python.&lt;/p&gt;




&lt;h2&gt;
  
  
  One More Thing
&lt;/h2&gt;

&lt;p&gt;I'm Raphael, 18, from Lagos, Nigeria. I started coding at 12 on a phone with 1GB of RAM. No laptop, no bootcamp, no one teaching me. PhantomCrawl is my 7th shipped product.&lt;/p&gt;

&lt;p&gt;I built it because I needed it, and nothing else did the job without either costing money or breaking on any site worth scraping.&lt;/p&gt;

&lt;p&gt;If it helps you, a star on the repo means a lot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/var-raphael/PhantomCrawl" rel="noopener noreferrer"&gt;github.com/var-raphael/PhantomCrawl&lt;/a&gt;&lt;/p&gt;

</description>
      <category>go</category>
      <category>webdev</category>
      <category>opensource</category>
      <category>programming</category>
    </item>
    <item>
      <title>I Built a Web Analytics Tool That Needs Zero Signup. One Script Tag and You're Live in 30 Seconds</title>
      <dc:creator>phantomDev</dc:creator>
      <pubDate>Tue, 03 Mar 2026 11:04:33 +0000</pubDate>
      <link>https://dev.to/varraphael/i-built-a-web-analytics-tool-that-needs-zero-signup-one-script-tag-and-youre-live-in-30-seconds-4phd</link>
      <guid>https://dev.to/varraphael/i-built-a-web-analytics-tool-that-needs-zero-signup-one-script-tag-and-youre-live-in-30-seconds-4phd</guid>
      <description>&lt;p&gt;I was building a side project last year and needed to track visitors.&lt;/p&gt;

&lt;p&gt;Google Analytics felt like overkill. 15 minutes of setup, a privacy policy update, cookie consent banners, GDPR configuration, and I still wasn't sure if my data was being used to target ads somewhere. For a side project. That maybe 50 people would visit.&lt;/p&gt;

&lt;p&gt;Plausible was clean but $9/month for something I wasn't even monetizing yet felt wrong. Fathom was $14/month. Same problem.&lt;/p&gt;

&lt;p&gt;So I did what developers do when nothing fits.&lt;/p&gt;

&lt;p&gt;I built my own.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3mwuc13h7cvwz4ghdug.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3mwuc13h7cvwz4ghdug.jpg" alt="Phantomtrack dashboard" width="720" height="1600"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3oawc8r7ksm96e672h8.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3oawc8r7ksm96e672h8.jpg" alt="Phantomtrack dashboard" width="720" height="1600"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;PhantomTrack&lt;/strong&gt; is a privacy-first web analytics tool with no signup, no cookies, and a 30-second setup.&lt;/p&gt;

&lt;p&gt;You generate a tracking ID instantly (no account required, no email), drop one script tag into your HTML, and your dashboard is live. That's it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;script 
  &lt;/span&gt;&lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"https://phantomtrack-cdn.vercel.app/phantom.js"&lt;/span&gt; 
  &lt;span class="na"&gt;data-track-id=&lt;/span&gt;&lt;span class="s"&gt;"YOUR_TRACK_ID"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;One line. Done.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Features That Actually Matter
&lt;/h2&gt;

&lt;h3&gt;
  
  
  No Signup Required
&lt;/h3&gt;

&lt;p&gt;This is the one I'm most proud of. Every analytics tool makes you create an account before you can track a single pageview. PhantomTrack generates your tracking ID instantly. No account, no email, no friction. You're collecting data before most tools have even sent you a verification email.&lt;/p&gt;

&lt;h3&gt;
  
  
  No Cookies. Zero.
&lt;/h3&gt;

&lt;p&gt;PhantomTrack doesn't use cookies at all. No cookie consent banners. No GDPR headaches. No annoying popups destroying your UI because a lawyer said so. We only track what matters: pageviews and traffic sources, without storing personal data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Works on SPAs (React, Next.js, Vue)
&lt;/h3&gt;

&lt;p&gt;This one took the most work to get right. Standard analytics scripts track page loads. SPAs don't reload the page on navigation. They swap content dynamically. Most lightweight analytics tools miss this entirely.&lt;/p&gt;

&lt;p&gt;PhantomTrack patches &lt;code&gt;history.pushState&lt;/code&gt; under the hood so every route change in your React or Next.js app gets tracked automatically. No extra configuration needed.&lt;/p&gt;
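
&lt;p&gt;The patch is small enough to sketch. This is the idea, not PhantomTrack's actual code; &lt;code&gt;hist&lt;/code&gt; stands in for &lt;code&gt;window.history&lt;/code&gt; and &lt;code&gt;report&lt;/code&gt; for the tracking call, so it runs outside a browser:&lt;/p&gt;

```javascript
// SPAs fire no load event on route changes, so the wrapper reports
// each client-side navigation itself.
function patchPushState(hist, report) {
  const original = hist.pushState;
  hist.pushState = function (state, title, url) {
    original.call(hist, state, title, url);
    report(url); // every route change now counts as a pageview
  };
}

// Fake history object so the sketch runs outside a browser.
const tracked = [];
const fakeHistory = { pushState: function () {} };
patchPushState(fakeHistory, function (url) { tracked.push(url); });
fakeHistory.pushState({}, '', '/dashboard');
console.log(tracked); // [ '/dashboard' ]
```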

&lt;h3&gt;
  
  
  Everything on One Dashboard
&lt;/h3&gt;

&lt;p&gt;Visitors, unique visitors, pageviews, session duration, geographic data with country tier analysis, device and browser breakdown, traffic sources, real-time trends. All on one screen. No tab switching. No digging through menus.&lt;/p&gt;

&lt;h3&gt;
  
  
  AI Weekly Review
&lt;/h3&gt;

&lt;p&gt;Every week PhantomTrack generates an AI-powered analysis of your traffic patterns: what's growing, what dropped, where your best traffic comes from. Useful when you're too busy shipping to manually analyze charts.&lt;/p&gt;

&lt;h3&gt;
  
  
  Country Tier Analysis
&lt;/h3&gt;

&lt;p&gt;This one is specific to developers building for global audiences. PhantomTrack classifies your visitors by country tier (Tier 1, 2, 3) so you can understand not just where your traffic comes from but what that traffic is worth commercially.&lt;/p&gt;

&lt;h3&gt;
  
  
  7+ Export Formats
&lt;/h3&gt;

&lt;p&gt;CSV, JSON, XML, Excel, PHP array, HTML, plain text. Your data, your way. No lock-in.&lt;/p&gt;

&lt;h3&gt;
  
  
  Three Ways to View Your Data
&lt;/h3&gt;

&lt;p&gt;Web dashboard, iframe embed (drop your analytics directly into your own site or app), or JSON API (pull data programmatically and build whatever you want on top of it).&lt;/p&gt;




&lt;h2&gt;
  
  
  Pricing (And a Lifetime Deal)
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Free:&lt;/strong&gt; 10,000 requests/month. All features included.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pro:&lt;/strong&gt; $3/month. 30,000 requests/month.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Premium:&lt;/strong&gt; $5/month. 60,000 requests/month.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enterprise:&lt;/strong&gt; $8/month. 100,000 requests/month.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lifetime:&lt;/strong&gt; $20 one-time. 300,000 requests/month, forever.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The lifetime deal is a launch special. It goes up to $50 after the launch period. If you're building side projects and want analytics without a recurring subscription, this is the one.&lt;/p&gt;




&lt;h2&gt;
  
  
  How It Compares
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;PhantomTrack&lt;/th&gt;
&lt;th&gt;Google Analytics&lt;/th&gt;
&lt;th&gt;Plausible&lt;/th&gt;
&lt;th&gt;Fathom&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;No signup required&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Setup time&lt;/td&gt;
&lt;td&gt;30 seconds&lt;/td&gt;
&lt;td&gt;15-30 minutes&lt;/td&gt;
&lt;td&gt;5-10 minutes&lt;/td&gt;
&lt;td&gt;5-10 minutes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;No cookies&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;SPA support&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AI weekly insights&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Country tier analysis&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Iframe embed&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Export formats&lt;/td&gt;
&lt;td&gt;7+&lt;/td&gt;
&lt;td&gt;Limited&lt;/td&gt;
&lt;td&gt;CSV only&lt;/td&gt;
&lt;td&gt;CSV only&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Lifetime deal&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Starting price&lt;/td&gt;
&lt;td&gt;Free&lt;/td&gt;
&lt;td&gt;Free*&lt;/td&gt;
&lt;td&gt;$9/month&lt;/td&gt;
&lt;td&gt;$14/month&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;*Google Analytics is free but uses your data for ad targeting.&lt;/p&gt;




&lt;h2&gt;
  
  
  Try It Live
&lt;/h2&gt;

&lt;p&gt;The docs site itself runs PhantomTrack. You can see the actual live dashboard tracking real visitor data right now at &lt;a href="https://phantomtrack-docs.vercel.app" rel="noopener noreferrer"&gt;phantomtrack-docs.vercel.app&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;That's not a demo with fake numbers. That's real traffic, tracked with the same script tag you'd use on your own site.&lt;/p&gt;




&lt;h2&gt;
  
  
  Get Started
&lt;/h2&gt;

&lt;p&gt;👉 &lt;a href="https://phantomtrack-docs.vercel.app" rel="noopener noreferrer"&gt;phantomtrack-docs.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Generate your tracking ID, drop the script tag, and you'll have your first pageview tracked in under a minute. No account. No credit card. No setup friction.&lt;/p&gt;

&lt;p&gt;If you try it, I'd genuinely love to hear what you think: what's missing, what's confusing, what would make it more useful for your workflow. Drop a comment or find me on X.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Built by Raphael, full stack developer. Also building &lt;a href="https://www.npmjs.com/package/phantomit-cli" rel="noopener noreferrer"&gt;phantomit-cli&lt;/a&gt;, a CLI tool that writes your git commit messages using AI.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>analytics</category>
      <category>showdev</category>
    </item>
    <item>
      <title>I built a CLI tool that writes my git commits so I never have to again</title>
      <dc:creator>phantomDev</dc:creator>
      <pubDate>Sat, 21 Feb 2026 11:45:01 +0000</pubDate>
      <link>https://dev.to/varraphael/i-built-a-cli-tool-that-writes-my-git-commits-so-i-never-have-to-again-h21</link>
      <guid>https://dev.to/varraphael/i-built-a-cli-tool-that-writes-my-git-commits-so-i-never-have-to-again-h21</guid>
      <description>&lt;p&gt;&lt;strong&gt;Let me paint you a picture.&lt;/strong&gt;&lt;br&gt;
It's 1am. I've been coding for 4 hours straight. I'm in flow, fixing bugs, adding features, refactoring stuff that was embarrassing to look at. Then I finally decide to push and I stare at the git commit prompt like it personally offended me.&lt;/p&gt;

&lt;p&gt;What do I type? &lt;code&gt;fix stuff&lt;/code&gt;? &lt;code&gt;update&lt;/code&gt;? &lt;code&gt;changes&lt;/code&gt;?&lt;/p&gt;

&lt;p&gt;Yeah. We've all been there.&lt;/p&gt;

&lt;p&gt;The worst part wasn't even the lazy messages. It was that I'd sometimes go &lt;em&gt;days&lt;/em&gt; without pushing because I'd get so deep into coding that committing felt like an interruption. Then I'd look at my GitHub contribution graph and it looked like I hadn't touched a computer in a week. Meanwhile I'd written 2000 lines of code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhepn6vffmypwi05ejn99.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhepn6vffmypwi05ejn99.png" alt=" " width="800" height="1717"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So I did what any reasonable developer would do at 1am — I decided to build a tool to fix it instead of just... committing.&lt;/p&gt;


&lt;h2&gt;
  
  
  💡 The idea
&lt;/h2&gt;

&lt;p&gt;The concept is simple: what if a tool just &lt;em&gt;watched&lt;/em&gt; my code, noticed when I saved files, figured out what changed, and wrote the commit message for me?&lt;/p&gt;

&lt;p&gt;I'd just have to press Y.&lt;/p&gt;

&lt;p&gt;That's it. That's the whole pitch.&lt;/p&gt;

&lt;p&gt;I called it &lt;strong&gt;phantomit&lt;/strong&gt; because it commits silently in the background like a ghost. Also because I thought it was a cool name at 1am and I stand by that decision.&lt;/p&gt;


&lt;h2&gt;
  
  
  ⚙️ How it works
&lt;/h2&gt;

&lt;p&gt;Here's the basic flow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You run &lt;code&gt;phantomit watch --on-save&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;It watches your project directory using &lt;strong&gt;chokidar&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;When you save files, it waits 8 seconds (debounce window)&lt;/li&gt;
&lt;li&gt;After the window closes, it runs &lt;code&gt;git add .&lt;/code&gt; then &lt;code&gt;git diff --staged&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;That diff goes to &lt;strong&gt;Groq's llama-3.1-8b-instant&lt;/strong&gt; with a strict prompt&lt;/li&gt;
&lt;li&gt;The AI returns a conventional commit message (10-20 words, nothing more)&lt;/li&gt;
&lt;li&gt;You see this in your terminal:
&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[11:42 PM] ✎ src/auth.ts, ✚ src/middleware.ts

✦ Commit message:
"feat(auth): add JWT validation middleware with token expiry handling"

[Y] commit &amp;amp; push   [E] edit   [N] skip

→
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;You press Y. It commits. It pushes. You go back to coding.&lt;/p&gt;

&lt;p&gt;That's literally it.&lt;/p&gt;


&lt;h2&gt;
  
  
  🐛 The problems I didn't expect
&lt;/h2&gt;

&lt;p&gt;Okay here's the part where it gets interesting. Building the concept took maybe an hour. The &lt;em&gt;edge cases&lt;/em&gt; took the rest of the night.&lt;/p&gt;
&lt;h3&gt;
  
  
  Saving two files felt like saving one and a half
&lt;/h3&gt;

&lt;p&gt;My original debounce was 2 seconds. Save &lt;code&gt;auth.ts&lt;/code&gt;, wait 1.8 seconds, save &lt;code&gt;middleware.ts&lt;/code&gt; — boom, two separate commit triggers. The first one commits, the second one gets skipped because the first is still processing.&lt;/p&gt;

&lt;p&gt;So I increased the debounce to 8 seconds and made it configurable. Now if you save 5 files within 8 seconds they all batch into one commit. One clean diff, one good message.&lt;/p&gt;
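
&lt;p&gt;The batching debounce looks roughly like this (a sketch of the idea, not phantomit's actual source):&lt;/p&gt;

```javascript
// Every save resets the timer, and the flush only fires once the
// window closes, with all saved files collected into one batch.
function makeBatcher(windowMs, onFlush) {
  let pending = [];
  let timer = null;
  return function onSave(file) {
    pending.push(file);
    if (timer !== null) clearTimeout(timer);
    timer = setTimeout(function () {
      const batch = pending;
      pending = [];
      timer = null;
      onFlush(batch); // one commit, one clean diff, one good message
    }, windowMs);
  };
}
```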
&lt;h3&gt;
  
  
  The race condition nobody warned me about
&lt;/h3&gt;

&lt;p&gt;Here's a fun one. What if you delete a file and save another file at &lt;em&gt;exactly&lt;/em&gt; the same time?&lt;/p&gt;

&lt;p&gt;Old behavior: first event triggers, commit starts, second event comes in while &lt;code&gt;isCommitting = true&lt;/code&gt;, gets silently dropped. You just lost a change.&lt;/p&gt;

&lt;p&gt;The fix was a &lt;strong&gt;commit queue&lt;/strong&gt;. If a commit is already running when a new trigger fires, it queues up and runs immediately after. Nothing gets dropped. I felt very smart when I figured this out and then immediately felt dumb for not thinking of it from the start.&lt;/p&gt;
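
&lt;p&gt;The queue is a few lines of state (again a sketch of the idea, not the actual source):&lt;/p&gt;

```javascript
// A trigger that lands while a commit is running is remembered and
// replayed immediately afterwards instead of being dropped.
function makeCommitter(doCommit) {
  let running = false;
  let queued = false;
  async function trigger() {
    if (running) {
      queued = true; // remember the trigger, never drop it
      return;
    }
    running = true;
    await doCommit();
    running = false;
    if (queued) {
      queued = false;
      await trigger(); // run the queued commit right away
    }
  }
  return trigger;
}
```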
&lt;h3&gt;
  
  
  Chokidar doesn't read .gitignore
&lt;/h3&gt;

&lt;p&gt;This one annoyed me. Chokidar is great for watching files but it has no idea what a &lt;code&gt;.gitignore&lt;/code&gt; is. So by default phantomit would pick up changes in &lt;code&gt;node_modules&lt;/code&gt;, &lt;code&gt;.env&lt;/code&gt;, &lt;code&gt;dist&lt;/code&gt; — all the stuff you'd never want to commit.&lt;/p&gt;

&lt;p&gt;I pulled in the &lt;code&gt;ignore&lt;/code&gt; npm package which parses &lt;code&gt;.gitignore&lt;/code&gt; properly — including negations (&lt;code&gt;!important.js&lt;/code&gt;), globs (&lt;code&gt;**/*.log&lt;/code&gt;), comments, the whole spec. Now phantomit automatically reads your &lt;code&gt;.gitignore&lt;/code&gt; and merges it with whatever extra patterns you define in &lt;code&gt;.phantomit.json&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Your &lt;code&gt;.env&lt;/code&gt; is safe. You're welcome.&lt;/p&gt;
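&lt;p&gt;To make the merging concrete, here's a heavily simplified toy version. The real thing hands all pattern matching to the &lt;code&gt;ignore&lt;/code&gt; package, which handles the full spec; this sketch only does exact and directory-prefix matches.&lt;/p&gt;

```typescript
// Heavily simplified ignore-merging sketch. The real implementation hands
// pattern matching to the `ignore` npm package (which handles negations,
// globs, and comments); this toy only does exact and directory-prefix matches.
function buildIgnoreList(gitignoreText: string, extraPatterns: string[]): string[] {
  const fromFile = gitignoreText
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0) // drop blank lines
    .filter((line) => !line.startsWith("#")); // drop comments
  return fromFile.concat(extraPatterns); // merge .phantomit.json extras
}

function isIgnored(file: string, patterns: string[]): boolean {
  return patterns.some((p) => file === p || file.startsWith(p + "/"));
}
```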
&lt;h3&gt;
  
  
  Only catching "save" events is naive
&lt;/h3&gt;

&lt;p&gt;My first version only listened for &lt;code&gt;change&lt;/code&gt; events — file edits. But what about:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating a new file? (&lt;code&gt;add&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Deleting a file? (&lt;code&gt;unlink&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Creating a folder? (&lt;code&gt;addDir&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Deleting a folder? (&lt;code&gt;unlinkDir&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of these are meaningful changes that deserve to be in a commit. So I added all 5 event types and now the terminal shows you exactly what happened:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[9:14 AM] ✚ src/newfile.ts, ✖ src/old.ts, ✎ src/index.ts
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Much better context for the AI too — it knows you deleted something, not just edited it.&lt;/p&gt;
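&lt;p&gt;The event-to-symbol mapping is easy to sketch. The symbols come from the log lines above; the symbols for the directory events are my assumption, not confirmed internals.&lt;/p&gt;

```typescript
// Event-to-symbol mapping, inferred from the log output above; the
// symbols for the directory events are an assumption.
const EVENT_SYMBOLS: { [event: string]: string } = {
  add: "✚",       // file created
  change: "✎",    // file edited
  unlink: "✖",    // file deleted
  addDir: "✚",    // folder created
  unlinkDir: "✖", // folder deleted
};

function formatChange(event: string, path: string): string {
  return `${EVENT_SYMBOLS[event] ?? "?"} ${path}`;
}
```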




&lt;h2&gt;
  
  
  🤖 The AI part
&lt;/h2&gt;

&lt;p&gt;I'm using Groq because it's genuinely fast and has a free tier that's more than enough for personal use. The model is &lt;code&gt;llama-3.1-8b-instant&lt;/code&gt; — small, quick, doesn't overthink it.&lt;/p&gt;

&lt;p&gt;The prompt is strict:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;You are a Git commit message generator.
Rules:
- Use conventional commits format: type(scope): description
- Types: feat, fix, refactor, chore, docs, style, test, perf
- Keep it between 10-20 words
- Be specific and descriptive, not vague
- No period at the end
- Output ONLY the commit message, nothing else
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The "output ONLY the commit message" is doing a lot of work there. Without it you get things like:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Here is a commit message for your changes: feat(auth)..."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;No thanks.&lt;/p&gt;

&lt;p&gt;I also truncate diffs over 6000 characters, because sending a 50 KB diff to an LLM is both slow and expensive, and the model doesn't need to read your entire codebase to write a 15-word sentence.&lt;/p&gt;
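&lt;p&gt;The truncation itself is a one-liner. A sketch, with the function name and marker text assumed:&lt;/p&gt;

```typescript
// Diff truncation sketch. The 6000-character cap is from the post; the
// function name and the marker text are assumptions.
const MAX_DIFF_CHARS = 6000;

function truncateDiff(diff: string): string {
  if (MAX_DIFF_CHARS >= diff.length) return diff; // small diffs pass through
  // Cut and mark, so the model knows it is seeing a prefix of the change.
  return diff.slice(0, MAX_DIFF_CHARS) + "\n... [diff truncated]";
}
```

&lt;p&gt;The explicit marker matters: without it the model may treat a cut-off diff as a complete (and weird-looking) change.&lt;/p&gt;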




&lt;h2&gt;
  
  
  🔑 Key rotation (my favorite feature nobody asked for)
&lt;/h2&gt;

&lt;p&gt;Groq's free tier has rate limits. If you're coding heavily you might hit them.&lt;/p&gt;

&lt;p&gt;My solution: support an unlimited number of API keys, with one picked at random on every commit.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GROQ_API_KEY_1=first_key
GROQ_API_KEY_2=second_key
GROQ_API_KEY_3=third_key
# add as many as you want
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;How it works under the hood:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="nb"&gt;Object&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;(([&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;startsWith&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GROQ_API_KEY_&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;keys&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It just loops through every env variable matching the pattern. No hardcoded limit. You could have &lt;code&gt;GROQ_API_KEY_99&lt;/code&gt; and it would still work. Each commit picks one at random.&lt;/p&gt;

&lt;p&gt;Three free Groq accounts = effectively triple the rate limit. I call that a feature.&lt;/p&gt;
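&lt;p&gt;Putting the two halves together, a self-contained sketch of collect-then-pick. The helper names are mine, not phantomit's; the collection loop mirrors the snippet above.&lt;/p&gt;

```typescript
// Self-contained sketch of collect-then-pick. Helper names are assumptions,
// not phantomit's; the collection loop mirrors the snippet above.
function collectKeys(env: { [k: string]: string | undefined }): string[] {
  const keys: string[] = [];
  Object.entries(env).forEach(([key, value]) => {
    if (key.startsWith("GROQ_API_KEY_")) {
      if (value) keys.push(value); // skip empty/undefined values
    }
  });
  return keys;
}

function pickKey(keys: string[]): string {
  if (keys.length === 0) throw new Error("no GROQ_API_KEY_* variables found");
  return keys[Math.floor(Math.random() * keys.length)]; // uniform random pick
}
```

&lt;p&gt;With a uniform random pick over N keys, each key sees roughly 1/N of the traffic, which is why stacking free-tier keys effectively multiplies the rate limit.&lt;/p&gt;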




&lt;h2&gt;
  
  
  🎛️ Three watch modes
&lt;/h2&gt;

&lt;p&gt;Different devs work differently so I built three modes:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;--on-save&lt;/code&gt;&lt;/strong&gt; — commits 8 seconds after your last file save. This is the one I use personally.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;phantomit watch &lt;span class="nt"&gt;--on-save&lt;/span&gt;
phantomit watch &lt;span class="nt"&gt;--on-save&lt;/span&gt; &lt;span class="nt"&gt;--daemon&lt;/span&gt;  &lt;span class="c"&gt;# background, silent&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;--every &amp;lt;minutes&amp;gt;&lt;/code&gt;&lt;/strong&gt; — commits on a fixed interval if there are changes. Set it and forget it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;phantomit watch &lt;span class="nt"&gt;--every&lt;/span&gt; 30
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;--lines &amp;lt;count&amp;gt;&lt;/code&gt;&lt;/strong&gt; — commits when your accumulated diff crosses a line threshold.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;phantomit watch &lt;span class="nt"&gt;--lines&lt;/span&gt; 20
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
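&lt;p&gt;Counting "lines changed" is less obvious than it sounds. A sketch of one reasonable way to do it (assumed, not necessarily phantomit's exact logic): count &lt;code&gt;+&lt;/code&gt; and &lt;code&gt;-&lt;/code&gt; lines in the unified diff, skipping the file headers.&lt;/p&gt;

```typescript
// Sketch of the --lines trigger (assumed logic, not necessarily phantomit's):
// count added/removed lines in a unified diff, skipping the +++/--- file
// headers, and fire once the total crosses the threshold.
function countChangedLines(diff: string): number {
  return diff.split("\n").filter((line) => {
    if (line.startsWith("+++") || line.startsWith("---")) return false;
    return line.startsWith("+") || line.startsWith("-");
  }).length;
}

function shouldCommit(diff: string, threshold: number): boolean {
  return countChangedLines(diff) >= threshold;
}
```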



&lt;p&gt;And if you just want the AI message without any automation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;phantomit push        &lt;span class="c"&gt;# generate + commit right now&lt;/span&gt;
phantomit push &lt;span class="nt"&gt;--mock&lt;/span&gt; &lt;span class="c"&gt;# test without a Groq key&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  🚀 Install it
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-g&lt;/span&gt; phantomit-cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: the package is &lt;code&gt;phantomit-cli&lt;/code&gt; on npm because some package called &lt;code&gt;phantomjs&lt;/code&gt; exists and npm decided our names were too similar, which is honestly a bit offensive. But the command is still just &lt;code&gt;phantomit&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Then:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;your-project
phantomit init
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"GROQ_API_KEY=your_key"&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; .env
phantomit watch &lt;span class="nt"&gt;--on-save&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Get a free Groq key at &lt;a href="https://console.groq.com" rel="noopener noreferrer"&gt;console.groq.com&lt;/a&gt;. Takes about 30 seconds.&lt;/p&gt;




&lt;h2&gt;
  
  
  📸 What the commit messages actually look like
&lt;/h2&gt;

&lt;p&gt;Here are some real ones generated during development of phantomit itself (yes, I used phantomit to build phantomit, which felt very appropriate):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;feat(ai): implement mock mode for generateCommitMessage function
fix(test): Remove test file as it is no longer needed
feat(test): add test generation feature with mock mode support
fix(test): Remove unused test file and refactor generateCommitMessage function
fix(ai): Update AI commit message generation to use new model
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Not perfect, but genuinely better than &lt;code&gt;update&lt;/code&gt; or &lt;code&gt;misc changes&lt;/code&gt;. And I typed zero of those.&lt;/p&gt;




&lt;h2&gt;
  
  
  🔮 What's next
&lt;/h2&gt;

&lt;p&gt;A few things I want to add:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Auto-retry with next key if one Groq key hits a rate limit instead of just failing&lt;/li&gt;
&lt;li&gt;Better daemon logging with timestamps and session stats&lt;/li&gt;
&lt;li&gt;Custom prompt templates so you can tune the commit style to your team's conventions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you want to contribute or have ideas, the repo is open: &lt;a href="https://github.com/var-raphael/phantomit" rel="noopener noreferrer"&gt;github.com/var-raphael/phantomit&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Docs at &lt;a href="https://phantomit-docs.vercel.app" rel="noopener noreferrer"&gt;phantomit-docs.vercel.app&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;That's it. Go install it, stop writing commit messages, and put that mental energy toward literally anything else.&lt;/p&gt;

&lt;p&gt;Your GitHub graph will thank you. 📈&lt;/p&gt;

&lt;p&gt;Raphael, 18, coding on a mobile phone in Nigeria at 1am 🇳🇬&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>github</category>
      <category>devjournal</category>
      <category>buildinpublic</category>
    </item>
  </channel>
</rss>
