<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: The Data Nerd</title>
    <description>The latest articles on DEV Community by The Data Nerd (@data_nerd).</description>
    <link>https://dev.to/data_nerd</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3881735%2Fece2939f-1b68-4382-af54-25c9938f4611.jpg</url>
      <title>DEV Community: The Data Nerd</title>
      <link>https://dev.to/data_nerd</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/data_nerd"/>
    <language>en</language>
    <item>
      <title>I wrote a 104-page book on the GitHub signals that predict Series A rounds — free download</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Wed, 06 May 2026 14:20:56 +0000</pubDate>
      <link>https://dev.to/data_nerd/i-wrote-a-104-page-book-on-the-github-signals-that-predict-series-a-rounds-free-download-29p5</link>
      <guid>https://dev.to/data_nerd/i-wrote-a-104-page-book-on-the-github-signals-that-predict-series-a-rounds-free-download-29p5</guid>
      <description>&lt;p&gt;Three weeks ago I started turning the methodology behind &lt;a href="https://signals.gitdealflow.com" rel="noopener noreferrer"&gt;GitDealFlow&lt;/a&gt; into a proper trade book. It shipped today. 104 pages, free PDF and EPUB, also fully readable on the open web.&lt;/p&gt;

&lt;p&gt;Here is what is in it and why I wrote it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The premise in one paragraph
&lt;/h2&gt;

&lt;p&gt;Public GitHub data — commit logs, contributor graphs, dependency trees, infrastructure repos — fires three to six weeks before a typical Series A announcement. The signals are computable from a single free REST endpoint, by anyone, on a $0 budget. The seven-signal stack in the book has a 68% hit rate at a 33-day median lead time on the SSRN-indexed panel of 219 Series-A-bound startups. None of it requires a private network, a paid data license, or a relationship with the founder.&lt;/p&gt;

&lt;h2&gt;
  
  
  The seven signals
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Commit-velocity acceleration&lt;/strong&gt; (14-day window, two-period confirmation, +200%).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Contributor influx&lt;/strong&gt; (4+ new humans in 14 days, 120-day look-back, bot-filtered).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Infrastructure repository buildout&lt;/strong&gt; (Terraform / Helm / runbook / proto / internal-tools).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Star-velocity detachment&lt;/strong&gt; (stars accelerating 3× while commits stay flat — orchestrated attention).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Issue closure cadence&lt;/strong&gt; (median time-to-close, sharply tightening).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Downstream dependency adoption&lt;/strong&gt; (Libraries.io aggregation across npm / PyPI / Maven / crates.io).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Founding-team public visibility&lt;/strong&gt; (engineering blog cadence, conference talks, OSS maintenance).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Each signal chapter has the formal definition, threshold guidance, false-positive patterns, a worked example, and exercises.&lt;/p&gt;
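&lt;p&gt;As a rough sketch (not the book's exact code), signal #1 reduces to a few lines of Python. The thresholds below follow the stated parameters; the function names and the oldest-first list of daily commit counts are my own assumptions:&lt;/p&gt;

```python
# Hypothetical sketch of signal #1: commit-velocity acceleration.
# Parameters follow the stated thresholds (14-day window, +200%
# growth, two-period confirmation); names are illustrative.

def window_sums(daily_commits, window=14):
    """Collapse oldest-first daily commit counts into consecutive
    14-day window totals, most recent window last."""
    usable = (len(daily_commits) // window) * window
    return [sum(daily_commits[i:i + window])
            for i in range(0, usable, window)]

def velocity_accelerating(daily_commits, window=14, growth=2.0):
    """Fire only when each of the last two windows grew at least
    +200% over the window before it (two-period confirmation)."""
    sums = window_sums(daily_commits, window)
    if len(sums) >= 3:
        a, b, c = sums[-3:]
        return a > 0 and b >= a * (1 + growth) and c >= b * (1 + growth)
    return False
```
&lt;p&gt;The two-period confirmation is what suppresses one-off spikes (a hackathon weekend, a vendored dependency) that would otherwise trip a single-window threshold.&lt;/p&gt;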

&lt;h2&gt;
  
  
  What's in the appendix
&lt;/h2&gt;

&lt;p&gt;A 90-minute replication walkthrough. From a fresh laptop, a personal access token, and &lt;code&gt;pip install requests&lt;/code&gt; to a verified Scout Score against the live leaderboard. About a hundred lines of Python. Every primitive (paginated commit fetcher, contributor classifier, repo metadata cache) is reusable.&lt;/p&gt;

&lt;p&gt;If you finish the appendix, the methodology is yours — replicable on a $0 budget, indefinitely, no dependence on my servers.&lt;/p&gt;
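&lt;p&gt;For flavour, here is roughly what the first of those primitives looks like, assuming nothing beyond &lt;code&gt;requests&lt;/code&gt; and a token (this is my own sketch, not the appendix code):&lt;/p&gt;

```python
# Hypothetical sketch of a paginated commit fetcher against the
# free GitHub REST endpoint; function and parameter names are mine.
import requests

def fetch_commits(owner, repo, token, since, per_page=100):
    """Yield commits page by page, following the Link header's
    "next" URL until the listing is exhausted."""
    url = f"https://api.github.com/repos/{owner}/{repo}/commits"
    headers = {"Authorization": f"Bearer {token}",
               "Accept": "application/vnd.github+json"}
    params = {"since": since, "per_page": per_page}
    while url:
        resp = requests.get(url, headers=headers, params=params, timeout=30)
        resp.raise_for_status()
        yield from resp.json()
        url = resp.links.get("next", {}).get("url")  # None ends the loop
        params = None  # the "next" URL already carries the query string
```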

&lt;h2&gt;
  
  
  Free or €0.99
&lt;/h2&gt;

&lt;p&gt;The book is free in PDF, EPUB, Markdown, and plain text on &lt;a href="https://signals.gitdealflow.com/book" rel="noopener noreferrer"&gt;signals.gitdealflow.com/book&lt;/a&gt;. The €0.99 Kindle copy adds three bonus emails (a worked walkthrough of the most recent Series A catch, the unedited interviews with two developer-investors who use the workflow daily, and a 30-day direct line to me for methodology questions). Same content, different bonus stack.&lt;/p&gt;

&lt;p&gt;ISBN 979-8-9876543-1-7 · CC-BY-4.0 license. The methodology paper is on SSRN at &lt;a href="https://ssrn.com/abstract=6606558" rel="noopener noreferrer"&gt;abstract 6606558&lt;/a&gt;. The dataset is mirrored on Zenodo, Hugging Face, and Kaggle.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why free
&lt;/h2&gt;

&lt;p&gt;Free distribution is the point. Every reader who finds a false-positive pattern reports it back, and the next edition gets better. Readers who get value from the book are the ones who eventually subscribe to the €9.97/mo Dashboard — a book that closes that loop pays for itself in three subscribers.&lt;/p&gt;

&lt;p&gt;Read it on a flight. Skip to the appendix. Run the script. The Monday-morning workflow looks completely different inside a quarter.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://signals.gitdealflow.com/book" rel="noopener noreferrer"&gt;&lt;strong&gt;Get the book →&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;If you have a methodology question, comment here or reply to any email from &lt;code&gt;signal@gitdealflow.com&lt;/code&gt;. The next edition folds in every reader correction with attribution.&lt;/p&gt;

</description>
      <category>venturecapital</category>
      <category>github</category>
      <category>opensource</category>
      <category>datascience</category>
    </item>
    <item>
      <title>Made my site AI-citable in one day — the .well-known + JSON-LD + llms.txt playbook</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Tue, 05 May 2026 19:15:32 +0000</pubDate>
      <link>https://dev.to/data_nerd/made-my-site-ai-citable-in-one-day-the-well-known-json-ld-llmstxt-playbook-9ob</link>
      <guid>https://dev.to/data_nerd/made-my-site-ai-citable-in-one-day-the-well-known-json-ld-llmstxt-playbook-9ob</guid>
      <description>&lt;p&gt;&lt;em&gt;Yesterday I ran a 5-pass AEO/SEO/GEO/AIO audit on the same site, fixed 64 surfaces in one sitting, and watched the composite probe score climb from 70 to 94. This is the dev-tactical playbook of what actually moved the needle, with the exact files and probes.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The premise: traditional SEO (links, meta tags, sitemaps) is necessary but no longer sufficient. AI Overview, ChatGPT, Perplexity, and Claude pull from a different surface area — &lt;code&gt;/.well-known/&lt;/code&gt;, &lt;code&gt;llms.txt&lt;/code&gt;, &lt;code&gt;agent-card.json&lt;/code&gt;, &lt;code&gt;openapi.json&lt;/code&gt;, and structured &lt;code&gt;schema.org&lt;/code&gt; JSON-LD with &lt;strong&gt;Speakable&lt;/strong&gt; + &lt;strong&gt;QAPage&lt;/strong&gt; + &lt;strong&gt;Service&lt;/strong&gt; types.&lt;/p&gt;

&lt;p&gt;If your tool isn't shipping these, you're invisible to half the LLMs that ought to be citing you.&lt;/p&gt;

&lt;h2&gt;
  
  
  The 5-pass audit loop
&lt;/h2&gt;

&lt;p&gt;I ran a single-day chain of:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Probe&lt;/strong&gt; — a checklist of "if I were an LLM scraping for an answer to &lt;em&gt;X&lt;/em&gt;, what file would I open?" — across 7 categories: discovery, schema, content, well-known, structured-Q&amp;amp;A, citations, and identity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Score&lt;/strong&gt; each category 0–100.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Diff&lt;/strong&gt; the lowest-scoring against the spec.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ship&lt;/strong&gt; the fixes (mostly small JSON files + JSON-LD blocks + 308→200 redirect cleanups).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Re-probe.&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Each pass took ~90 minutes. The composite went 70 → 81 → 89 → 92 → 94.&lt;/p&gt;
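&lt;p&gt;Steps 1 and 2 of the loop are mechanical enough to script. A minimal sketch, assuming probe results are collected as &lt;code&gt;(category, passed)&lt;/code&gt; pairs; the scoring rule here (percentage of probes passed, averaged into the composite) is my own simplification:&lt;/p&gt;

```python
# Illustrative probe scorer; category names follow the checklist
# above, the scoring rule is an assumption.
CATEGORIES = ["discovery", "schema", "content", "well-known",
              "structured-qa", "citations", "identity"]

def score(probe_results):
    """Score each category 0-100 as the share of its probes that
    passed, then average the categories into the composite."""
    per_cat = {}
    for cat in CATEGORIES:
        hits = [ok for c, ok in probe_results if c == cat]
        per_cat[cat] = round(100 * sum(hits) / len(hits)) if hits else 0
    composite = round(sum(per_cat.values()) / len(CATEGORIES))
    return per_cat, composite
```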

&lt;h2&gt;
  
  
  What actually moved the score
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pass 1 (70 → 81): the obvious gaps.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;/sitemap.xml&lt;/code&gt; was 1,060 URLs but 8% of them 404'd. Fix: regenerate from build manifest, ban orphans.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;/robots.txt&lt;/code&gt; allowed everything; LLMs got noise. Fix: explicit &lt;code&gt;User-agent: GPTBot / ClaudeBot / PerplexityBot&lt;/code&gt; allow blocks for the high-signal paths only.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;Speakable&lt;/code&gt; JSON-LD was missing on every Q&amp;amp;A page. Fix: add &lt;code&gt;cssSelector: ['h1','.tldr']&lt;/code&gt; to every answer page.&lt;/li&gt;
&lt;/ul&gt;
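&lt;p&gt;For reference, a &lt;code&gt;Speakable&lt;/code&gt; block of that shape looks roughly like this (the name and URL are placeholders, not the live markup):&lt;/p&gt;

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Commit-velocity acceleration, explained",
  "url": "https://example.com/answers/commit-velocity",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": ["h1", ".tldr"]
  }
}
```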

&lt;p&gt;&lt;strong&gt;Pass 2 (81 → 89): structured Q&amp;amp;A.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Built 3 new &lt;code&gt;/answers/{slug}&lt;/code&gt; pages with &lt;code&gt;QAPage&lt;/code&gt; + &lt;code&gt;Question&lt;/code&gt; + &lt;code&gt;acceptedAnswer&lt;/code&gt; JSON-LD, evidence-anchored to a public dataset.&lt;/li&gt;
&lt;li&gt;Added &lt;code&gt;agent-card.json&lt;/code&gt; to &lt;code&gt;/.well-known/&lt;/code&gt; describing every machine-readable endpoint.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;openapi.json&lt;/code&gt; exposed: 4 paths → 21 paths. LLMs read this and start citing your API examples in answers.&lt;/li&gt;
&lt;/ul&gt;
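&lt;p&gt;The JSON-LD on those answer pages is the standard &lt;code&gt;QAPage&lt;/code&gt; nesting. An illustrative (not actual) example:&lt;/p&gt;

```json
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "What is commit-velocity acceleration?",
    "answerCount": 1,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A sustained rise in a repository's 14-day commit volume, confirmed over two consecutive windows.",
      "url": "https://example.com/answers/commit-velocity#answer"
    }
  }
}
```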

&lt;p&gt;&lt;strong&gt;Pass 3 (89 → 92): the well-known explosion.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Shipped: &lt;code&gt;/.well-known/openapi.json&lt;/code&gt;, &lt;code&gt;/.well-known/agent-card.json&lt;/code&gt;, &lt;code&gt;/.well-known/agents.json&lt;/code&gt;, &lt;code&gt;/.well-known/llms.txt&lt;/code&gt;, &lt;code&gt;/.well-known/ai-policy.json&lt;/code&gt;, &lt;code&gt;/.well-known/ai.txt&lt;/code&gt;, &lt;code&gt;/.well-known/ai.json&lt;/code&gt;, &lt;code&gt;/.well-known/sitemap.xml&lt;/code&gt;, &lt;code&gt;/.well-known/security-policy.json&lt;/code&gt;, &lt;code&gt;/.well-known/did-configuration.json&lt;/code&gt;, &lt;code&gt;/.well-known/humans.txt&lt;/code&gt;, &lt;code&gt;/.well-known/freshness.json&lt;/code&gt; (a &lt;code&gt;DataFeed&lt;/code&gt; schema for "what changed this week").&lt;/li&gt;
&lt;li&gt;Pattern: every &lt;code&gt;.well-known&lt;/code&gt; file should also have a root alias (&lt;code&gt;/agent-card.json&lt;/code&gt; → 200, not 308). Many LLM crawlers won't follow redirects on machine-readable endpoints.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Pass 4 (92 → 94): glossary + FAQ + methodology as APIs.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;/api/v1/glossary&lt;/code&gt; (18 terms), &lt;code&gt;/api/v1/faq&lt;/code&gt; (101 entries), &lt;code&gt;/api/v1/methodology&lt;/code&gt; (&lt;code&gt;HowTo&lt;/code&gt; schema, 6 steps). LLMs cite glossary endpoints when asked "what is X" — they treat your API as canonical for terms you coined.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The smoking-gun probe
&lt;/h2&gt;

&lt;p&gt;The single highest-signal probe is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-A&lt;/span&gt; &lt;span class="s2"&gt;"GPTBot/1.0"&lt;/span&gt; https://yourdomain.com/.well-known/llms.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If this returns a 200 with directive-rich content (not a 308 redirect, not HTML, not a 404), and your &lt;code&gt;llms.txt&lt;/code&gt; lists every QAPage + every API + every dataset, &lt;strong&gt;you are now in a tiny minority of sites&lt;/strong&gt;. Most still don't have one.&lt;/p&gt;
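&lt;p&gt;If you haven't seen one, an &lt;code&gt;llms.txt&lt;/code&gt; is just Markdown with a strict skeleton: an H1 title, a one-line blockquote summary, then H2 sections of annotated links. A toy fragment (the paths here are placeholders, not ours):&lt;/p&gt;

```text
# GitDealFlow Signals

> GitHub-signal dataset and scoring methodology for early-stage VC deal flow.

## Answers
- [Commit-velocity acceleration](https://example.com/answers/commit-velocity): definition and thresholds

## APIs
- [Glossary endpoint](https://example.com/api/v1/glossary): canonical terms, JSON
```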

&lt;p&gt;Bonus probe — &lt;code&gt;site:yourdomain.com&lt;/code&gt; in Google. If it returns 0 results despite all the schema, your &lt;code&gt;noindex&lt;/code&gt; is wrong somewhere. We caught this in pass 4 — &lt;code&gt;/predicted/{week}/&lt;/code&gt; was blocked by a stale &lt;code&gt;robots.txt&lt;/code&gt; rule.&lt;/p&gt;

&lt;h2&gt;
  
  
  The cost
&lt;/h2&gt;

&lt;p&gt;This is a one-person side project. Total Claude Code time across all 5 passes: ~7.5 hours. Total new files: 22. Total edits: 64. Zero external dependencies, zero paid tools, zero outbound links.&lt;/p&gt;

&lt;p&gt;For comparison: the equivalent agency engagement runs $15k–$30k for "AI search optimization" and ships maybe a third of this surface area.&lt;/p&gt;

&lt;h2&gt;
  
  
  The receipts
&lt;/h2&gt;

&lt;p&gt;Everything is open. The site is &lt;code&gt;signals.gitdealflow.com&lt;/code&gt;, the dataset is &lt;code&gt;huggingface.co/datasets/gitdealflow/vc-deal-flow-signal&lt;/code&gt;, the methodology is &lt;code&gt;signals.gitdealflow.com/research&lt;/code&gt;, the SSRN paper is at &lt;code&gt;ssrn.com/abstract=6606558&lt;/code&gt;, and the &lt;strong&gt;MCP server&lt;/strong&gt; that lets any LLM (Claude, Cursor, Cline, Goose) query the dataset live is at &lt;code&gt;signals.gitdealflow.com/mcp&lt;/code&gt; — six tools, no auth, never paywalled.&lt;/p&gt;

&lt;p&gt;If you run a SaaS with public data and want to audit your own surface, the probe checklist is in our &lt;code&gt;/llms-full.txt&lt;/code&gt;. Steal it.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Building &lt;a href="https://signals.gitdealflow.com" rel="noopener noreferrer"&gt;GitDealFlow&lt;/a&gt; — open-source GitHub-signal layer for early-stage VC. SSRN paper, free MCP server, dataset on Hugging Face.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>seo</category>
      <category>webdev</category>
      <category>opensource</category>
    </item>
    <item>
      <title>We just shipped per-request pricing for our MCP server — here's why</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Mon, 04 May 2026 20:46:55 +0000</pubDate>
      <link>https://dev.to/data_nerd/we-just-shipped-per-request-pricing-for-our-mcp-server-heres-why-2gk0</link>
      <guid>https://dev.to/data_nerd/we-just-shipped-per-request-pricing-for-our-mcp-server-heres-why-2gk0</guid>
      <description>&lt;p&gt;&lt;em&gt;Quick context:&lt;/em&gt; I run &lt;a href="https://gitdealflow.com" rel="noopener noreferrer"&gt;GitDealFlow&lt;/a&gt;, an MCP server + dataset that tracks GitHub commit-velocity signals across ~100 venture-backed startups. Six free read-only tools, ~700 npm downloads in the first three weeks, listed on Glama and the official MCP registry.&lt;/p&gt;

&lt;p&gt;We just shipped a seventh tool: &lt;code&gt;get_deep_signal&lt;/code&gt;. It's &lt;strong&gt;paid&lt;/strong&gt; — €0.19 per call, sold in 100-credit packs at €19. The other six tools stay free forever.&lt;/p&gt;

&lt;p&gt;This post is about &lt;em&gt;why we chose per-request&lt;/em&gt; over a monthly subscription, and how the implementation actually looks. tl;dr: when your customer is an agent making programmatic API calls, the SaaS rulebook stops working.&lt;/p&gt;

&lt;h2&gt;
  
  
  The framing
&lt;/h2&gt;

&lt;p&gt;Marc Benioff &lt;a href="https://x.com/Benioff/status/1912766408710955100" rel="noopener noreferrer"&gt;announced "Salesforce Headless 360"&lt;/a&gt; on April 17 — entire Salesforce + Agentforce + Slack platforms exposed as APIs, MCP, and CLI. &lt;em&gt;"The API is the UI now."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;That made it official: the new customer for SaaS-shaped data products is an &lt;em&gt;agent&lt;/em&gt;, not a human clicking through a dashboard. And agents have a different mental model for paying.&lt;/p&gt;

&lt;p&gt;A human dashboard buyer wants to lock in a monthly seat — predictable cost, predictable access. An agent's principal wants what every LLM API has trained them to expect: pay per call, top up when low, no commitment. Same as OpenAI, Anthropic, Replicate, you name it.&lt;/p&gt;

&lt;p&gt;So when we sized our paid tool, two pricing structures were on the table:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Subscription&lt;/strong&gt;: "agents tier" at €29/mo, unlimited calls.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Per-request&lt;/strong&gt;: 100 credits for €19. €0.19 per match. Misses are free.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We picked #2. Three reasons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The unit of value is countable.&lt;/strong&gt; "I called the deep-signal tool 14 times this week" maps cleanly to "I owe €2.66 of consumption." The ROI conversation closes in 10 seconds: an analyst hour saved per call at €50/hr means 250× ROI on €0.19. Subscription pricing forces an "is it worth €29/mo?" calculation that resists fast yes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Misses cost nothing.&lt;/strong&gt; The deep-signal endpoint returns &lt;code&gt;{ found: false }&lt;/code&gt; when the startup isn't in our universe. We charge 0 credits for that. It's the API equivalent of "you only pay when the lease gets signed."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;No drift toward SaaS.&lt;/strong&gt; Once you have a subscription, the next quarterly review pushes "what features can we add to justify the price?" That's how SaaS bloats. Per-call keeps you focused on quality of &lt;em&gt;each&lt;/em&gt; call.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What 1 credit returns
&lt;/h2&gt;

&lt;p&gt;The free &lt;code&gt;get_startup_signal&lt;/code&gt; returns a shallow summary: name, velocity, contributor count, sector. Useful for "is this startup tracked?" lookups.&lt;/p&gt;

&lt;p&gt;The paid &lt;code&gt;get_deep_signal&lt;/code&gt; returns memo-grade output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"found"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ExampleCo"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"sector"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Cybersecurity"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"velocity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;84&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"growth"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;67&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"novelty"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"composite"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;71&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"inSector"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"sectorTotal"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;17&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"sectorPercentile"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;88&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"thesis"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ExampleCo is sustaining acceleration at Series A in Cybersecurity — driven primarily by commit velocity (+142%). Worth diligence this week."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"comparables"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"PeerA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"commitVelocityChange"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"+88%"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"signalType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"acceleration"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"PeerB"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"commitVelocityChange"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"+67%"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"signalType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"steady"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"history"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"balance"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"charged"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Same dataset as the free tool. Different shape: scored, ranked, comparable-aware, with a plain-English thesis line you can drop straight into a Slack DM.&lt;/p&gt;

&lt;h2&gt;
  
  
  How the auth works
&lt;/h2&gt;

&lt;p&gt;I needed an API key format that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Validates &lt;em&gt;without&lt;/em&gt; a database lookup (we don't have one — Stripe customer metadata IS the credit ledger).&lt;/li&gt;
&lt;li&gt;Embeds the customer ID so the server knows who to charge.&lt;/li&gt;
&lt;li&gt;Can't be forged.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Solution: &lt;code&gt;gdf_v2.&amp;lt;stripe_customer_id&amp;gt;.&amp;lt;hmac16&amp;gt;&lt;/code&gt; where the HMAC is &lt;code&gt;HMAC-SHA256(AUTH_SECRET, "api-key-v2:" + customer_id)&lt;/code&gt; truncated to 16 hex chars.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;generateApiKeyV2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;tag&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createHmac&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sha256&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;AUTH_SECRET&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`api-key-v2:&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;digest&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;hex&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;`gdf_v2.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;tag&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;parseApiKeyV2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// ...split, validate format, recompute HMAC, timing-safe-compare&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;customerId&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To validate a key on each request: split on &lt;code&gt;.&lt;/code&gt;, recompute the expected HMAC for the embedded customer ID, timing-safe-compare. Stateless, zero DB hops.&lt;/p&gt;
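&lt;p&gt;The same check ports to about ten lines of Python if you want to sanity-test keys outside the server (my own port, not the production code; &lt;code&gt;auth_secret&lt;/code&gt; stands in for the server-side secret):&lt;/p&gt;

```python
# Hypothetical Python port of the stateless key validation:
# split, check the format, recompute the HMAC tag, compare in
# constant time. Returns the embedded customer ID or None.
import hmac
import hashlib

def parse_api_key_v2(key, auth_secret):
    parts = key.split(".")
    if len(parts) != 3 or parts[0] != "gdf_v2":
        return None
    _, customer_id, tag = parts
    expected = hmac.new(auth_secret.encode(),
                        f"api-key-v2:{customer_id}".encode(),
                        hashlib.sha256).hexdigest()[:16]
    # compare_digest avoids leaking the match position via timing
    return customer_id if hmac.compare_digest(tag, expected) else None
```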

&lt;h2&gt;
  
  
  How the credit ledger lives on Stripe
&lt;/h2&gt;

&lt;p&gt;I didn't want to stand up Postgres or Upstash for this. Stripe's customer metadata is a free-form key-value store (up to 50 keys, 500 characters per value) that's already attached to the entity I was going to charge anyway. So:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// metadata on every credit-pack customer:&lt;/span&gt;
&lt;span class="c1"&gt;//   api_credits           = current balance, integer string&lt;/span&gt;
&lt;span class="c1"&gt;//   api_credits_purchased = lifetime credits bought&lt;/span&gt;
&lt;span class="c1"&gt;//   api_credits_consumed  = lifetime credits consumed&lt;/span&gt;
&lt;span class="c1"&gt;//   api_credits_last_at   = ISO timestamp of last decrement&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;consumeCredit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;customer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;retrieve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;balance&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;parseInt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;customer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;api_credits&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;0&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;balance&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;reason&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;insufficient_credits&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;customerId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;api_credits&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nc"&gt;String&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;balance&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="na"&gt;api_credits_consumed&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nc"&gt;String&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="cm"&gt;/* prev consumed */&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="na"&gt;api_credits_last_at&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;toISOString&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;balance&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;balance&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Tradeoffs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;~200ms per call for the Stripe round-trip. Fine for v1; would move to Upstash Redis if traffic forces it.&lt;/li&gt;
&lt;li&gt;Race conditions are possible if a customer fires two parallel calls at literally the same millisecond. At v1 traffic (single-digit calls per day per customer), the probability of an actual lost decrement is essentially zero.&lt;/li&gt;
&lt;li&gt;The merchant view &lt;em&gt;is&lt;/em&gt; the Stripe Dashboard. No custom admin UI. Customer record metadata shows balance.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This took ~30 lines total. If you've got a small product and need usage-based billing, the temptation is to over-engineer. Stripe metadata covers more than people think.&lt;/p&gt;

&lt;h2&gt;
  
  
  How the buyer experiences it
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Hit &lt;code&gt;https://signals.gitdealflow.com/agents/credits&lt;/code&gt; → click "Buy 100 credits — €19" → Stripe checkout (real card or Stripe-test).&lt;/li&gt;
&lt;li&gt;Webhook fires &lt;code&gt;checkout.session.completed&lt;/code&gt; → server adds 100 credits to customer metadata + emails the API key.&lt;/li&gt;
&lt;li&gt;Buyer pastes the key as &lt;code&gt;GITDEALFLOW_API_KEY&lt;/code&gt; env var (MCP host) or as &lt;code&gt;Authorization: Bearer&lt;/code&gt; (direct HTTP).&lt;/li&gt;
&lt;li&gt;Each &lt;code&gt;get_deep_signal&lt;/code&gt; call returns balance in the response body and &lt;code&gt;X-Credits-Balance&lt;/code&gt; header.&lt;/li&gt;
&lt;li&gt;Balance check at &lt;code&gt;/api/account/credits&lt;/code&gt; (web UI at &lt;code&gt;/account&lt;/code&gt; for paste-and-check).&lt;/li&gt;
&lt;/ol&gt;
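
&lt;p&gt;Step 2's ledger update can be sketched as a pure function. The metadata field names mirror the scheme above; &lt;code&gt;CREDITS_PER_PACK&lt;/code&gt; and the surrounding comments are illustrative, not the production code:&lt;/p&gt;

```javascript
// Sketch: apply a completed checkout to the customer's metadata ledger.
// Stripe stores metadata values as strings, so parse and re-stringify.
const CREDITS_PER_PACK = 100; // assumption: one pack = 100 credits, per the flow above

function applyTopUp(metadata, packs = 1) {
  const current = parseInt(metadata.api_credits ?? "0", 10) || 0;
  return {
    ...metadata,
    api_credits: String(current + packs * CREDITS_PER_PACK),
    api_credits_topped_up_at: new Date().toISOString(),
  };
}

// In the real webhook you'd first verify the signature with
// stripe.webhooks.constructEvent, then persist:
//   await stripe.customers.update(customerId, { metadata: applyTopUp(customer.metadata) });
```

&lt;p&gt;Keeping the arithmetic in a pure function makes the webhook trivially testable without hitting Stripe.&lt;/p&gt;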

&lt;p&gt;End-to-end test (real Stripe customer, real production endpoints, no money moved):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[✓] create test customer — cus_USO7dd8hbAEPUk
[✓] seed 100 credits
[✓] initial balance = 100
[✓] ds#1 charges 1 credit, balance 100→99
[✓] scored output present (composite=65, rank #1/17, thesis line)
[✓] miss does NOT charge (balance still 99)
[✓] final balance: 99 / 100 / 1
[✓] bad key → 401
[✓] mcp/rpc forwards correctly, 99→98
[✓] cleanup
✓ E2E PASSED: 10/10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  What this lens means for other dev tools
&lt;/h2&gt;

&lt;p&gt;If you're shipping an MCP server or any developer-facing API and the audience is &lt;em&gt;agents&lt;/em&gt; (or developers piping through agents), think hard before defaulting to subscription. The OpenAI mental model has trained your buyer. Per-call, pay-as-you-go, top-up-when-low — those are the contours buyers expect now.&lt;/p&gt;

&lt;p&gt;The 80/20 implementation is achievable in an afternoon: a payment link, a webhook handler, customer metadata as the ledger, an HMAC-based key format, and one paid endpoint that decrements. The hardest part is letting go of the dashboard.&lt;/p&gt;




&lt;p&gt;If you want to play with the live MCP server — six free tools, one paid — install it with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx &lt;span class="nt"&gt;-y&lt;/span&gt; @gitdealflow/mcp-signal
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or use it without MCP via direct HTTP:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# free&lt;/span&gt;
curl https://signals.gitdealflow.com/api/signal?company&lt;span class="o"&gt;=&lt;/span&gt;airbytehq

&lt;span class="c"&gt;# paid&lt;/span&gt;
curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST https://signals.gitdealflow.com/api/agent/deep-signal &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Authorization: Bearer gdf_v2.cus_xxx.&amp;lt;your_hmac&amp;gt;"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{"name":"airbytehq"}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Buy credits at &lt;a href="https://signals.gitdealflow.com/agents/credits" rel="noopener noreferrer"&gt;signals.gitdealflow.com/agents/credits&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>ai</category>
      <category>stripe</category>
      <category>agents</category>
    </item>
    <item>
      <title>0 votes on Product Hunt, 15 upvotes on Indie Hackers — what the comments taught me that the launch didn't</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Mon, 04 May 2026 20:40:08 +0000</pubDate>
      <link>https://dev.to/data_nerd/0-votes-on-product-hunt-15-upvotes-on-indie-hackers-what-the-comments-taught-me-that-the-launch-1g9d</link>
      <guid>https://dev.to/data_nerd/0-votes-on-product-hunt-15-upvotes-on-indie-hackers-what-the-comments-taught-me-that-the-launch-1g9d</guid>
      <description>&lt;p&gt;I launched VC Deal Flow Signal on Product Hunt last Sunday. Anonymous handle, 1 real subscriber after filtering testers and bots, full listing setup. Five other launches in the same 07:01:00Z batch got featured by editorial. Mine did not. The featured_at field stayed null. 0 votes, 0 comments at T+8h.&lt;/p&gt;

&lt;p&gt;I wrote the post-mortem on Indie Hackers the same evening. Honest numbers, what worked, what didn't, no narrative repair. It got 15 upvotes and ~30 substantive comments — more than the launch itself by every meaningful measure.&lt;/p&gt;

&lt;p&gt;Here's what the comments taught me that the launch didn't.&lt;/p&gt;

&lt;h2&gt;
  
  
  The 3:1 ratio nobody saw coming
&lt;/h2&gt;

&lt;p&gt;Three free MCP installs from a single Discord message. Zero installs from the broad PH push the same day. The MCP angle outperformed the dashboard angle 3:1 on engagement, and I had built MCP as the afterthought.&lt;/p&gt;

&lt;p&gt;The IH thread named the reason cleanly: when the distribution channel and the product surface are the same thing, the leverage is completely different from driving traffic to a landing page. Cursor and MCP Community Discord users have a terminal open. They run &lt;code&gt;npx @gitdealflow/mcp-signal&lt;/code&gt; in 10 minutes. Dashboard users bookmark and return later — or never.&lt;/p&gt;

&lt;p&gt;A landing page is a billboard. An MCP install is already inside the workflow. The trust gap between those two things is enormous. You never have to sell again after someone integrates the tool.&lt;/p&gt;

&lt;h2&gt;
  
  
  The 2021 playbook trap
&lt;/h2&gt;

&lt;p&gt;The most-shared PH launch guides still reference upvoter mining, hunter-follower counts as the main lever, and batch-with-big-names timing. PH stripped public upvoter lists in 2022. The whole strategy is structurally dead in 2026. Founders are still following 4-year-old playbooks and wondering why nothing works.&lt;/p&gt;

&lt;p&gt;One commenter put it best: the half-life of bad advice on these platforms is approximately forever. Nobody retracts a 2021 tweet when the algorithm changes in 2022 — no incentive to. Which means the only person with current numbers is whoever ran the experiment last week.&lt;/p&gt;

&lt;p&gt;Post-mortems beat playbooks. That's the whole reason I wrote the IH post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cross-commenting is the highest-ROI prep move
&lt;/h2&gt;

&lt;p&gt;I drafted 19 comments on other PH launches before mine went live. That alone drove most of the profile clicks I got. Several IH commenters confirmed the same pattern from their own launches: profile recognition compounds, and the comment history you build in the three weeks before launch is worth more than the listing copy on launch day.&lt;/p&gt;

&lt;p&gt;Nobody checks your profile when you post. They check it when they're deciding whether to upvote.&lt;/p&gt;

&lt;p&gt;Ten relevant people in a comment thread really do beat a hundred casual upvotes. Casual upvoters can't refer you, can't act, and can't tell someone who will. The compounding is in the reply tree, not the upvote count on the post.&lt;/p&gt;

&lt;h2&gt;
  
  
  PH as a passive listing, not a binary launch
&lt;/h2&gt;

&lt;p&gt;The healthiest mental model that came out of the thread: stop treating Product Hunt as a binary launch event. Treat it as a passive listing that compounds when content does the lifting.&lt;/p&gt;

&lt;p&gt;The spike model is practically dead unless you already have an audience. The page exists. New content (the SSRN methodology paper, the MCP server on npm, a Chrome extension) all point at the existing PH page and let it accrete backlinks. PH is the canonical hub. Everything else is the distribution upstream of it.&lt;/p&gt;

&lt;p&gt;This reframes the relist plan entirely. Don't relaunch — accrete.&lt;/p&gt;

&lt;h2&gt;
  
  
  The B2B trust gap I didn't price in
&lt;/h2&gt;

&lt;p&gt;Anonymous handle, no personal LinkedIn, no IRL conversations about the product. I made all three calls deliberately. The editorial filter for B2B is rational about this: an anonymous handle with zero network history signals, accurately, that the buyer can't yet verify who's curating the data.&lt;/p&gt;

&lt;p&gt;The product is a signal-quality product. Signal quality is inseparable from "who's curating it and why should I trust them." A face plus a network does conversion work that copy can't.&lt;/p&gt;

&lt;p&gt;The mitigation isn't to stop being anonymous. It's to substitute different trust signals: the SSRN paper for academic credibility, the npm install for technical try, the dashboard as the upgrade path. Buyers de-anonymize in stages — academic credential, technical try, product commitment. The order matters.&lt;/p&gt;

&lt;h2&gt;
  
  
  Eight days later, the iteration is paying off
&lt;/h2&gt;

&lt;p&gt;Eight days after the PH zero, the dataset has gone from 91 venture-backed startup orgs to 109 — an 18-org refresh from the autonomous fetch-github-data run, picked up automatically in the next signal report. The MCP angle that the IH thread named as the highest-ROI surface got two Chrome extensions shipped (an existing Crunchbase + Wellfound badge plus a new GitHub-native hover lookup), both pointing back at the same dataset. A &lt;code&gt;/pricing&lt;/code&gt; page got built — six tiers from a free weekly digest to a €4,970/yr Sharp Tier for active funds, application-gated and capped at 8 funds in 2026. A &lt;code&gt;/buyers-guide&lt;/code&gt; page got built — eleven evaluation criteria for how to choose a VC deal-flow tool, each with the question to ask the vendor and how this product handles it.&lt;/p&gt;

&lt;p&gt;None of those four shipments would have been the obvious priority on PH launch day. They became the obvious priority after the IH thread named which channel was actually working. That's the practical version of the post-mortem advice: the comments named the leverage points, and the next eight days reorganised around them. The Sharp Tier had been buried on the apex landing page alone; surfacing it as a first-class &lt;code&gt;/pricing&lt;/code&gt; row roughly tripled its discoverability for the small-fund SERP, and that expansion got prioritised over running the launch playbook again.&lt;/p&gt;

&lt;p&gt;The asymmetry is what makes the post-mortem worth writing. Failed launches are common; failed launches that tell you specifically what to do next are rare. When the comments converge on the same answer from thirty different angles, you don't need a strategy meeting. You need to ship.&lt;/p&gt;

&lt;p&gt;The reorder pattern showed up in the data shape too. The free MCP server kept being the most-installed surface across the eight days, while the dashboard kept being the most-bookmarked-and-not-returned surface. Treating the MCP as the front door instead of the upgrade path meant putting the install command above the email signup on three of the four landing surfaces and adding a &lt;code&gt;/buyers-guide&lt;/code&gt; cross-link on every page where someone might be evaluating the tool against alternatives like Harmonic.ai, Dealroom, Crunchbase, Forager.ai, or Tracxn. None of that work was scheduled before the IH thread happened.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I'd do differently
&lt;/h2&gt;

&lt;p&gt;Build the credibility anchors (SSRN paper, MCP server on npm, Chrome extension) &lt;strong&gt;before&lt;/strong&gt; the PH attempt, not as parallel bets. Spend 14 days cross-commenting on PH &lt;strong&gt;before&lt;/strong&gt; the launch. Profile recognition compounds — even 30 days isn't too much.&lt;/p&gt;

&lt;p&gt;And never run the full playbook on the retry. Doing less, not more, is the move. The first launch revealed which channel actually worked. Doubling down on that one thing instead of running the full playbook again is how you compound the signal instead of diluting it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The numbers, again
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;PH: 0 votes, 0 comments, not featured, 1 real subscriber after filtering testers/bots&lt;/li&gt;
&lt;li&gt;Discord MCP angle: 3 confirmed installs from one message, ~3:1 vs dashboard&lt;/li&gt;
&lt;li&gt;Reddit pre-launch BIP threads: 740 combined views, 10 substantive comments&lt;/li&gt;
&lt;li&gt;IH post-mortem: 15 upvotes, ~30 commenters, the most useful distribution data I've gotten in 11 days of building&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The willingness to publish zeros is rarer than it should be. It's also the thing that turns a failed launch into a credibility asset.&lt;/p&gt;

&lt;p&gt;If you want the raw thread that produced these lessons, it lives at &lt;a href="https://www.indiehackers.com" rel="noopener noreferrer"&gt;indiehackers.com&lt;/a&gt; — search for "What worked and what didn't — Sunday PH launch with zero email list."&lt;/p&gt;

&lt;p&gt;The product is at &lt;a href="https://signals.gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=ph-launch-postmortem" rel="noopener noreferrer"&gt;signals.gitdealflow.com&lt;/a&gt;. The MCP server is &lt;code&gt;npx @gitdealflow/mcp-signal&lt;/code&gt;. The methodology paper is at &lt;a href="https://ssrn.com/abstract=6606558" rel="noopener noreferrer"&gt;ssrn.com/abstract=6606558&lt;/a&gt;. All free.&lt;/p&gt;

&lt;p&gt;The launch was one bet of seven I'm running this week. The post-mortem turned into the eighth.&lt;/p&gt;

</description>
      <category>startup</category>
      <category>opensource</category>
      <category>mcp</category>
      <category>marketing</category>
    </item>
    <item>
      <title>I tracked 4,200 startup GitHub orgs for six months — here's what actually predicts a fundraise</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Mon, 04 May 2026 20:39:30 +0000</pubDate>
      <link>https://dev.to/data_nerd/i-tracked-4200-startup-github-orgs-for-six-months-heres-what-actually-predicts-a-fundraise-52cd</link>
      <guid>https://dev.to/data_nerd/i-tracked-4200-startup-github-orgs-for-six-months-heres-what-actually-predicts-a-fundraise-52cd</guid>
      <description>&lt;p&gt;I started this six months ago because nobody else seemed to. Hedge funds spent the last decade extracting alpha from satellite imagery, credit-card panels, parking-lot photos. The venture-capital equivalent — public engineering activity on GitHub — was sitting in plain sight, and most institutional sourcing teams I knew still ran on Crunchbase, warm intros, and Twitter. So I built a crawler.&lt;/p&gt;

&lt;p&gt;That sentence is short. The reality wasn't.&lt;/p&gt;

&lt;h2&gt;
  
  
  The first crawler melted my Postgres pool
&lt;/h2&gt;

&lt;p&gt;The first version was a Python script that hit &lt;code&gt;/repos/{org}/events&lt;/code&gt; for every org on the list, every hour, with a single connection. It worked for 80 orgs. By the time I'd seeded 1,200 orgs into the watchlist, I was hitting GitHub's secondary rate limits inside 12 minutes and my Postgres connection pool was burning to the ground. The script was opening a new connection for every API response, and the connections weren't recycling because I'd written &lt;code&gt;psycopg.connect()&lt;/code&gt; inside a loop instead of using a pool. Standard mistake. Embarrassing mistake.&lt;/p&gt;

&lt;p&gt;The fix wasn't more connections — it was fewer requests.&lt;/p&gt;

&lt;p&gt;GitHub Archive (&lt;a href="https://www.gharchive.org/" rel="noopener noreferrer"&gt;gharchive.org&lt;/a&gt;) publishes every public event from GitHub as hourly JSON dumps. Every &lt;code&gt;PushEvent&lt;/code&gt;, every &lt;code&gt;PullRequestEvent&lt;/code&gt;, every &lt;code&gt;CreateEvent&lt;/code&gt;, every &lt;code&gt;WatchEvent&lt;/code&gt;. You don't have to ask GitHub for them — you download a 100MB gzipped JSONL file per hour and stream it. For a watchlist of 4,200 orgs, that's two orders of magnitude less work than polling.&lt;/p&gt;

&lt;p&gt;The new pipeline:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Hourly cron, runs at :03 to give Archive time to publish&lt;/span&gt;
&lt;span class="nv"&gt;HOUR&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; &lt;span class="nt"&gt;-u&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'1 hour ago'&lt;/span&gt; +%Y-%m-%d-%H&lt;span class="si"&gt;)&lt;/span&gt;
curl &lt;span class="nt"&gt;-s&lt;/span&gt; &lt;span class="s2"&gt;"https://data.gharchive.org/&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;HOUR&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.json.gz"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  | &lt;span class="nb"&gt;gunzip&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  | jq &lt;span class="nt"&gt;-r&lt;/span&gt; &lt;span class="nt"&gt;--argjson&lt;/span&gt; orgs &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$ORGS&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s1"&gt;'(.repo.name | split("/")[0]) as $o | select($orgs | index($o)) | [.created_at, $o, .repo.name, .actor.login, .type, (.payload | tostring)] | @csv'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  | psql &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="s2"&gt;"COPY events_raw FROM STDIN WITH (FORMAT csv);"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's not pretty. It works. Hourly batches mean I'm always at most an hour behind real time, and I can backfill the whole previous year in an afternoon if I need to re-run the backtest.&lt;/p&gt;

&lt;h2&gt;
  
  
  The schema is two tables
&lt;/h2&gt;

&lt;p&gt;I tried elaborate schemas first. Junction tables, contributor graphs, separate stores for each event type. None of them paid for themselves. What I actually use, six months in, is two tables and a materialized view.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;events_raw&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;ts&lt;/span&gt;          &lt;span class="n"&gt;timestamptz&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;org&lt;/span&gt;         &lt;span class="nb"&gt;text&lt;/span&gt;        &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;repo&lt;/span&gt;        &lt;span class="nb"&gt;text&lt;/span&gt;        &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;actor&lt;/span&gt;       &lt;span class="nb"&gt;text&lt;/span&gt;        &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;event_type&lt;/span&gt;  &lt;span class="nb"&gt;text&lt;/span&gt;        &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;payload&lt;/span&gt;     &lt;span class="n"&gt;jsonb&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;repo&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_type&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;INDEX&lt;/span&gt; &lt;span class="n"&gt;idx_events_org_ts&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;events_raw&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ts&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;orgs_watchlist&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;org&lt;/span&gt;         &lt;span class="nb"&gt;text&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;sector&lt;/span&gt;      &lt;span class="nb"&gt;text&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;added_at&lt;/span&gt;    &lt;span class="nb"&gt;date&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;notes&lt;/span&gt;       &lt;span class="nb"&gt;text&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The materialized view rolls up the per-org metrics weekly. Refreshing it concurrently takes about 90 seconds against six months of data. I rebuild it Sunday nights so Monday's report is fresh.&lt;/p&gt;

&lt;p&gt;That's it. No graph database. No data lake. No Airflow. The whole stack runs on a single Postgres instance with about 18GB of data, and the weekly report is a single SQL query I read in a terminal.&lt;/p&gt;

&lt;h2&gt;
  
  
  What the signal looks like, end-to-end
&lt;/h2&gt;

&lt;p&gt;The single most predictive feature is not commit volume — it's commit velocity &lt;em&gt;change&lt;/em&gt;. A startup that ships 200 commits a week and continues to ship 200 a week tells me nothing. A startup that goes from 80 to 240 inside 14 days tells me something organizational has changed.&lt;/p&gt;

&lt;p&gt;I track three derivative metrics per org:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Commit velocity&lt;/strong&gt; over a rolling 14-day window&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Contributor delta&lt;/strong&gt; over a rolling 30-day window&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;New-repo creation rate&lt;/strong&gt; over a rolling 30-day window&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When all three accelerate inside the same fortnight — each one breaking above its own org-specific six-month z-score — I classify the org as "accelerating." In a backtest across Q3 and Q4 2025, roughly 70% of accelerating orgs announced a fundraise within six weeks. The lead time was 3-6 weeks for Series A and shorter for late-stage rounds.&lt;/p&gt;
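
&lt;p&gt;The classification rule is small enough to show. A sketch, using the three metric names from the list above; the z computation and the threshold of 2 are my assumptions, since the post only says each metric breaks above its own org-specific six-month z-score:&lt;/p&gt;

```javascript
// Per-org z-score: compare the latest window against that org's own
// baseline, so a quiet team's breakout isn't drowned out by a loud
// team's steady state.
function zScore(history, latest) {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance = history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const sd = Math.sqrt(variance);
  return sd === 0 ? 0 : (latest - mean) / sd;
}

// org maps each metric to { history: number[], latest: number }
const METRICS = ["commitVelocity", "contributorDelta", "newRepoRate"];

function isAccelerating(org, threshold = 2) {
  return METRICS.every((m) => zScore(org[m].history, org[m].latest) > threshold);
}
```

&lt;p&gt;The per-org baseline is the whole trick: the same absolute commit count can be a breakout for one team and background noise for another.&lt;/p&gt;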

&lt;h2&gt;
  
  
  Four signal flavors, not one
&lt;/h2&gt;

&lt;p&gt;Not every acceleration looks the same. After eyeballing a few hundred firings I started classifying them into four shapes:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Engineering hiring burst.&lt;/strong&gt; Contributor count jumps 40%+ inside 30 days. Often pre-Series A. Term sheet has been signed; the new engineers are pushing first commits. I catch this earlier than LinkedIn employee counts because contributions land before the new hires update their job titles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Infrastructure buildout.&lt;/strong&gt; Commits to ops, infra, deploy, observability repos spike. Company is preparing to scale. Usually accompanies a Series A or B raised to fund go-to-market.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deploy frequency spike.&lt;/strong&gt; Commits per day double. Often a launch run-up. Sometimes followed by a fundraise where the metrics make the deck.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Framework migration.&lt;/strong&gt; Team migrating to a new stack — Next.js, Bun, a new ORM, a fresh CI. Often 60-120 days before a Series A. The engineering equivalent of cleaning your apartment before parents visit.&lt;/p&gt;

&lt;p&gt;The classification matters because the same numeric "accelerating" label can mean very different things. A hiring burst is a sourcing signal — go talk to the founder &lt;em&gt;now&lt;/em&gt;, before the round closes. A framework migration is a watch signal — set a 60-day alarm and see if a round materializes.&lt;/p&gt;
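
&lt;p&gt;Mechanically, the four shapes reduce to a small dispatcher. Every input name and cutoff below is invented for illustration; only the labels and the 40% contributor jump come from the descriptions above:&lt;/p&gt;

```javascript
// Toy dispatcher from window-over-window deltas to the four signal flavors.
// Checked in priority order; a hiring burst outranks the other shapes
// because it is the act-now sourcing signal.
function classifyFlavor(d) {
  if (d.contributorGrowth30d >= 0.4) return "hiring-burst";        // 40%+ jump, per above
  if (d.infraCommitShare >= 0.5) return "infra-buildout";          // invented cutoff
  if (d.commitsPerDayRatio >= 2.0) return "deploy-spike";          // commits/day doubled
  if (d.migrationCommitShare >= 0.3) return "framework-migration"; // invented cutoff
  return "unclassified";
}
```

&lt;p&gt;The priority order encodes the sourcing-versus-watch distinction: the act-now shape wins ties.&lt;/p&gt;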

&lt;h2&gt;
  
  
  Where the signal fails
&lt;/h2&gt;

&lt;p&gt;Honesty pass: it's bad for pure-AI startups. They commit constantly regardless of stage. Signal-to-noise is poor. I exclude AI-only orgs from the strongest classification tier and weight contributor and repo signals more heavily for them.&lt;/p&gt;

&lt;p&gt;It's also useless for stealth startups. If the company doesn't open-source anything, GitHub gives me nothing. About 18% of the orgs I'd seeded turned out to fit this profile and got dropped from the watchlist after the first month.&lt;/p&gt;

&lt;p&gt;And the signal is not investment advice. It tells you who to talk to. It does not tell you who to wire money to. A founder conversation, product evaluation, market analysis, and competitive teardown all still have to happen. Engineering velocity is a sourcing signal, not a thesis.&lt;/p&gt;

&lt;h2&gt;
  
  
  The part I got wrong
&lt;/h2&gt;

&lt;p&gt;For the first three months I weighted commit &lt;em&gt;count&lt;/em&gt; more heavily than commit &lt;em&gt;velocity change&lt;/em&gt;. I assumed high-volume orgs were higher-quality leads. They weren't. They were just bigger. The orgs that ended up raising weren't the loudest in absolute terms — they were the ones whose own quiet baseline suddenly broke. Once I switched to per-org z-scores, the noise dropped and the small-team signals rose to the top.&lt;/p&gt;

&lt;p&gt;The other thing I got wrong was treating star count as a proxy for anything. Vanity. A 30,000-star repo means it had a viral moment. Sometimes that moment was three years ago and the team has shipped nothing since. Star count is now in my dataset only because removing fields is harder than ignoring them.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I do with this every week
&lt;/h2&gt;

&lt;p&gt;Sunday night the materialized view refreshes. Monday morning I run a single query that surfaces orgs with two or more accelerating signals. The output is usually 10-30 candidates. I open each one, skim the last week of commits to make sure it's real product work and not version bumps, and write a one-line note per company. The notes go in a doc that's been growing since November. I publish the strongest ones to a public watchlist and let people verify the predictions themselves.&lt;/p&gt;

&lt;p&gt;That last part is the unfair part. If the signal is real, the only way to prove it is to publish dated predictions and let them age. Six months of public dated predictions is what built my confidence in the methodology more than any backtest. The backtest tells you what &lt;em&gt;would have&lt;/em&gt; worked. The public watchlist tells you what &lt;em&gt;did&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;The crawler still runs hourly. Postgres is still the only database. The Sunday refresh still takes 90 seconds. None of it is fancy. The interesting work was figuring out which two of five signals had to overlap before I'd trust any of them — and that's not infrastructure, it's pattern discipline.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://signals.gitdealflow.com/blog/i-tracked-4200-startup-github-orgs-six-months?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=i-tracked-4200-startup-github-orgs-six-months" rel="noopener noreferrer"&gt;signals.gitdealflow.com&lt;/a&gt;. The weekly Signal Report is free — no paywall, no account.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>startup</category>
      <category>github</category>
      <category>opensource</category>
    </item>
    <item>
      <title>What I shipped in 24 hours after my Product Hunt launch failed</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Tue, 28 Apr 2026 09:33:54 +0000</pubDate>
      <link>https://dev.to/data_nerd/what-i-shipped-in-24-hours-after-my-product-hunt-launch-failed-abf</link>
      <guid>https://dev.to/data_nerd/what-i-shipped-in-24-hours-after-my-product-hunt-launch-failed-abf</guid>
      <description>&lt;h3&gt;
  
  
  The setup
&lt;/h3&gt;

&lt;p&gt;I built &lt;a href="https://gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=ph-launch-failure-shipped-6-things" rel="noopener noreferrer"&gt;GitDealFlow&lt;/a&gt; — a tool that tracks GitHub engineering acceleration on 4,200+ venture-backed startups and surfaces breakouts before they raise.&lt;/p&gt;

&lt;p&gt;I scheduled a Product Hunt launch for April 26, 2026 at 07:01 UTC. The maker dashboard accepted the post. The listing appeared at &lt;code&gt;producthunt.com/posts/vc-deal-flow-signal&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;But the post never made it onto the daily leaderboard.&lt;/p&gt;

&lt;p&gt;I figured this out by querying the PH GraphQL API directly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight graphql"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="n"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"vc-deal-flow-signal"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;featuredAt&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;votesCount&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;commentsCount&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"featuredAt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"votesCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"commentsCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Every other product that launched at the same &lt;code&gt;07:01:00Z&lt;/code&gt; timestamp had &lt;code&gt;featuredAt: 2026-04-26T07:01:00Z&lt;/code&gt;. Mine had &lt;code&gt;null&lt;/code&gt;. The post existed but was never surfaced in PH's discovery feed.&lt;/p&gt;

&lt;p&gt;I emailed &lt;code&gt;launches@producthunt.com&lt;/code&gt; asking whether this was a moderation hold or an automated filter. I'm still waiting for a reply.&lt;/p&gt;

&lt;p&gt;So at T+7 hours, when the day was clearly burned, I had a choice. Refresh the maker dashboard and spiral, or treat the day as found time and ship.&lt;/p&gt;

&lt;p&gt;I shipped. Here's the inventory.&lt;/p&gt;




&lt;h3&gt;
  
  
  1. Launch banner with countdown timer (30 min)
&lt;/h3&gt;

&lt;p&gt;The simplest thing on the list. A small banner across the top of the landing page showing:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;code&gt;PH50OFF&lt;/code&gt; — 50% off Insider Circle. Expires in 4d 12h.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A live countdown timer driven by the difference between &lt;code&gt;Date.now()&lt;/code&gt; and a hardcoded expiry. CSS does the rest.&lt;/p&gt;

&lt;p&gt;Why this matters: scarcity copy lifts conversion on visitor-to-trial paths by ~10-30% in most A/B tests I've read. I'd known this for months and never built it. The PH-failure focus made me actually do it.&lt;/p&gt;

&lt;p&gt;Stack: React, no library.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;LaunchBanner&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;expiry&lt;/span&gt; &lt;span class="p"&gt;}:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;expiry&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;now&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setNow&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
  &lt;span class="nf"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;t&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;setInterval&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;setNow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;clearInterval&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;t&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[]);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ms&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;expiry&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getTime&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;now&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;days&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;floor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;ms&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;86&lt;/span&gt;&lt;span class="nx"&gt;_400_000&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;hours&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;floor&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;ms&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="mi"&gt;86&lt;/span&gt;&lt;span class="nx"&gt;_400_000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="nx"&gt;_600_000&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"bg-amber-500/10 px-4 py-2 text-center text-sm"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      PH50OFF — 50% off Insider Circle. Expires in &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;days&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;d &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;hours&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;h.
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;
  
  
  2. 10 new pSEO pages for AI agent frameworks (90 min)
&lt;/h3&gt;

&lt;p&gt;Each page targets a long-tail query like "use CrewAI with GitDealFlow MCP" and contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;3-step install with copy-pasteable code&lt;/li&gt;
&lt;li&gt;One worked example (research a startup, return GitHub signal)&lt;/li&gt;
&lt;li&gt;Link out to the framework's docs&lt;/li&gt;
&lt;li&gt;CTA to the free predict tool&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The 10 frameworks: CrewAI, Mastra, Pydantic AI, Continue, Cline, Aider, Inngest, n8n, Zapier, Autogen. Combined with 5 from earlier in the week, that's 15 of a planned 20.&lt;/p&gt;

&lt;p&gt;Indexation lag means this isn't Monday traffic. It's June traffic. But it costs almost nothing once you have the template, and these pages compound forever.&lt;/p&gt;

&lt;p&gt;I won't paste the template here because it's pretty boring (&lt;code&gt;generateStaticParams&lt;/code&gt; + static MDX). The interesting part is choosing the integration list — pick frameworks that have real adoption (&amp;gt;10k GitHub stars or VC backing), not "comprehensive coverage." Long tail is for traffic, not completionism.&lt;/p&gt;
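&lt;p&gt;For the curious, the boring part looks roughly like this — a sketch of the &lt;code&gt;generateStaticParams&lt;/code&gt; shape, with the slugs taken from the framework list above (the file path and param name are my guesses, not the real repo):&lt;/p&gt;

```typescript
// Sketch: in a real Next.js app this would be exported from something like
// app/integrations/[framework]/page.tsx. Slugs assumed from the list above.
const FRAMEWORKS = [
  "crewai", "mastra", "pydantic-ai", "continue", "cline",
  "aider", "inngest", "n8n", "zapier", "autogen",
] as const;

// One statically generated route per framework slug.
function generateStaticParams(): { framework: string }[] {
  return FRAMEWORKS.map((framework) => ({ framework }));
}
```

&lt;p&gt;Each slug then resolves to a static MDX body rendered from the same template.&lt;/p&gt;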




&lt;h3&gt;
  
  
  3. HMAC-signed share token endpoint (45 min)
&lt;/h3&gt;

&lt;p&gt;Probably the highest-leverage thing I shipped. Most viral-loop tutorials skip this step.&lt;/p&gt;

&lt;p&gt;The setup: when a scout shares their Scout Score from /receipts, the share URL goes through &lt;code&gt;/share/[token]&lt;/code&gt; first. The recipient lands on a "[someone in your network] just shared their score with you — here's your 7-day extended preview" page and gets a cookie that unlocks deeper content.&lt;/p&gt;

&lt;p&gt;Both ends get value:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The sharer feels good (their score sent value to a friend)&lt;/li&gt;
&lt;li&gt;The recipient gets a thing (the unlock + cookie)&lt;/li&gt;
&lt;li&gt;The platform gets distribution (the cookie correlates with much higher conversion later)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Greg Isenberg writes about this as "every share needs to be useful to the recipient." If the share is just "look at my score," it dies. If the share unlocks something for the recipient, it spreads.&lt;/p&gt;

&lt;p&gt;The HMAC bit:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;createHmac&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timingSafeEqual&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;crypto&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;SECRET&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;SHARE_TOKEN_SECRET&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kr"&gt;interface&lt;/span&gt; &lt;span class="nx"&gt;SharePayload&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;        &lt;span class="c1"&gt;// sharer handle&lt;/span&gt;
  &lt;span class="nl"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;        &lt;span class="c1"&gt;// expiration unix-ms&lt;/span&gt;
  &lt;span class="nl"&gt;k&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;        &lt;span class="c1"&gt;// kind: signal-card, scout, predict&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;generateShareToken&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;sharer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;kind&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;daysValid&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;payload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;sharer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;daysValid&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;86&lt;/span&gt;&lt;span class="nx"&gt;_400_000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;k&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;kind&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;toString&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;base64url&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;sig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createHmac&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sha256&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;SECRET&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;digest&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;hex&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;sig&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;verifyShareToken&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="nx"&gt;SharePayload&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;sig&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;token&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;sig&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;expected&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createHmac&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sha256&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;SECRET&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;digest&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;hex&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nf"&gt;timingSafeEqual&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;expected&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;hex&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="nx"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;sig&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;hex&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;payload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;base64url&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toString&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;utf8&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Tokens are unforgeable, time-bound, and embed the sharer identity for analytics later.&lt;/p&gt;
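&lt;p&gt;If you want to sanity-check the scheme, here's a condensed, self-contained version of the same helpers with a round trip at the end (secret hardcoded for the sketch — in production it comes from the env). One gotcha worth keeping: &lt;code&gt;timingSafeEqual&lt;/code&gt; throws on buffers of unequal length, so reject malformed signatures before comparing:&lt;/p&gt;

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Stand-in for process.env.SHARE_TOKEN_SECRET — sketch only, never hardcode this.
const SECRET = "dev-only-secret";

const sign = (body: string) =>
  createHmac("sha256", SECRET).update(body).digest("hex");

function makeToken(sharer: string, kind: string, daysValid = 7): string {
  const body = Buffer.from(
    JSON.stringify({ s: sharer, e: Date.now() + daysValid * 86_400_000, k: kind })
  ).toString("base64url");
  return `${body}.${sign(body)}`;
}

function checkToken(token: string): { s: string; e: number; k: string } | null {
  const [body, sig] = token.split(".");
  // Reject malformed sigs up front: timingSafeEqual throws on length mismatch.
  if (!body || !sig || !/^[0-9a-f]{64}$/.test(sig)) return null;
  const expected = sign(body);
  if (!timingSafeEqual(Buffer.from(expected, "hex"), Buffer.from(sig, "hex"))) return null;
  const payload = JSON.parse(Buffer.from(body, "base64url").toString("utf8"));
  return payload.e < Date.now() ? null : payload;
}

// A valid token round-trips; a tampered one verifies to null.
const ok = checkToken(makeToken("data_nerd", "scout"));
const bad = checkToken(makeToken("data_nerd", "scout") + "00");
```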




&lt;h3&gt;
  
  
  4. Cursor collaboration recipe blog post (2 hr)
&lt;/h3&gt;

&lt;p&gt;Cursor has 500k+ developers. My MCP server gives any AI assistant access to GitHub momentum data. Combining the two lets a Cursor user say "research this startup as a deal-flow agent would" and get back a structured signal.&lt;/p&gt;

&lt;p&gt;Wrote a long-form blog post with full setup, the MCP config block, and 4 example prompts that surface real differentiation:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;"What's the contributor velocity trend on Vercel over the last 90 days?"&lt;/li&gt;
&lt;li&gt;"Find OSS projects that match Cline's growth pattern from 6 months ago."&lt;/li&gt;
&lt;li&gt;"Score this GitHub URL: github.com/"&lt;/li&gt;
&lt;li&gt;"List 5 startups that crossed 10k stars in the last 30 days."&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Held the cold-outreach email to the Cursor team until my domain warmup completes (~10 days from now), but the post is live and the recipe is reproducible.&lt;/p&gt;
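&lt;p&gt;For reference, the config block in that post looks roughly like this — the package name is the real one, but the file path (&lt;code&gt;.cursor/mcp.json&lt;/code&gt;) and shape follow Cursor's MCP convention as I understand it, so double-check against the live post:&lt;/p&gt;

```json
{
  "mcpServers": {
    "gitdealflow": {
      "command": "npx",
      "args": ["-y", "@gitdealflow/mcp-signal"]
    }
  }
}
```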




&lt;h3&gt;
  
  
  5. A2A (Agent-to-Agent) launch-status skill (1 hr)
&lt;/h3&gt;

&lt;p&gt;This one's a little inside-baseball. The Agent2Agent (A2A) protocol lets external AI agents call my server and ask for structured info via JSON-RPC. I'd shipped a launch-status skill that returned "we're live on Product Hunt today!" as part of the launch-day push.&lt;/p&gt;

&lt;p&gt;After PH didn't feature, that response became false.&lt;/p&gt;

&lt;p&gt;I had two choices: kill the skill (lose the agent-discoverability), or harden it to return calibrated state. I chose the second.&lt;/p&gt;

&lt;p&gt;The hardened endpoint:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fetches PH GraphQL for the post's &lt;code&gt;featuredAt&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Sets &lt;code&gt;active: true&lt;/code&gt; only when &lt;code&gt;featuredAt&lt;/code&gt; is non-null&lt;/li&gt;
&lt;li&gt;Returns &lt;code&gt;state: "scheduled-pending-feature"&lt;/code&gt; honestly when it's not&lt;/li&gt;
&lt;li&gt;Caches with &lt;code&gt;stale: true&lt;/code&gt; if the GraphQL call rate-limits&lt;/li&gt;
&lt;li&gt;Auto-flips when PH eventually features (if they do)&lt;/li&gt;
&lt;/ul&gt;
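&lt;p&gt;The core of the hardening is a tiny pure mapping from &lt;code&gt;featuredAt&lt;/code&gt; to the reported state — something like this sketch (field names are my guesses at the skill's response shape, not the actual API):&lt;/p&gt;

```typescript
// Sketch: map PH's featuredAt to a calibrated launch state.
// Field names here are assumptions, not the real skill's schema.
interface LaunchStatus {
  active: boolean;
  state: "live" | "scheduled-pending-feature";
  featuredAt: string | null;
}

function launchStatus(featuredAt: string | null): LaunchStatus {
  return featuredAt !== null
    ? { active: true, state: "live", featuredAt }
    : { active: false, state: "scheduled-pending-feature", featuredAt: null };
}
```

&lt;p&gt;The fetch-plus-cache layer wraps this, so once PH sets &lt;code&gt;featuredAt&lt;/code&gt;, the next uncached call flips the state on its own.&lt;/p&gt;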

&lt;p&gt;Plot twist: the calibrated-honesty version is a STRONGER demo than the original "we're live!" angle. AI agents downstream are graded on calibration. Cherry-picked claims would have leaked into a downstream agent's reasoning and hurt its accuracy. Honest state is robust.&lt;/p&gt;

&lt;p&gt;If you're building agent-discoverable APIs, never return cherry-picked positives. Build for calibration from day one.&lt;/p&gt;




&lt;h3&gt;
  
  
  6. OG launch card template (90 min)
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;@vercel/og&lt;/code&gt; template that auto-renders a 1200x630 PNG with the launch countdown, baked in. When anyone tweets a GitDealFlow link, the tweet preview shows a branded card with the live countdown — no manual sharing flow needed.&lt;/p&gt;

&lt;p&gt;This builds an OG-image library I can reuse for every future launch (Q3 paid plan launch, MCP v2 launch, etc.).&lt;/p&gt;

&lt;p&gt;Cost is dominated by font choice and copy iteration, not by code. The template itself is ~80 lines.&lt;/p&gt;
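&lt;p&gt;The only logic worth showing is the countdown formatting that gets baked into the PNG — a helper along these lines (name and signature are mine, not the actual template's):&lt;/p&gt;

```typescript
// Formats remaining time as "4d 12h" for the OG card; clamps at zero once expired.
function formatCountdown(expiryMs: number, nowMs: number): string {
  const ms = Math.max(0, expiryMs - nowMs);
  const days = Math.floor(ms / 86_400_000);
  const hours = Math.floor((ms % 86_400_000) / 3_600_000);
  return `${days}d ${hours}h`;
}
```

&lt;p&gt;In the &lt;code&gt;@vercel/og&lt;/code&gt; route this string is interpolated into the JSX before the image renders, so the card is only as "live" as the request that generated it.&lt;/p&gt;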




&lt;h3&gt;
  
  
  What this cost in total
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Total focused ship time: ~7 hours&lt;/li&gt;
&lt;li&gt;Total dollars: $0&lt;/li&gt;
&lt;li&gt;Total PH-launch traffic: effectively zero (the failure case)&lt;/li&gt;
&lt;li&gt;Total compounding marketing surface added: 6 ships that all work tomorrow&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Product Hunt was a single-day window. These six ships compound forever.&lt;/p&gt;




&lt;h3&gt;
  
  
  What I learned
&lt;/h3&gt;

&lt;p&gt;Every founder I know waits for one big launch event. PH. HN front page. Press hit. Then if it doesn't pop, the rest of the quarter feels heavier.&lt;/p&gt;

&lt;p&gt;The hidden lesson, which I half-knew but didn't act on until yesterday: if you ship 6 small things instead of staking your week on one launch event, you don't need the launch event. You ARE the launch event, every day.&lt;/p&gt;

&lt;p&gt;This isn't original. It's just what the math shows when you compare 1 day of attention to 6 always-on assets.&lt;/p&gt;

&lt;p&gt;The PH-launch failure was, in retrospect, the unlock.&lt;/p&gt;

&lt;p&gt;If you're building something and you're stuck waiting for an event, my honest tactical advice: pick 6 things, give yourself a 24-hour shot clock, and ship.&lt;/p&gt;




&lt;h3&gt;
  
  
  Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The MCP server I keep mentioning: &lt;a href="https://www.npmjs.com/package/@gitdealflow/mcp-signal" rel="noopener noreferrer"&gt;npm: @gitdealflow/mcp-signal&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The free predict tool: &lt;a href="https://signals.gitdealflow.com/predict?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=ph-launch-failure-shipped-6-things" rel="noopener noreferrer"&gt;signals.gitdealflow.com/predict&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The peer-grade methodology paper: &lt;a href="https://ssrn.com/abstract=6606558" rel="noopener noreferrer"&gt;SSRN abstract=6606558&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you want the weekly Sunday signal report (free, no spam): &lt;a href="https://gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=ph-launch-failure-shipped-6-things" rel="noopener noreferrer"&gt;gitdealflow.com&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Drop a comment if you've ever had a launch event that didn't pop. What did you ship instead?&lt;/p&gt;

&lt;p&gt;— &lt;a href="https://dev.to/the_data_nerd"&gt;@The_Data_Nerd&lt;/a&gt;&lt;/p&gt;

</description>
      <category>indiehackers</category>
      <category>productivity</category>
      <category>productdevelopment</category>
      <category>marketing</category>
    </item>
    <item>
      <title>Five GitHub signals that kept predicting seed rounds three weeks early</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Mon, 27 Apr 2026 08:49:57 +0000</pubDate>
      <link>https://dev.to/data_nerd/five-github-signals-that-kept-predicting-seed-rounds-three-weeks-early-56ja</link>
      <guid>https://dev.to/data_nerd/five-github-signals-that-kept-predicting-seed-rounds-three-weeks-early-56ja</guid>
      <description>&lt;p&gt;The first time I almost called a fundraise wrong, I was looking at a contributor jump on a small fintech org. Five contributors to twelve in eleven days. Clean step function, the kind that shows up in a chart and makes you reach for Slack. I drafted a message to an investor friend — "this team raised, watch for the announcement in the next two weeks" — and then I caught myself. I'd seen the same exact shape fire six times that quarter, and only twice did it actually map to a round. The other four were a hackathon, a Google Summer of Code cohort, an open-source contributor sprint, and a team that committed under personal accounts so a re-org made the count jump.&lt;/p&gt;

&lt;p&gt;I deleted the message.&lt;/p&gt;

&lt;p&gt;That was the day I stopped trusting any single GitHub signal. After six months of watching commit traffic across roughly 4,200 startup orgs, I have five patterns I take seriously and one rule that holds them together: no signal fires alone. If I see one, I shrug. If I see two firing inside the same fortnight, I get out of my chair.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pattern 1 — the contributor step function
&lt;/h2&gt;

&lt;p&gt;The most-cited pattern is also the most easily fooled. A startup goes from 5 contributors to 12 in two weeks. The hypothesis is that the round closed, the team ramped hiring, and the new engineers are pushing first commits. The hypothesis is sometimes correct. It's often noise.&lt;/p&gt;

&lt;p&gt;What I actually look for: a 50% jump in unique contributors &lt;em&gt;sustained&lt;/em&gt; for at least 4 weeks, and the new contributors must show up across multiple repos in the org, not just a single docs sprint. If the new contributors only commit to one repo and disappear after 10 days, that's a hackathon. If they ship across the codebase and stick, that's hiring.&lt;/p&gt;

&lt;p&gt;Here's the SQL I run weekly against my GitHub Archive mirror. Nothing fancy — but the join on &lt;code&gt;last_commit_at&lt;/code&gt; is what kills the hackathon false positives:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;new_contribs&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;login&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;user&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;repo&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;org&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
         &lt;span class="k"&gt;MIN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;created_at&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;first_commit_at&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
         &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;created_at&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;last_commit_at&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
         &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;DISTINCT&lt;/span&gt; &lt;span class="n"&gt;repo&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;repos_touched&lt;/span&gt;
  &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;gh_archive_events&lt;/span&gt;
  &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'PushEvent'&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;created_at&lt;/span&gt; &lt;span class="k"&gt;BETWEEN&lt;/span&gt; &lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;interval&lt;/span&gt; &lt;span class="s1"&gt;'60 days'&lt;/span&gt; &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;retained_new_contribs&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;new_contribs&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;first_commit_at&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;interval&lt;/span&gt; &lt;span class="s1"&gt;'30 days'&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;last_commit_at&lt;/span&gt;  &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;interval&lt;/span&gt; &lt;span class="s1"&gt;'7 days'&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;repos_touched&lt;/span&gt;   &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;retained_new_contribs&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;repos_touched &amp;gt;= 2&lt;/code&gt; filter alone drops about 40% of the false positives I was getting before. That number isn't theoretical — it's what the noise reduction looked like the week I added it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pattern 2 — the infrastructure explosion
&lt;/h2&gt;

&lt;p&gt;The pattern: three to five new public repos appearing in the same org inside 30 days, all of them infra-shaped: SDKs, internal tools, deploy configs, Terraform modules. Not a fork sprint. Not a docs-site spinoff. Real platform plumbing.&lt;/p&gt;

&lt;p&gt;This one is rare, but when it fires it almost never fires alone. Companies don't invest in platform engineering speculatively — they do it when there's runway to spend on it. I caught a YC company doing this two months before their Series A announcement. The trigger was four new repos in a week named after AWS services, all created by the CTO, all with initial commits that read like checked-in scaffolding rather than code anyone had been working on locally.&lt;/p&gt;

&lt;p&gt;The trick: read the &lt;em&gt;first&lt;/em&gt; commit on each new repo. If it lands as a single 200-file commit pushed with no prior history behind it, that's a private repo going public. If it's a lone &lt;code&gt;README.md&lt;/code&gt;, that's somebody starting from zero. The second one is the interesting signal — fresh starts in infra-land mean the team has the breathing room to build new things.&lt;/p&gt;
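
&lt;p&gt;The heuristic is mechanical enough to automate. A minimal sketch, assuming you've already pulled the oldest commit's changed-file count from the GitHub commits API; the exact thresholds here are my assumptions, not a documented rule:&lt;/p&gt;

```typescript
// Classify a repo's first commit by how many files it touched.
// fileCount is assumed to come from the oldest commit returned by the
// GitHub commits API; the 100 / 3 cutoffs are illustrative guesses.
type FirstCommitKind = "private-going-public" | "fresh-start" | "ambiguous";

function classifyFirstCommit(fileCount: number): FirstCommitKind {
  if (fileCount >= 100) return "private-going-public"; // e.g. a 200-file dump
  if (fileCount <= 3) return "fresh-start";            // e.g. a lone README.md
  return "ambiguous";                                  // needs a human look
}
```

&lt;p&gt;Fresh starts are the ones that go on the watchlist; dumps just tell you a repo changed visibility.&lt;/p&gt;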

&lt;h2&gt;
  
  
  Pattern 3 — weekend commits across multiple humans
&lt;/h2&gt;

&lt;p&gt;A startup whose commit log goes from weekday-only to seven-days-a-week is racing toward something. The catch: weekend activity from a solo founder means nothing. They were always doing this. Weekend activity from three or four contributors, sustained for three weekends in a row, means there's a deadline.&lt;/p&gt;

&lt;p&gt;The deadlines that produce this pattern, ranked by frequency in my data:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A demo for a fundraise (usually Series A or later — pre-seed teams are too small to pull weekend ensembles).&lt;/li&gt;
&lt;li&gt;A product launch with a press cycle attached.&lt;/li&gt;
&lt;li&gt;A competitive response to a recent move from a bigger company in the same lane.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Two of those three are interesting to investors. The third is interesting to me as an engineer because it's usually visible 10 days before the move shows up on Hacker News.&lt;/p&gt;
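
&lt;p&gt;The three-weekends rule above is straightforward to compute once you have commit timestamps and authors. A sketch, with a hypothetical &lt;code&gt;Commit&lt;/code&gt; shape (adapt to whatever your event pipeline emits):&lt;/p&gt;

```typescript
// Fires when >= minAuthors distinct people commit on each of `weekends`
// consecutive weekends. The Commit shape is hypothetical.
interface Commit { author: string; timestamp: Date; }

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function weekendEnsemble(commits: Commit[], minAuthors = 3, weekends = 3): boolean {
  // Distinct weekend authors per epoch-week bucket. Epoch weeks start on a
  // Thursday, so a Sat+Sun pair always lands in the same bucket.
  const byWeek = new Map<number, Set<string>>();
  for (const c of commits) {
    const day = c.timestamp.getUTCDay();
    if (day !== 0 && day !== 6) continue; // weekdays don't count
    const week = Math.floor(c.timestamp.getTime() / WEEK_MS);
    if (!byWeek.has(week)) byWeek.set(week, new Set());
    byWeek.get(week)!.add(c.author);
  }
  // Look for `weekends` consecutive buckets that each clear the author bar.
  const weeks = [...byWeek.keys()].sort((a, b) => a - b);
  let run = 0;
  let prev = Number.NaN;
  for (const w of weeks) {
    const enough = byWeek.get(w)!.size >= minAuthors;
    run = enough ? (w === prev + 1 ? run + 1 : 1) : 0;
    prev = w;
    if (run >= weekends) return true;
  }
  return false;
}
```

&lt;p&gt;A solo founder's Saturday commits never clear the author bar, which is exactly the point.&lt;/p&gt;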

&lt;h2&gt;
  
  
  Pattern 4 — the documentation sprint
&lt;/h2&gt;

&lt;p&gt;Engineers don't write docs voluntarily. So when a team's commit log suddenly shifts from feature work to docs — README rewrites, API references, architecture diagrams, contributing guides — somebody asked them to. The somebody is usually one of three people: a fundraise lead doing diligence prep, a head of community preparing for an OSS launch, or a new VP Engineering doing onboarding for a hiring class.&lt;/p&gt;

&lt;p&gt;All three are interesting. The pattern that's most uniquely a fundraise tell is the &lt;em&gt;sequence&lt;/em&gt;: a sprint of feature commits, then a hard pivot to docs, then a return to features two weeks later. That mid-cycle docs sprint is the diligence prep. The team is making sure the codebase reads cleanly when an investor's analyst opens it for the first time.&lt;/p&gt;
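
&lt;p&gt;Detecting the sequence comes down to tracking the docs share of changed files week by week, then looking for a spike sandwiched between feature-heavy weeks. A sketch (the path heuristics and the 0.5/0.2 thresholds are my assumptions, not a documented rule set):&lt;/p&gt;

```typescript
// Docs-shaped paths: docs/ trees, README/CONTRIBUTING/ARCHITECTURE files,
// and markdown generally. Heuristic, not exhaustive.
const DOCS_RE = /(^|\/)(docs?\/|README|CONTRIBUTING|ARCHITECTURE|.+\.mdx?$)/i;

// Share of a week's changed files that are docs-shaped.
function docsShare(changedPaths: string[]): number {
  if (changedPaths.length === 0) return 0;
  return changedPaths.filter(p => DOCS_RE.test(p)).length / changedPaths.length;
}

// The mid-cycle tell: a docs-heavy week bracketed by feature-heavy weeks.
function docsSprint(weeklyShares: number[]): boolean {
  return weeklyShares.some(
    (s, i) =>
      i > 0 &&
      i + 1 < weeklyShares.length &&
      s > 0.5 &&
      weeklyShares[i - 1] < 0.2 &&
      weeklyShares[i + 1] < 0.2
  );
}
```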

&lt;h2&gt;
  
  
  Pattern 5 — the velocity regime change
&lt;/h2&gt;

&lt;p&gt;The strongest signal is also the simplest. Compute commit velocity across a 14-day rolling window. Compare it to the org's six-month average. If the current window is more than 2× the average and the next two windows hold above 1.8×, the team has changed regimes. They aren't shipping faster because they're motivated this week — they're shipping faster because something organizational changed underneath them.&lt;/p&gt;
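
&lt;p&gt;In code, the regime test looks roughly like this: daily commit counts in, boolean out. The 14-day window, the six-month (180-day) baseline, and the 2×/1.8× thresholds come from the description above; everything else is my sketch:&lt;/p&gt;

```typescript
// `daily` is assumed to be one org's daily commit counts, oldest first.
function regimeChange(daily: number[], window = 14): boolean {
  // Need the full 6-month baseline plus a trigger window and two confirmations.
  if (daily.length < 180 + 3 * window) return false;
  const mean = daily.slice(0, 180).reduce((a, b) => a + b, 0) / 180;
  const expected = mean * window; // baseline commits per 14-day window
  const win = (start: number) =>
    daily.slice(start, start + window).reduce((a, b) => a + b, 0);
  // Scan post-baseline windows: one above 2x, the next two holding above 1.8x.
  for (let start = 180; start + 3 * window <= daily.length; start++) {
    if (
      win(start) > 2 * expected &&
      win(start + window) > 1.8 * expected &&
      win(start + 2 * window) > 1.8 * expected
    ) return true;
  }
  return false;
}
```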

&lt;p&gt;Velocity regime changes correlate with rounds more cleanly than any other single metric I've measured. The lead time is roughly 3 weeks before the announcement. The hit rate, on its own, is around 35% — meaning two-thirds of regime changes don't lead to a round. Some are pivots. Some are post-pivot recoveries. Some are just one engineer who got really excited.&lt;/p&gt;

&lt;p&gt;But when a regime change overlaps with a contributor step function, the hit rate jumps to ~70% in my backtest across Q3-Q4 2025. That's the combination I act on.&lt;/p&gt;

&lt;h2&gt;
  
  
  The part I got wrong
&lt;/h2&gt;

&lt;p&gt;For the first three months I weighted commit &lt;em&gt;count&lt;/em&gt; more heavily than commit &lt;em&gt;velocity change&lt;/em&gt;. I assumed high-volume orgs were higher-quality leads. They weren't. They were just bigger.&lt;/p&gt;

&lt;p&gt;The correction was switching to z-scores within sector. A 200-commit week from a 5-person devtools company is a regime change. A 200-commit week from a 50-person AI infrastructure company is Tuesday. Once I started ranking by &lt;code&gt;(velocity - 6mo_mean) / 6mo_stddev&lt;/code&gt;, the noise dropped and the small-team signals rose to the top. The orgs that ended up raising weren't the loudest — they were the ones whose own quiet baseline broke first.&lt;/p&gt;
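
&lt;p&gt;The ranking change is one function. A sketch with a hypothetical &lt;code&gt;Org&lt;/code&gt; record carrying its own six-month baseline:&lt;/p&gt;

```typescript
// Rank orgs within a sector by how far current velocity sits above their
// own 6-month baseline, in standard deviations. Field names are assumptions.
interface Org {
  name: string;
  sector: string;
  velocity: number;   // current-window commit velocity
  mean6mo: number;    // 6-month mean of the same metric
  stddev6mo: number;  // 6-month standard deviation
}

function rankBySectorZ(orgs: Org[], sector: string): Org[] {
  const z = (o: Org) => (o.velocity - o.mean6mo) / o.stddev6mo;
  return orgs
    .filter(o => o.sector === sector && o.stddev6mo > 0)
    .sort((a, b) => z(b) - z(a)); // biggest break from its own baseline first
}
```

&lt;p&gt;The same 200-commit week ranks first for the 5-person team and nowhere for the 50-person one.&lt;/p&gt;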

&lt;h2&gt;
  
  
  What I actually do every Monday
&lt;/h2&gt;

&lt;p&gt;The workflow is the same every week, no matter what the data shows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Pull the latest 14 days of &lt;code&gt;PushEvent&lt;/code&gt;s for every org on the watchlist (~4,200 right now).&lt;/li&gt;
&lt;li&gt;Compute the five patterns above for each org.&lt;/li&gt;
&lt;li&gt;Filter to orgs with two or more patterns firing simultaneously.&lt;/li&gt;
&lt;li&gt;For each survivor, open the GitHub org page and skim the last week of commits. If the commits look like real product work — not just bumping versions — the org goes on the call list.&lt;/li&gt;
&lt;li&gt;The call list goes into a doc with a one-line note per company. Some weeks the list is 4 long. Some weeks it's 30.&lt;/li&gt;
&lt;/ol&gt;
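
&lt;p&gt;Step 3 is the part worth stealing. A sketch of the agreement filter, with per-org pattern results as a hypothetical array of booleans computed upstream:&lt;/p&gt;

```typescript
// Keep only orgs where at least two of the five patterns fire at once.
// The OrgSignals shape is hypothetical; one boolean per pattern above.
interface OrgSignals { org: string; patterns: boolean[]; }

function survivors(scored: OrgSignals[], minFiring = 2): string[] {
  return scored
    .filter(s => s.patterns.filter(Boolean).length >= minFiring)
    .map(s => s.org);
}
```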

&lt;p&gt;I don't act on single-pattern firings anymore. Not since the day I almost emailed an investor friend about a fintech that turned out to be running a hackathon. The signal was loud. The signal was wrong. The fix wasn't a better signal — it was forcing two of them to agree before I trusted any of them.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://signals.gitdealflow.com/blog/5-github-patterns-that-predict-fundraises?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=5-github-patterns-that-predict-fundraises" rel="noopener noreferrer"&gt;signals.gitdealflow.com&lt;/a&gt;. The weekly Signal Report is free — no paywall, no account.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>startup</category>
      <category>datascience</category>
      <category>github</category>
      <category>ai</category>
    </item>
    <item>
      <title>I stopped building dashboards. AI assistants are the new UI.</title>
      <dc:creator>The Data Nerd</dc:creator>
      <pubDate>Fri, 17 Apr 2026 21:14:47 +0000</pubDate>
      <link>https://dev.to/data_nerd/i-stopped-building-dashboards-ai-assistants-are-the-new-ui-c5h</link>
      <guid>https://dev.to/data_nerd/i-stopped-building-dashboards-ai-assistants-are-the-new-ui-c5h</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq5rpzf8v0byws3aphog.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq5rpzf8v0byws3aphog.gif" alt="MCP server demo inside Claude" width="720" height="630"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The dashboard nobody visited
&lt;/h2&gt;

&lt;p&gt;I built a startup signal dashboard. It tracked GitHub engineering acceleration across 2,000+ startup organizations and ranked them by commit velocity, contributor growth, and repo expansion. The data was solid. The UI was clean.&lt;/p&gt;

&lt;p&gt;Nobody came back to it.&lt;/p&gt;

&lt;p&gt;Investors signed up, bookmarked the URL, and forgot about it. Because that's not how knowledge workers operate in 2026. They don't open dashboards. They ask their AI assistant.&lt;/p&gt;

&lt;h2&gt;
  
  
  The shift
&lt;/h2&gt;

&lt;p&gt;I had a realization: the best distribution channel for a data product isn't a website. It's being embedded in the tool people already use 8 hours a day.&lt;/p&gt;

&lt;p&gt;For investors using Claude, that means an MCP server.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I built
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=mcp-launch" rel="noopener noreferrer"&gt;VC Deal Flow Signal&lt;/a&gt; monitors GitHub engineering activity across startup organizations and surfaces the ones showing unusual acceleration. The hypothesis: engineering acceleration (measured as the rate of change in commit velocity) is a leading indicator for fundraise announcements, usually by 6 to 12 weeks.&lt;/p&gt;

&lt;p&gt;The MCP server exposes 5 tools:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;What it does&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_trending_startups&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Top 20 startups by engineering acceleration&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;search_startups_by_sector&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Startups ranked within a specific sector (20 sectors)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_startup_signal&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Signal profile for a specific startup&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_signals_summary&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Dataset overview, formats, and links&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_methodology&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;How the signals are calculated&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;All data is fetched live from the public API at &lt;a href="https://signals.gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=mcp-launch" rel="noopener noreferrer"&gt;signals.gitdealflow.com&lt;/a&gt;. No API key required.&lt;/p&gt;

&lt;h2&gt;
  
  
  How I built the MCP server
&lt;/h2&gt;

&lt;p&gt;The server is TypeScript, uses the official &lt;code&gt;@modelcontextprotocol/sdk&lt;/code&gt;, and runs over stdio transport.&lt;/p&gt;

&lt;p&gt;The entire implementation is about 250 lines:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Server&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@modelcontextprotocol/sdk/server/index.js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;StdioServerTransport&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@modelcontextprotocol/sdk/server/stdio.js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;server&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Server&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vc-deal-flow-signal&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;1.1.1&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;capabilities&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Register 5 tools, each fetches live data from the public JSON API&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each tool fetches live data from &lt;code&gt;signals.gitdealflow.com/api/signals.json&lt;/code&gt; and formats it as structured text. The server doesn't bundle any data — it's a thin wrapper around the public API.&lt;/p&gt;
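
&lt;p&gt;The formatting half of a tool is just reshaping JSON into text Claude can read. A sketch of what &lt;code&gt;get_trending_startups&lt;/code&gt; might produce internally; the field names are my guesses at the &lt;code&gt;signals.json&lt;/code&gt; payload, not its documented schema:&lt;/p&gt;

```typescript
// Hypothetical shape of one entry in the signals payload.
interface StartupSignal { org: string; sector: string; acceleration: number; }

// Turn the top-N accelerating startups into one line of text each.
function formatTrending(startups: StartupSignal[], limit = 20): string {
  return startups
    .slice()
    .sort((a, b) => b.acceleration - a.acceleration)
    .slice(0, limit)
    .map((s, i) => `${i + 1}. ${s.org} (${s.sector}): ${s.acceleration.toFixed(2)}x`)
    .join("\n");
}
```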

&lt;h2&gt;
  
  
  Publishing to the MCP ecosystem
&lt;/h2&gt;

&lt;p&gt;Getting the server discoverable took three steps:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Publish to npm&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm publish &lt;span class="nt"&gt;--access&lt;/span&gt; public
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Package: &lt;code&gt;@gitdealflow/mcp-signal&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Publish to the official MCP Registry&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;brew &lt;span class="nb"&gt;install &lt;/span&gt;mcp-publisher
mcp-publisher login github
mcp-publisher publish
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This required a &lt;code&gt;server.json&lt;/code&gt; manifest matching the registry schema and an &lt;code&gt;mcpName&lt;/code&gt; field in &lt;code&gt;package.json&lt;/code&gt; that matches the registry namespace.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Submit to directories&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I submitted to 8 directories in total:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Official MCP Registry (published)&lt;/li&gt;
&lt;li&gt;npm (published)&lt;/li&gt;
&lt;li&gt;awesome-mcp-servers (PR open)&lt;/li&gt;
&lt;li&gt;Glama (approved, A-tier)&lt;/li&gt;
&lt;li&gt;mcp.so (submitted)&lt;/li&gt;
&lt;li&gt;MCP Market (submitted)&lt;/li&gt;
&lt;li&gt;PulseMCP (auto-ingests from registry)&lt;/li&gt;
&lt;li&gt;Cline Marketplace (submitted)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The whole process from zero to published took about 3 hours.&lt;/p&gt;

&lt;h2&gt;
  
  
  Install it
&lt;/h2&gt;

&lt;p&gt;Add to your Claude Desktop or Claude Code config:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"vc-deal-flow-signal"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"npx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"-y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"@gitdealflow/mcp-signal"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then ask Claude: &lt;em&gt;"Which startups are accelerating in fintech?"&lt;/em&gt; or &lt;em&gt;"Show me the signal profile for roboflow."&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I learned
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;MCP servers are the new API.&lt;/strong&gt; If your product serves data, an MCP server is the highest-leverage distribution channel you can build. Mine took a couple of hours and puts the product directly inside the user's daily workflow.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The ecosystem is early but growing fast.&lt;/strong&gt; The official registry exists, directories are active, and every major AI tool supports MCP. Getting in now means less competition for visibility.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The best funnel is invisible.&lt;/strong&gt; When an investor asks Claude about startup signals and gets my data, they didn't "visit my website" or "open my app." They used my product without knowing they entered a funnel. That's the future of distribution.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Embedded distribution needs a non-AI counterpart too.&lt;/strong&gt; Not every investor lives inside Claude yet. For the majority still browsing Crunchbase, AngelList, and PitchBook in the browser, I shipped a Chrome extension that injects the signal badge directly onto startup profile pages. Same Isenberg "piggyback" philosophy — show up where the user already is, not where you want them to go. Different surface, same funnel.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Try it
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;MCP server on npm:&lt;/strong&gt; &lt;a href="https://www.npmjs.com/package/@gitdealflow/mcp-signal" rel="noopener noreferrer"&gt;&lt;code&gt;@gitdealflow/mcp-signal&lt;/code&gt;&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Live signals + methodology:&lt;/strong&gt; &lt;a href="https://signals.gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=mcp-launch-cta" rel="noopener noreferrer"&gt;signals.gitdealflow.com&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Chrome extension&lt;/strong&gt; (Crunchbase / AngelList / PitchBook badge): &lt;a href="https://chromewebstore.google.com/detail/hehkgipiamajnnlpkfhpeoeaoaogmknn" rel="noopener noreferrer"&gt;Chrome Web Store&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Site:&lt;/strong&gt; &lt;a href="https://gitdealflow.com/?utm_source=devto&amp;amp;utm_medium=article&amp;amp;utm_campaign=mcp-launch-cta" rel="noopener noreferrer"&gt;gitdealflow.com&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you've shipped an MCP server and want to compare notes on the registry submission flow, drop a comment — happy to share the &lt;code&gt;server.json&lt;/code&gt; manifest pattern that worked.&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>ai</category>
      <category>typescript</category>
      <category>opensource</category>
    </item>
  </channel>
</rss>
