<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: imviky-ctrl</title>
    <description>The latest articles on DEV Community by imviky-ctrl (@imvikyctrl).</description>
    <link>https://dev.to/imvikyctrl</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3874896%2F181c1f51-0867-4880-a71e-64e2b4578942.png</url>
      <title>DEV Community: imviky-ctrl</title>
      <link>https://dev.to/imvikyctrl</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/imvikyctrl"/>
    <language>en</language>
    <item>
      <title>I built an MCP server that tells Claude when ChatGPT is down (and how much it costs)</title>
      <dc:creator>imviky-ctrl</dc:creator>
      <pubDate>Sun, 12 Apr 2026 13:18:14 +0000</pubDate>
      <link>https://dev.to/imvikyctrl/i-built-an-mcp-server-that-tells-claude-when-chatgpt-is-down-and-how-much-it-costs-4dcf</link>
      <guid>https://dev.to/imvikyctrl/i-built-an-mcp-server-that-tells-claude-when-chatgpt-is-down-and-how-much-it-costs-4dcf</guid>
      <description>&lt;p&gt;Every developer using AI tools has hit this at some point:&lt;/p&gt;

&lt;p&gt;You open Claude or Cursor, something feels slow or broken, and you spend 10 minutes wondering if it's your code, your network, or the service itself. Then you go check a status page. Then you check another one. Then you give up.&lt;/p&gt;

&lt;p&gt;I built &lt;a href="https://tickerr.ai" rel="noopener noreferrer"&gt;Tickerr&lt;/a&gt; to solve this — independent uptime monitoring for 42+ AI tools, updated every 5 minutes. Today I'm releasing the MCP server so your AI assistant can answer these questions &lt;em&gt;for you&lt;/em&gt;, mid-conversation.&lt;/p&gt;

&lt;h2&gt;What it does&lt;/h2&gt;

&lt;p&gt;The Tickerr MCP server gives Claude, Cursor, and Windsurf direct access to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Live status&lt;/strong&gt; — is any AI tool up or down right now?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Uptime history&lt;/strong&gt; — 30-day and 90-day uptime percentages&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Incidents&lt;/strong&gt; — last 90 days of outages from 26 official provider status pages&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API pricing&lt;/strong&gt; — current cost per 1M tokens across all major models&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rate limits&lt;/strong&gt; — plan-by-plan limits for any AI tool&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Free tier comparison&lt;/strong&gt; — best free plans by category&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;No API key. No signup. Just install and ask.&lt;/p&gt;

&lt;h2&gt;Install in 30 seconds&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Claude Code:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  claude mcp add tickerr &lt;span class="nt"&gt;--transport&lt;/span&gt; http &lt;span class="nt"&gt;--url&lt;/span&gt; https://tickerr.ai/mcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  Cursor / Windsurf — add to ~/.cursor/mcp.json:
  {
    "mcpServers": {
      "tickerr": { 
        "url": "https://tickerr.ai/mcp"
      }                                                                                                                                       
    }  
  }                                                                                                                                           
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; Claude Desktop — add to claude_desktop_config.json:
  {                                                                                                                                           
    "mcpServers": {
      "tickerr": {                                                                                                                            
        "command": "npx",                                                                                                                   
        "args": ["-y", "tickerr-mcp"]
      }                              
    }                                                                                                                                         
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;What you can now ask your AI assistant&lt;/h2&gt;

&lt;p&gt;Once installed, you can ask things like:&lt;/p&gt;

&lt;p&gt;▎ "Is Claude down right now?"                                                                                                               &lt;/p&gt;

&lt;p&gt;▎ "What's the cheapest model for processing 100K input tokens and 10K output tokens?"                                                       &lt;/p&gt;

&lt;p&gt;▎ "Has OpenAI had any incidents this month?"                                                                                                &lt;/p&gt;

&lt;p&gt;▎ "What are Cursor's rate limits on the Pro plan?"                                                                                          &lt;/p&gt;

&lt;p&gt;▎ "Compare Claude Haiku vs GPT-4o Mini for a high-volume use case"                                                                          &lt;/p&gt;

&lt;p&gt;▎ "Which coding AI tools have a free tier?"                                                                                                 &lt;/p&gt;

&lt;p&gt;Your assistant will call the live Tickerr API and answer with real data — not training data from months ago.&lt;/p&gt;
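&lt;p&gt;Questions like the pricing ones above reduce to per-million-token arithmetic. As a rough illustration (my own sketch with made-up prices, not Tickerr's actual &lt;code&gt;compare_pricing&lt;/code&gt; implementation), ranking models for a workload can look like this:&lt;/p&gt;

```typescript
// Hypothetical per-1M-token prices; real numbers come from the live API.
interface ModelPrice {
  name: string;
  inputPerM: number;  // USD per 1M input tokens
  outputPerM: number; // USD per 1M output tokens
}

// Cost of one workload under a given price entry.
function workloadCost(p: ModelPrice, inputTokens: number, outputTokens: number): number {
  return (inputTokens / 1e6) * p.inputPerM + (outputTokens / 1e6) * p.outputPerM;
}

// Rank models cheapest-first for the workload, as a compare_pricing-style tool might.
function rankByCost(prices: ModelPrice[], inputTokens: number, outputTokens: number): ModelPrice[] {
  return [...prices].sort(
    (a, b) =>
      workloadCost(a, inputTokens, outputTokens) - workloadCost(b, inputTokens, outputTokens)
  );
}
```

&lt;p&gt;Under this formula, 100K input and 10K output tokens on a model priced at $0.25/$1.25 per 1M tokens costs $0.0375 per request.&lt;/p&gt;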

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  The 7 tools exposed                                                                                                                         

  ┌─────────────────┬─────────────────────────────────────────────┐                                                                           
  │      Tool       │                What it does                 │                                                                         
  ├─────────────────┼─────────────────────────────────────────────┤
  │ get_tool_status │ Live status + uptime % for any AI tool      │
  ├─────────────────┼─────────────────────────────────────────────┤
  │ get_incidents   │ Historical incidents from the last 90 days  │                                                                           
  ├─────────────────┼─────────────────────────────────────────────┤                                                                           
  │ get_api_pricing │ Current pricing per model, cheapest first   │                                                                           
  ├─────────────────┼─────────────────────────────────────────────┤                                                                           
  │ get_rate_limits │ Plan-by-plan rate limits                    │                                                                         
  ├─────────────────┼─────────────────────────────────────────────┤                                                                           
  │ compare_pricing │ Rank models by cost for your token workload │
  ├─────────────────┼─────────────────────────────────────────────┤                                                                           
  │ get_free_tier   │ Best free plans by category                 │                                                                         
  ├─────────────────┼─────────────────────────────────────────────┤                                                                           
  │ list_tools      │ All 42+ monitored tools with slugs          │                                                                         
  └─────────────────┴─────────────────────────────────────────────┘                                                                           
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;How it works under the hood&lt;/h2&gt;

&lt;p&gt;The MCP server is available in two ways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;HTTP endpoint at &lt;a href="https://tickerr.ai/mcp" rel="noopener noreferrer"&gt;https://tickerr.ai/mcp&lt;/a&gt; — implements the MCP Streamable HTTP transport, so Claude Code and Cursor can connect directly
with no local process
&lt;/li&gt;
&lt;li&gt;npm package tickerr-mcp — stdio transport via npx, for clients that don't support HTTP transport yet
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The HTTP endpoint is a Next.js App Router route handler that implements the MCP JSON-RPC protocol manually — no extra dependencies, just the existing Supabase queries the site already uses. Each tool call hits the database directly and returns formatted markdown.&lt;/p&gt;
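&lt;p&gt;To make "implements the MCP JSON-RPC protocol manually" concrete, here is a minimal sketch of such a dispatcher. This is my own illustration, not the Tickerr source: the tool names are real, but the handlers are stand-ins for the actual database queries.&lt;/p&gt;

```typescript
// JSON-RPC 2.0 request/response shapes, as used by MCP.
type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };
type JsonRpcResponse = {
  jsonrpc: "2.0";
  id: number;
  result?: any;
  error?: { code: number; message: string };
};

// Stand-in tool handlers; the real server would run its Supabase queries here.
const tools: { [name: string]: (args: any) => any } = {
  get_tool_status: (args) => ({ tool: args.slug, status: "operational" }),
  list_tools: () => ({ tools: ["claude", "chatgpt", "cursor"] }),
};

// Route a single JSON-RPC request, mirroring MCP's tools/list and tools/call methods.
function handle(req: JsonRpcRequest): JsonRpcResponse {
  if (req.method === "tools/list") {
    return { jsonrpc: "2.0", id: req.id, result: { tools: Object.keys(tools) } };
  }
  if (req.method === "tools/call") {
    const tool = tools[req.params.name];
    if (!tool) {
      return { jsonrpc: "2.0", id: req.id, error: { code: -32602, message: "unknown tool" } };
    }
    return { jsonrpc: "2.0", id: req.id, result: tool(req.params.arguments) };
  }
  return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: "method not found" } };
}
```

&lt;p&gt;In a Next.js App Router route, a function like this would sit behind a POST handler that parses the request body and returns the response as JSON.&lt;/p&gt;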

&lt;p&gt;Data is sourced from:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Status: independent HTTP checks every 5 minutes&lt;/li&gt;
&lt;li&gt;Incidents: 26 official provider status pages (Statuspage, Atlassian, etc.)
&lt;/li&gt;
&lt;li&gt;Pricing: daily scrapes from official provider documentation
&lt;/li&gt;
&lt;li&gt;Rate limits: manually maintained, updated when providers announce changes
&lt;/li&gt;
&lt;/ul&gt;
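&lt;p&gt;The uptime percentages fall out of those 5-minute checks directly: uptime over a window is just successful checks divided by total checks. A tiny sketch of that computation (my own illustration, not Tickerr's code):&lt;/p&gt;

```typescript
// Each entry is one 5-minute HTTP probe result: true when the check succeeded.
function uptimePercent(checks: boolean[]): number {
  // No data yet: treat as fully up (an assumption for this sketch).
  if (checks.length === 0) return 100;
  let up = 0;
  for (const ok of checks) {
    if (ok) up += 1;
  }
  return (up / checks.length) * 100;
}

// A 30-day window at one check per 5 minutes holds 30 * 24 * 12 = 8640 probes,
// so a single missed check costs roughly 0.0116 percentage points of uptime.
```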

&lt;h2&gt;Links&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Install: &lt;code&gt;claude mcp add tickerr --transport http --url https://tickerr.ai/mcp&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;npm: &lt;code&gt;tickerr-mcp&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;GitHub: imviky-ctrl/tickerr-mcp&lt;/li&gt;
&lt;li&gt;Web dashboard: &lt;a href="https://tickerr.ai" rel="noopener noreferrer"&gt;tickerr.ai&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Smithery: tickerr-live-status&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you try it, let me know what tools or data you'd want added.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>mcp</category>
      <category>claude</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
