DEV Community

imviky-ctrl
# I built an MCP server that tells Claude when ChatGPT is down (and how much it costs)

Every developer using AI tools has hit this at some point:

You open Claude or Cursor, something feels slow or broken, and you spend 10 minutes wondering if it's your code, your network, or the
service itself. Then you go check a status page. Then you check another one. Then you give up.

I built Tickerr to solve this — independent uptime monitoring for 42+ AI tools, updated every 5 minutes. Today I'm
releasing the MCP server so your AI assistant can answer these questions for you, mid-conversation.

## What it does

The Tickerr MCP server gives Claude, Cursor, and Windsurf direct access to:

  • Live status — is any AI tool up or down right now?
  • Uptime history — 30-day and 90-day uptime percentages
  • Incidents — last 90 days of outages from 26 official provider status pages
  • API pricing — current cost per 1M tokens across all major models
  • Rate limits — plan-by-plan limits for any AI tool
  • Free tier comparison — best free plans by category

No API key. No signup. Just install and ask.

## Install in 30 seconds

Claude Code:

```bash
claude mcp add tickerr --transport http --url https://tickerr.ai/mcp
```


Cursor / Windsurf — add to ~/.cursor/mcp.json:

```json
{
  "mcpServers": {
    "tickerr": {
      "url": "https://tickerr.ai/mcp"
    }
  }
}
```
Claude Desktop — add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "tickerr": {
      "command": "npx",
      "args": ["-y", "tickerr-mcp"]
    }
  }
}
```

## What you can now ask your AI assistant

Once installed, you can ask things like:

> "Is Claude down right now?"

> "What's the cheapest model for processing 100K input tokens and 10K output tokens?"

> "Has OpenAI had any incidents this month?"

> "What are Cursor's rate limits on the Pro plan?"

> "Compare Claude Haiku vs GPT-4o Mini for a high-volume use case"

> "Which coding AI tools have a free tier?"

Your assistant will call the live Tickerr API and answer with real data — not training data from months ago.
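The "cheapest model for my workload" question boils down to simple per-token arithmetic. Here's a sketch of the kind of ranking a client (or the `compare_pricing` tool) might perform — the model names and prices below are illustrative placeholders, not live Tickerr data:

```typescript
// Rank models by total cost for a fixed token workload.
// Prices are USD per 1M tokens; the figures below are made-up examples.
interface ModelPricing {
  name: string;
  inputPerM: number;  // cost per 1M input tokens
  outputPerM: number; // cost per 1M output tokens
}

function workloadCost(m: ModelPricing, inputTokens: number, outputTokens: number): number {
  return (inputTokens / 1_000_000) * m.inputPerM
       + (outputTokens / 1_000_000) * m.outputPerM;
}

const models: ModelPricing[] = [
  { name: "model-a", inputPerM: 0.25, outputPerM: 1.25 },
  { name: "model-b", inputPerM: 0.15, outputPerM: 0.60 },
];

// 100K input + 10K output tokens, as in the example question above.
const ranked = models
  .map((m) => ({ name: m.name, cost: workloadCost(m, 100_000, 10_000) }))
  .sort((a, b) => a.cost - b.cost);

console.log(ranked); // cheapest model first
```

The real tool pulls current prices from the Tickerr database instead of hardcoded values, but the ranking logic is this straightforward.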

## The 7 tools exposed

| Tool | What it does |
|---|---|
| `get_tool_status` | Live status + uptime % for any AI tool |
| `get_incidents` | Historical incidents from the last 90 days |
| `get_api_pricing` | Current pricing per model, cheapest first |
| `get_rate_limits` | Plan-by-plan rate limits |
| `compare_pricing` | Rank models by cost for your token workload |
| `get_free_tier` | Best free plans by category |
| `list_tools` | All 42+ monitored tools with slugs |
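Each of these is invoked through a standard MCP `tools/call` request. For example, a raw JSON-RPC message for `get_tool_status` might look like this — the argument key and slug here are illustrative, so check `list_tools` output for the real slugs:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_tool_status",
    "arguments": { "tool": "chatgpt" }
  }
}
```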

## How it works under the hood

The MCP server is available two ways:

  1. HTTP endpoint at https://tickerr.ai/mcp — implements the MCP Streamable HTTP transport, so Claude Code and Cursor can connect directly with no local process
  2. npm package tickerr-mcp — stdio transport via npx, for clients that don't support HTTP transport yet

The HTTP endpoint is a Next.js App Router route handler that implements the MCP JSON-RPC protocol manually — no extra dependencies, just the
existing Supabase queries the site already uses. Each tool call hits the database directly and returns formatted markdown.
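A hand-rolled JSON-RPC dispatch like that can be surprisingly small. This is a simplified sketch of the pattern, not Tickerr's actual code — the handler names and the data layer are stand-ins:

```typescript
// Minimal JSON-RPC 2.0 dispatch for MCP-style "tools/call" requests.
// A real MCP server would also implement "initialize" and "tools/list".
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

// Stand-in data layer; the real server runs Supabase queries here.
const handlers: Record<string, ToolHandler> = {
  get_tool_status: async (args) => `status for ${args.tool}: operational`,
};

async function handleRpc(body: {
  jsonrpc: string;
  id: number | string;
  method: string;
  params?: { name: string; arguments?: Record<string, unknown> };
}) {
  if (body.method === "tools/call" && body.params) {
    const tool = handlers[body.params.name];
    if (!tool) {
      return { jsonrpc: "2.0", id: body.id, error: { code: -32602, message: "Unknown tool" } };
    }
    const text = await tool(body.params.arguments ?? {});
    // MCP tool results are returned as content blocks.
    return { jsonrpc: "2.0", id: body.id, result: { content: [{ type: "text", text }] } };
  }
  return { jsonrpc: "2.0", id: body.id, error: { code: -32601, message: "Method not found" } };
}
```

In a Next.js App Router route, a function like this would sit inside an exported `POST` handler that parses the request body and returns the result as JSON.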

Data is sourced from:

  • Status: independent HTTP checks every 5 minutes
  • Incidents: 26 official provider status pages (Statuspage, Atlassian, etc.)
  • Pricing: daily scrapes from official provider documentation
  • Rate limits: manually maintained, updated when providers announce changes
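An independent status check is conceptually simple: probe the endpoint, then classify the result. Here's a sketch of one plausible classification rule — the thresholds are assumptions for illustration, not Tickerr's actual rules:

```typescript
// Classify a single HTTP probe into an uptime status.
// Thresholds below are illustrative assumptions, not Tickerr's real rules.
type Status = "up" | "degraded" | "down";

function classifyProbe(httpStatus: number | null, latencyMs: number): Status {
  if (httpStatus === null || httpStatus >= 500) return "down"; // no response or server error
  if (httpStatus >= 400) return "degraded";                    // client errors suggest partial trouble
  if (latencyMs > 5_000) return "degraded";                    // very slow responses count against uptime
  return "up";
}
```

A scheduler running every 5 minutes would fetch each tool's health endpoint with a timeout, feed the outcome through something like this, and store the result for the uptime history.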

## Links

  • Install: claude mcp add tickerr --transport http --url https://tickerr.ai/mcp
  • npm: tickerr-mcp
  • GitHub: imviky-ctrl/tickerr-mcp
  • Web dashboard: tickerr.ai
  • Smithery: tickerr-live-status

If you try it, let me know what tools or data you'd want added.
