DEV Community

Maciej Ostaszewski
hdcd-telegram: A 3.5 MB Rust Drop-in for Claude Code's 100 MB Telegram Plugin

The Problem

If you use Claude Code with the official Telegram channel plugin, you've probably hit one of these:

  • "Channels are not currently available" (#36503, 24+ comments)
  • Bot shows "typing" but never replies (#37933)
  • Plugin stops after the first turn (#36477)
  • Zombie processes keep polling after session ends, causing 409 Conflict on the next launch

The official plugin runs on Bun -- a JavaScript runtime. It works, but it's heavy: ~100 MB RAM per instance, 2-3 second startup, and it doesn't clean up after itself.

If you run multiple Claude Code agents in parallel (CI, orchestrator setups, dev fleet), each one spawns its own Bun process. 10 agents = 10 runtimes = ~1 GB just for Telegram bridges.

The Solution: hdcd-telegram

We built hdcd-telegram -- a Rust drop-in replacement with full feature parity.

                     Bun (official)            Rust (hdcd-telegram)
Binary size          ~100 MB (runtime + deps)  3.5 MB
RAM per instance     ~100 MB                   ~5 MB
10 parallel agents   ~1 GB                     ~50 MB
Startup              2-3 seconds               <50 ms
Shutdown             Keeps polling (zombie)    Immediate on stdin EOF

How it connects

Telegram  --->  hdcd-telegram  --->  Claude Code
  (user)        (MCP server)         (session)
          <---  (stdio/JSON-RPC) <---

hdcd-telegram runs as an MCP server subprocess. Claude Code starts it, communicates over stdio, and the binary handles all Telegram API interaction. No ports opened, no webhooks -- just outbound HTTPS to api.telegram.org.
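As a rough sketch of that shape (illustrative, not hdcd-telegram's actual source), the heart of such a server is a loop that reads newline-delimited JSON-RPC from stdin and exits the moment stdin closes:

```rust
use std::io::{self, BufRead, Write};

// Minimal sketch of an MCP-style stdio loop: read newline-delimited
// JSON-RPC messages from the parent process and reply on stdout. When
// stdin hits EOF the loop ends and the process exits -- no lingering
// poller left behind.
fn serve(input: impl BufRead, mut output: impl Write) -> io::Result<()> {
    for line in input.lines() {
        let request = line?; // one JSON-RPC message per line
        if request.trim().is_empty() {
            continue;
        }
        // A real server would parse the JSON and dispatch on "method";
        // this placeholder just acknowledges each message.
        writeln!(output, r#"{{"jsonrpc":"2.0","result":null,"id":null}}"#)?;
        output.flush()?;
    }
    Ok(()) // stdin closed: the parent is gone, so shut down immediately
}
```

Exiting on EOF rather than on a signal is what kills the zombie-polling problem: the moment Claude Code goes away, the pipe closes and the loop ends.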

It Also Works Around the Channels Bug

The --channels plugin:telegram@claude-plugins-official activation path is broken for many users due to a server-side feature flag. The workaround:

  1. Register the server in .mcp.json
  2. Launch with --dangerously-load-development-channels server:telegram

This uses a different code path that bypasses the buggy plugin resolution. hdcd-telegram is designed for this workflow.

Quick Start

1. Download the binary from GitHub Releases (Linux, macOS, Windows).

2. Save your bot token:

mkdir -p ~/.claude/channels/telegram
echo "TELEGRAM_BOT_TOKEN=your-token-here" > ~/.claude/channels/telegram/.env
chmod 600 ~/.claude/channels/telegram/.env

3. Add to .mcp.json:

{
  "mcpServers": {
    "telegram": {
      "command": "/path/to/hdcd-telegram",
      "args": []
    }
  }
}

4. Launch:

claude --dangerously-load-development-channels server:telegram

5. Pair -- DM your bot, get a code, run /telegram:access pair <code>.
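The pairing step follows a familiar shape: the bot issues a short-lived code, and the pair command is accepted only while that code is valid. An illustrative sketch (hypothetical names, not hdcd-telegram's internals):

```rust
use std::time::{Duration, Instant};

// Hypothetical sketch of a pairing check: a code is issued with a
// time-to-live, and verification requires both an exact match and
// that the TTL has not elapsed.
struct PendingPairing {
    code: String,
    issued: Instant,
    ttl: Duration,
}

impl PendingPairing {
    fn new(code: &str, ttl: Duration) -> Self {
        Self { code: code.to_string(), issued: Instant::now(), ttl }
    }

    fn verify(&self, submitted: &str) -> bool {
        self.issued.elapsed() <= self.ttl && submitted == self.code
    }
}
```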

Full setup guide (including group chat configuration): README

Features

  • All 8 message types: text, photo, document, voice, audio, video, video note, sticker
  • 4 MCP tools: reply (with chunking, threading, attachments, MarkdownV2), react, edit_message, download_attachment
  • Access control: pairing flow, per-user allowlists, group policies with @mention gating
  • Permission relay: inline keyboard for remote tool-use approval/denial
  • Voice transcription: automatic speech-to-text via OpenAI Whisper running locally -- no data leaves your machine
  • Clean shutdown: immediate exit on stdin EOF, no zombie polling
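Of those, the reply tool's chunking is worth a closer look: Telegram caps a single message at 4096 characters, so long replies must be split. A minimal sketch of that kind of splitter (illustrative, not the actual implementation), preferring to break at newlines and falling back to hard splits on char boundaries:

```rust
// Telegram's per-message cap is 4096 characters.
const TELEGRAM_MAX_LEN: usize = 4096;

// Split `text` into chunks of at most `max_len` bytes, breaking at
// newlines where possible and at char boundaries otherwise.
fn chunk_message(text: &str, max_len: usize) -> Vec<String> {
    let mut chunks = Vec::new();
    let mut current = String::new();
    for line in text.split_inclusive('\n') {
        // Flush the current chunk if this line would overflow it.
        if current.len() + line.len() > max_len && !current.is_empty() {
            chunks.push(std::mem::take(&mut current));
        }
        // A single line longer than max_len gets hard-split, taking
        // care not to cut a multi-byte character in half.
        let mut rest = line;
        while rest.len() > max_len {
            let mut cut = max_len;
            while !rest.is_char_boundary(cut) {
                cut -= 1;
            }
            chunks.push(rest[..cut].to_string());
            rest = &rest[cut..];
        }
        current.push_str(rest);
    }
    if !current.is_empty() {
        chunks.push(current);
    }
    chunks
}
```

Each chunk is then sent as its own message, so nothing is silently truncated.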

Zero Migration

If you already use the official plugin, hdcd-telegram reads the same files:

  • ~/.claude/channels/telegram/.env (bot token)
  • ~/.claude/channels/telegram/access.json (allowlist, groups)

Just swap the binary path in .mcp.json -- all pairings and settings are preserved.
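As an illustration of why the swap is zero-cost, parsing that .env format takes only a few lines. A hypothetical helper (not the actual code) for the TELEGRAM_BOT_TOKEN=... format shown in the Quick Start:

```rust
// Sketch: extract the bot token from the same .env file the official
// plugin writes. Returns the first TELEGRAM_BOT_TOKEN=... value found.
fn read_bot_token(env_text: &str) -> Option<String> {
    env_text
        .lines()
        .filter_map(|line| line.trim().strip_prefix("TELEGRAM_BOT_TOKEN="))
        .map(|value| value.trim_matches('"').to_string())
        .next()
}
```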

Why We Built This

We're HyperDev -- we build tooling for multi-agent Claude Code deployments. Our SDLC Orchestrator runs 4+ Claude Code agents in parallel, each needing its own Telegram bridge. At that scale, the Bun runtime overhead adds up fast.

hdcd-telegram started as an internal tool and we open-sourced it because everyone hitting the channels bugs deserves a working alternative.



Questions? Open an issue or find us in the comments.
