Dima Zaichenko
Telegram AI Companion: A Fun Rust + Local AI + Telegram Project

Hey dev.to! 👋

Recently I built a small but fun pet project: Telegram AI Companion.

It's a Telegram bot that chats with you using a local LLM via LocalAI.

No OpenAI, no clouds: everything runs on your own machine! 🧠💻

The goal? Not to reinvent AI, but to explore Rust, async programming, the Telegram API, and local LLMs. Think of it as a "developer's companion bot". 😄


🧩 What It Can Do

✅ Replies to any message in Telegram

✅ Works with LocalAI (or OpenAI if you want)

✅ Runs via Docker + Docker Compose

✅ Written in Rust with Actix Web

✅ Has a REST API (/chat) so you can hook up any UI

✅ Includes tests and has a clean project structure


βš™οΈ How It Works

Overview

  1. User sends a message to the Telegram bot
  2. Telegram calls our webhook (/telegram/webhook)
  3. Rust app sends the prompt to LocalAI
  4. Gets a reply and sends it back to the user
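Under the hood, steps 2–4 are mostly reshaping JSON between two APIs. Here's a minimal std-only Rust sketch of that data flow; the struct, field, and function names are illustrative, not the project's actual code:

```rust
/// Illustrative sketch of the webhook pipeline (std-only).
/// Names here are assumptions, not the project's real handlers.

/// Step 2: Telegram POSTs an update to /telegram/webhook; after JSON
/// parsing (serde in the real app) we have the chat id and the text.
struct Update {
    chat_id: i64,
    text: String,
}

/// Step 3: wrap the user's text in an OpenAI-compatible
/// chat-completion body, which is what LocalAI expects.
fn build_llm_request(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        model, prompt
    )
}

/// Step 4: the reply goes back through the Bot API's sendMessage call.
fn send_message_url(token: &str) -> String {
    format!("https://api.telegram.org/bot{}/sendMessage", token)
}

fn main() {
    let update = Update { chat_id: 42, text: "Hello!".into() };
    let body = build_llm_request("mistral", &update.text);
    println!("LLM request: {}", body);
    println!(
        "Reply goes to chat {} via {}",
        update.chat_id,
        send_message_url("<TOKEN>")
    );
}
```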

Tech Stack

🦀 Rust: strict but powerful

🌐 Actix Web: high-performance async framework

📦 Docker & Compose: clean and reproducible

🧠 LocalAI: a local alternative to OpenAI that supports GGUF/LLaMA models

☁️ Optional: OpenAI support via .env


🚀 Quickstart

Clone the repo:

git clone https://github.com/di-zed/tg-ai-companion
cd tg-ai-companion

Download a model (e.g., Mistral 7B) and configure:

cd models/
wget https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q4_K_M.gguf

Create mistral.yaml in the same models/ directory:

name: mistral
backend: llama
parameters:
  model: mistral-7b-instruct-v0.2.Q4_K_M.gguf
  temperature: 0.7
  top_p: 0.9
  top_k: 40
  n_ctx: 4096

Then point the app at the model in .env (to use OpenAI instead, set its API URL and your real key):

OPEN_AI_URL=http://localai:8080
OPEN_AI_MODEL=mistral
OPEN_AI_API_KEY=your_openai_key

Start the app (don't forget to edit .env):

cp .env.sample .env
cp volumes/root/.bash_history.sample volumes/root/.bash_history

docker-compose up --build
docker-compose exec rust bash
cargo run

Now your bot runs locally, and LocalAI listens on localhost:8080.
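Before wiring Telegram in, it's worth a quick sanity check that something is actually listening on that port. A tiny hand-rolled TCP probe (not part of the project) is enough:

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

/// Returns true if something accepts TCP connections at `addr`, e.g.
/// LocalAI on 127.0.0.1:8080. A quick smoke test, not a health check.
fn is_listening(addr: &str) -> bool {
    addr.parse::<SocketAddr>()
        .ok()
        .map(|a| TcpStream::connect_timeout(&a, Duration::from_secs(2)).is_ok())
        .unwrap_or(false)
}

fn main() {
    if is_listening("127.0.0.1:8080") {
        println!("LocalAI port is open");
    } else {
        println!("Nothing on 127.0.0.1:8080; is docker-compose up?");
    }
}
```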


🤖 Create Your Telegram Bot

  1. Open Telegram and talk to @BotFather
  2. Run /newbot, set a name and a unique username (something_bot)
  3. You'll get a token like:
123456789:AAH6kDkKvkkkT-PWTwMg6cYtHEb3vY_tS1k

Paste it into .env:

TELEGRAM_BOT_TOKEN=your_token_here

🌍 Expose the Webhook via ngrok

Make your local server reachable:

ngrok http 80

Then set the webhook:

curl -X POST "https://api.telegram.org/bot<YOUR_TOKEN>/setWebhook" \
     -H "Content-Type: application/json" \
     -d '{"url": "https://your-subdomain.ngrok-free.app/telegram/webhook"}'

πŸ” API Mode (No Telegram)

You can also call it like a standard LLM API:

POST /chat

Header: Authorization: Bearer YOUR_TOKEN

{
  "prompt": "Hi, who are you?"
}

LocalAI (or OpenAI) responds.
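The whole request is just bearer-token auth plus a one-field JSON body. Here's an illustrative sketch that assembles the raw HTTP request so you can see exactly what goes over the wire (in practice you'd use curl or any HTTP client):

```rust
/// Builds the raw HTTP/1.1 request for the /chat endpoint described
/// above. Purely illustrative: the token is whatever you configured.
fn chat_request(host: &str, token: &str, prompt: &str) -> String {
    let body = format!(r#"{{"prompt":"{}"}}"#, prompt);
    format!(
        "POST /chat HTTP/1.1\r\nHost: {}\r\nAuthorization: Bearer {}\r\n\
         Content-Type: application/json\r\nContent-Length: {}\r\n\r\n{}",
        host,
        token,
        body.len(),
        body
    )
}

fn main() {
    println!("{}", chat_request("localhost", "YOUR_TOKEN", "Hi, who are you?"));
}
```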


🧠 Why I Built This

Main goals:

  • Learn Rust hands-on
  • Explore local LLMs without API keys
  • Build something fun and useful
  • Play with Telegram bots 😎

This can be a base for future AI bots with memory, content generation, assistants, and more.


📅 What's Next?

  • Memory + conversation context
  • Web interface
  • Multi-model support
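For the memory item, the simplest shape is a per-chat buffer of recent messages that gets replayed in front of each new prompt. A hypothetical std-only sketch (none of this is in the project yet):

```rust
use std::collections::HashMap;

/// Hypothetical per-chat memory: keep the last `cap` lines of dialogue
/// per chat and prepend them to the next prompt as context.
struct Memory {
    cap: usize,
    history: HashMap<i64, Vec<String>>,
}

impl Memory {
    fn new(cap: usize) -> Self {
        Memory { cap, history: HashMap::new() }
    }

    /// Record a line of dialogue for a chat, dropping the oldest
    /// entry once the cap is exceeded.
    fn push(&mut self, chat_id: i64, line: String) {
        let h = self.history.entry(chat_id).or_default();
        h.push(line);
        if h.len() > self.cap {
            h.remove(0);
        }
    }

    /// Build the context block to prepend to the next prompt.
    fn context(&self, chat_id: i64) -> String {
        self.history
            .get(&chat_id)
            .map(|h| h.join("\n"))
            .unwrap_or_default()
    }
}

fn main() {
    let mut mem = Memory::new(2);
    mem.push(42, "user: hi".into());
    mem.push(42, "bot: hello!".into());
    mem.push(42, "user: who are you?".into());
    // Only the last two lines survive the cap of 2.
    println!("{}", mem.context(42));
}
```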

💬 Final Thoughts

If you're just starting with Rust or want to try local LLMs, this might be the perfect playground.

The code is clean, the stack is modern, and setup is smooth.

I kept this post light; for the deep dives, check the full README:

🔗 GitHub: tg-ai-companion


🔗 Useful Links

🧠 LocalAI: LLM backend

🦀 Rust Book: start here

☁️ ngrok: webhook tunneling

Thanks for reading! 🙌

If the bot responds cheerfully, that's on me.

If it's silent, blame Telegram or ngrok 😄

Top comments (1)

Alex

Cool project! The local LLM approach is interesting - no API costs is a real advantage for side projects. I went a different direction with my TG bot (shared checklists, @dunitbot) - Kotlin + inline mode so it works in any chat without adding the bot to the group. The Telegram Bot API is surprisingly capable once you dig into inline queries and callback handlers. Did you consider any non-AI use cases while exploring the API?