I'm Kirill Strelnikov, a freelance developer in Barcelona. I build Telegram bots, AI chatbots, and SaaS platforms. This is the story of a Telegram AI aggregator bot I built that hit 500 paying users in 3 months — and was profitable from month 1.
The Idea
The client wanted a Telegram bot that aggregates multiple AI models (GPT-4, Claude, Gemini) into a single chat interface. Users pick a model, send a message, get a response — all without leaving Telegram. The monetization model: credit-based billing.
Simple concept. The execution is where it gets interesting.
Architecture
```text
Telegram user
  → aiogram bot handler
  → Django backend (REST API)
  → Credit system checks balance
  → Route to selected AI model (OpenAI / Anthropic / Google)
  → Stream response back to Telegram
  → Deduct credits
```
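Stitched together, the request path is short. Here's a dependency-free sketch of the pipeline; the in-memory `BALANCES` dict and the `call_model` stub are stand-ins for the Django credit model and the provider adapters described below, not the production code:

```python
import asyncio
from decimal import Decimal

# Hypothetical stand-ins: BALANCES plays the role of the UserCredits table,
# call_model plays the role of a provider adapter.
BALANCES = {42: Decimal("1.00")}   # user_id -> credit balance
RATE_PER_1K = Decimal("0.003")     # gpt-4 rate from the billing table below

async def call_model(model: str, text: str) -> tuple[str, int]:
    """Stub for the provider call: returns (reply, tokens_used)."""
    await asyncio.sleep(0)  # pretend network I/O
    return f"[{model}] echo: {text}", 120

async def handle_message(user_id: int, model: str, text: str) -> str:
    # 1. Check balance before spending money on the provider call.
    if BALANCES.get(user_id, Decimal("0")) <= 0:
        return "Out of credits — use /buy to top up."
    # 2. Route to the selected model.
    reply, tokens = await call_model(model, text)
    # 3. Deduct credits only after a successful response.
    BALANCES[user_id] -= RATE_PER_1K * Decimal(tokens) / 1000
    return reply

print(asyncio.run(handle_message(42, "gpt-4", "hello")))
```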
Key components:
1. Credit-based billing system
```python
from decimal import Decimal

from django.contrib.auth.models import User
from django.db import models, transaction


class UserCredits(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    balance = models.DecimalField(max_digits=10, decimal_places=2, default=0)

    def deduct(self, model: str, tokens: int) -> bool:
        cost = self.calculate_cost(model, tokens)
        with transaction.atomic():
            # Lock the row so two concurrent requests can't double-spend.
            credits = UserCredits.objects.select_for_update().get(pk=self.pk)
            if credits.balance < cost:
                return False
            credits.balance -= cost
            credits.save(update_fields=["balance"])
            CreditTransaction.objects.create(
                user=self.user, amount=-cost,
                model=model, tokens=tokens,
            )
        return True

    def calculate_cost(self, model: str, tokens: int) -> Decimal:
        rates = {
            "gpt-4": Decimal("0.003"),  # per 1K tokens
            "gpt-3.5": Decimal("0.0005"),
            "claude-3": Decimal("0.002"),
            "gemini": Decimal("0.001"),
        }
        # Decimal(tokens) keeps the arithmetic in Decimal; a raw
        # `tokens / 1000` float would raise TypeError against Decimal.
        return rates.get(model, Decimal("0.002")) * Decimal(tokens) / 1000
```
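As a sanity check on the rate table, the cost math can be exercised standalone; this function mirrors `calculate_cost` above:

```python
from decimal import Decimal

# Same rate table as above (abbreviated).
rates = {"gpt-4": Decimal("0.003")}  # per 1K tokens

def calculate_cost(model: str, tokens: int) -> Decimal:
    return rates.get(model, Decimal("0.002")) * Decimal(tokens) / 1000

print(calculate_cost("gpt-4", 2000))    # 2K GPT-4 tokens → 0.006
print(calculate_cost("unknown", 1000))  # unknown model falls back → 0.002
```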
2. Multi-model router
Each AI provider has its own adapter class. The router picks the right one based on user selection:
```python
class ModelRouter:
    adapters = {
        "gpt-4": OpenAIAdapter,
        "claude-3": AnthropicAdapter,
        "gemini": GoogleAdapter,
    }

    async def generate(self, model: str, messages: list) -> str:
        adapter = self.adapters[model]()
        return await adapter.chat(messages)
```
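Each adapter hides one provider SDK behind the same `chat(messages)` interface, which is what makes the router a dictionary lookup. A minimal sketch of the pattern, where `BaseAdapter` and `FakeAdapter` are illustrative names standing in for the real SDK-backed adapters:

```python
import asyncio
from abc import ABC, abstractmethod

class BaseAdapter(ABC):
    """Common interface every provider adapter implements."""
    @abstractmethod
    async def chat(self, messages: list[dict]) -> str: ...

class FakeAdapter(BaseAdapter):
    """Stand-in for OpenAIAdapter / AnthropicAdapter / GoogleAdapter."""
    async def chat(self, messages: list[dict]) -> str:
        await asyncio.sleep(0)  # pretend provider round-trip
        return f"reply to: {messages[-1]['content']}"

class ModelRouter:
    adapters = {"fake": FakeAdapter}

    async def generate(self, model: str, messages: list[dict]) -> str:
        adapter_cls = self.adapters.get(model)
        if adapter_cls is None:
            raise ValueError(f"unsupported model: {model}")
        return await adapter_cls().chat(messages)

router = ModelRouter()
print(asyncio.run(router.generate("fake", [{"role": "user", "content": "hi"}])))
```

Guarding the lookup with `.get()` turns an unknown model name into a clear error instead of a bare `KeyError` bubbling out of a bot handler.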
3. Telegram payment integration
Users buy credits directly in Telegram using Stripe:
```python
from aiogram import Router
from aiogram.filters import Command
from aiogram.types import LabeledPrice, Message

router = Router()

@router.message(Command("buy"))
async def buy_credits(message: Message):
    # Amounts are in the currency's smallest unit (cents for USD).
    prices = [
        LabeledPrice(label="100 credits", amount=499),    # $4.99
        LabeledPrice(label="500 credits", amount=1999),   # $19.99
        LabeledPrice(label="2000 credits", amount=5999),  # $59.99
    ]
    # Send Telegram invoice
    await message.answer_invoice(
        title="AI Credits",
        description="Credits for AI model usage",
        payload="credits_purchase",
        provider_token=STRIPE_TOKEN,
        currency="USD",
        prices=[prices[0]],  # Default package
    )
```
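After the user pays, Telegram sends the bot a `pre_checkout_query` to confirm and then a message carrying `successful_payment`; the handler's real job is mapping the paid amount back to a credit top-up. That mapping is plain logic, sketched here with a hypothetical `PACKAGES` table matching the tiers above:

```python
from decimal import Decimal

# Invoice amounts arrive in cents; map each tier to the credits it buys.
# PACKAGES is an assumed name, mirroring the /buy price list above.
PACKAGES = {499: 100, 1999: 500, 5999: 2000}

def credits_for_payment(total_amount_cents: int) -> int:
    """Resolve successful_payment.total_amount to a credit top-up."""
    try:
        return PACKAGES[total_amount_cents]
    except KeyError:
        # Never credit an amount we don't recognize.
        raise ValueError(f"unknown package: {total_amount_cents} cents")

balance = Decimal("0")
balance += credits_for_payment(1999)  # user bought the $19.99 tier
print(balance)  # → 500
```

Deriving credits from the server-reported amount, rather than trusting anything client-side, keeps billing honest even if the invoice payload is tampered with.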
Growth Metrics
| Month | Users | Paying users | Revenue (MRR) |
|---|---|---|---|
| 1 | 800 | 85 | $1,200 |
| 2 | 2,100 | 280 | $3,800 |
| 3 | 3,500 | 500 | $6,200 |
Conversion rate: ~14% free-to-paid (industry average for bots is 2-5%).
Why so high? Free tier gives 20 credits — enough to try all models but not enough for daily use. The "aha moment" happens when users compare GPT-4 vs Claude on the same prompt. Once they see the value of model switching, they pay.
What Made It Work
1. Telegram's 900M+ user base
No app to install. No signup form. Users just open the bot and start chatting. The friction-to-value ratio is unbeatable.
2. Credit-based pricing (not subscription)
Users pay for what they use. Heavy users spend $20-60/month. Light users spend $5. Nobody feels locked into a subscription they don't use. This increased retention significantly vs subscription-only models.
3. Message open rates: 80%+
Telegram notifications actually get read. When we sent "New model added: Claude 3 Opus" — 82% open rate. Compare that to email marketing (20-25%).
4. Viral inline mode
Users can use the bot inline in any chat: type `@botname what is Django?` in a group chat and get an AI answer. This drove organic growth: other group members see the bot in action and try it themselves.
Tech Stack
- Bot framework: Python 3.12 + aiogram 3.x (async)
- Backend: Django 5.0 + Django REST Framework
- Database: PostgreSQL (user accounts, credits, transactions)
- Cache: Redis (rate limiting, session data)
- Task queue: Celery (webhook processing, analytics)
- AI providers: OpenAI, Anthropic, Google AI APIs
- Payments: Stripe via Telegram Payments API
- Deployment: Docker + Nginx on a VPS
Lessons Learned
Monetize from day 1. Don't wait for "enough users." The free tier should be a taste, not a full meal.
Credits > subscriptions for bots. Bot usage is spiky. People use it heavily for a week, then pause. Credits accommodate this pattern naturally.
Telegram bots are underrated as a business channel. 900M+ MAU, 80%+ open rates, built-in payments, inline sharing. For many use cases, a Telegram bot is better than a mobile app.
Multi-model is the moat. Any single-model wrapper gets commoditized instantly. The value is in comparison and switching between models.
Cost to Build Something Similar
Based on my experience building this and similar bots:
| Complexity | Timeline | Cost |
|---|---|---|
| Simple bot (1 model, basic billing) | 1-2 weeks | EUR 500-1,000 |
| Multi-model with credits | 3-4 weeks | EUR 1,500-3,000 |
| Full platform with admin, analytics | 5-8 weeks | EUR 3,000-6,000 |
I'm Kirill Strelnikov, freelance Python developer in Barcelona, Spain. I build Telegram bots, AI chatbots, and SaaS platforms for businesses across Europe. 15+ projects delivered.
- Website: kirweb.site
- Telegram: @KirBcn
- Cost guide: How much does a Telegram bot cost?