Drive chatbots across QQ, WeChat, Telegram, Discord, and more using visual workflows - no coding required.
LangBot is an open-source instant messaging bot platform that connects AI workflow engines like Langflow, n8n, Dify, FastGPT, and Coze to platforms including WeChat, QQ, Feishu, DingTalk, Telegram, Discord, Slack, and LINE. This tutorial demonstrates how to use Langflow's visual workflows as LangBot's conversation engine.
Why This Approach Works
- True Multi-Platform: One workflow powering 8+ messaging platforms simultaneously
- Visual Orchestration: Drag-and-drop conversation design with conditional branches, multi-turn dialogs, and external API calls
- Flexible AI Models: Support for OpenAI, Claude, Gemini, DeepSeek, and local models
- Fully Open Source: Both LangBot and Langflow are open-source projects for free deployment and customization
Prerequisites
- Python 3.10+
- Docker (recommended for quick deployment)
- OpenAI API Key or API keys for other LLM services
Step 1: Deploy LangBot
Launch with uvx in one command:
uvx langbot
First run auto-initializes and opens your browser to http://127.0.0.1:5300.
After registration, log in to access the dashboard:
Step 2: Deploy Langflow
Deploy quickly with Docker:
docker run -d --name langflow -p 7860:7860 langflowai/langflow:latest
Visit http://localhost:7860 to access Langflow:
Step 3: Create a Langflow Workflow
In Langflow, select the "Basic Prompting" template to get started quickly:
This template includes four basic components:
- Chat Input: Receives user messages
- Prompt: Sets system instructions
- Language Model: Calls LLM to generate responses
- Chat Output: Returns results
Configure Language Model
Click the Language Model component and configure:
- Model Provider: Select OpenAI (or other compatible providers like SiliconFlow, New API)
- Model Name: Select gpt-4o-mini or deepseek-chat
- OpenAI API Key: Enter your API Key
Tip: You can use OpenAI-compatible API services like SiliconFlow or New API by simply modifying the Base URL.
Save the workflow after configuration.
Step 4: Get Langflow API Information
Generate API Key
In Langflow's upper-right menu, open Settings → API Keys:
Click Create New Key:
Generate and save the API Key:
Format: sk-xxxxxxxxxxxxxxxxxxxxxxxx
Get Flow ID
Extract from the flow editor's URL:
http://localhost:7860/flow/{flow-id}
Record this flow-id.
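With the API Key and Flow ID in hand, you can call the flow over HTTP yourself, which is exactly what LangBot will do on your behalf. Below is a minimal Python sketch of such a request. The `/api/v1/run/{flow-id}` path, the `x-api-key` header, and the `input_value`/`input_type`/`output_type` fields follow the request shape commonly used by Langflow's run API, but verify them against your Langflow version; the key, flow ID, and base URL are placeholders you must replace.

```python
import json
import urllib.request

# Placeholder values - replace with your own deployment details.
BASE_URL = "http://localhost:7860"
API_KEY = "sk-xxxxxxxxxxxxxxxxxxxxxxxx"  # from Settings → API Keys
FLOW_ID = "your-flow-id"                 # from the flow editor URL

def build_run_request(message: str) -> urllib.request.Request:
    """Build a POST request against Langflow's run endpoint for one chat message."""
    payload = {"input_value": message, "input_type": "chat", "output_type": "chat"}
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/run/{FLOW_ID}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "x-api-key": API_KEY},
        method="POST",
    )

# Send it (requires a running Langflow instance):
# with urllib.request.urlopen(build_run_request("Hello")) as resp:
#     print(json.load(resp))
```

If this request succeeds from the command line, you know the API Key and Flow ID are valid before you configure them in LangBot.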
Step 5: Configure Langflow in LangBot
Return to LangBot dashboard and go to Pipelines page.
Click ChatPipeline to edit it, then open the AI tab:
Under Runner, select Langflow API:
Fill in the Langflow configuration:
Configuration items:
- Base URL: http://localhost:7860 (local) or http://langflow:7860 (Docker network)
- API Key: The API Key generated in Langflow
- Flow ID: The Flow ID recorded earlier
Docker Tip: If both services run in containers, ensure they're on the same network and use the container name for Base URL.
Click Save to save the configuration.
Step 6: Test the Conversation
Click Debug Chat on the Pipelines page to open the debug chat interface:
Enter a test message like "Hello" and view the AI response:
How It Works
- User sends a message on a messaging platform
- LangBot receives and passes it to the Pipeline
- Pipeline calls Langflow API
- Langflow executes the workflow: receives input → adds prompt → calls LLM → returns result
- LangBot sends the response back to the user
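The middle of that loop - pulling the reply text out of what Langflow returns - is worth seeing concretely. The helper below walks a run response down to the message text; the nested path reflects the response shape current Langflow versions typically return, so treat it as an assumption and inspect your own responses before relying on it. The sample dictionary is illustrative, trimmed to just the fields being read.

```python
# Illustrative run response, trimmed to the fields extract_reply reads.
sample = {
    "session_id": "abc",
    "outputs": [
        {
            "inputs": {"input_value": "Hello"},
            "outputs": [
                {"results": {"message": {"text": "Hi! How can I help?"}}}
            ],
        }
    ],
}

def extract_reply(run_response: dict) -> str:
    """Walk the nested run-response structure down to the reply text."""
    first_flow_output = run_response["outputs"][0]       # one entry per flow run
    first_component_output = first_flow_output["outputs"][0]  # e.g. Chat Output
    return first_component_output["results"]["message"]["text"]

print(extract_reply(sample))  # → Hi! How can I help?
```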
Common Issues
Cannot connect to Langflow?
Check Base URL. For Docker deployment, ensure containers are on the same network:
docker network create langbot_network
docker network connect langbot_network langflow
docker network connect langbot_network langbot
Use container name: http://langflow:7860
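A quick way to separate networking problems from configuration problems is to probe Langflow from wherever LangBot runs. The sketch below assumes the /health route that recent Langflow releases expose; if your version lacks it, probe the base URL instead.

```python
import urllib.request
import urllib.error

def check_langflow(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if Langflow's /health endpoint answers with HTTP 200."""
    url = f"{base_url.rstrip('/')}/health"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# From inside the LangBot container, use the container-name URL:
# check_langflow("http://langflow:7860")
```

If this returns False for `http://langflow:7860` but True for `http://localhost:7860` on the host, the containers are not on the same Docker network.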
API call fails?
- Confirm API Key and Flow ID are correct
- Verify the Language Model in Langflow has a valid LLM API Key configured
Advanced Use Cases
Langflow's power lies in visually orchestrating complex AI workflows:
- Multi-Turn Memory: Add Memory components for contextual understanding
- Conditional Branches: Execute different logic based on user input
- External API Integration: Connect databases, search engines, third-party services
- Multi-Agent Collaboration: Multiple LLM models working together
- RAG Applications: Integrate vector databases for knowledge base Q&A
All achievable through drag-and-drop without writing code.
Summary
With LangBot + Langflow, you can rapidly build powerful multi-platform AI chatbots. Langflow provides visual workflow orchestration while LangBot handles messaging platform integration - together they create a complete loop from workflow design to multi-platform deployment.
This approach is ideal for:
- Scenarios requiring the same AI capabilities across multiple platforms
- Teams wanting rapid iteration and testing of different conversation flows
- Developers wanting to build complex AI applications without deep coding
Related Resources
This article is based on the latest version of LangBot. LangBot supports integration with Dify, n8n, FastGPT, Coze, and other AI platforms - choose the workflow engine that best fits your needs.