Junyan Qin (Chin)

How I Connected Dify Chatflow to My Multi-Platform Bot in Minutes

LangBot is a multi-platform chatbot framework that supports QQ, WeChat, Discord, Telegram, Slack, LINE, DingTalk, and Lark. One of its most powerful features is the support for multiple AI runners, including workflow platforms like Dify, n8n, Langflow, and Coze, as well as direct integration with OpenAI, Claude, and other LLM services.

Dify is an open-source LLM application development platform with visual Chatflow functionality, allowing you to build intelligent conversational applications without coding. This guide shows you how to integrate Dify Chatflow into LangBot to give your bot powerful AI capabilities.

What You Need

  • A running LangBot instance
  • A Dify account (you can use cloud.dify.ai)

Step 1: Deploy LangBot

If you haven't deployed LangBot yet, quickly start it with Docker:

docker run -d \
  -p 5300:5300 \
  -v ./data:/app/data \
  --name langbot \
  rockchin/langbot:latest
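If you prefer Docker Compose, the same container can be sketched as a docker-compose.yaml. This mirrors the image, port mapping, and volume from the command above; the restart policy is an added convenience, not part of the original command:

```yaml
services:
  langbot:
    image: rockchin/langbot:latest
    container_name: langbot
    ports:
      - "5300:5300"
    volumes:
      - ./data:/app/data
    # Assumption: auto-restart is usually wanted for a long-running bot
    restart: unless-stopped
```

Start it with `docker compose up -d`.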

Visit http://localhost:5300 to access the management interface.

Default credentials:

LangBot Management Interface

Step 2: Create Dify Chatflow Application

Visit cloud.dify.ai and log in. On the Studio page, click "Create from Blank" to create a new application.

Dify Studio

Select "Chatflow" type - a conversational application with workflow support. Enter an application name (e.g., "LangBot Demo Chatflow") and create it.

Create Chatflow

Step 3: Configure the Workflow

After creation, you'll see a visual workflow editor with three default nodes:

  • START: Receives user input
  • LLM: Processes conversation
  • ANSWER: Returns response

Workflow Editor

Click the LLM node and add a system prompt in the "System" section:

You are a helpful AI assistant integrated with LangBot. You can answer questions and help users with various tasks.

Configure Prompt

You can add more nodes as needed, such as knowledge base retrieval, external API calls, and conditional logic, to build richer conversational capabilities.

Step 4: Get API Key

Click "Publish" in the top right corner to publish your application, then click "API Access" to enter the API access page.

API Access

Click "Create new Secret key" to create an API key. Copy the generated key (format: app-xxxxxxxxxxxxxxxxxxxxxxxx).

Create Key

Note down:

  • API Server: https://api.dify.ai/v1
  • API Key: The key you just created
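Before pasting these values into LangBot, a quick shell sanity check can catch a truncated or mis-copied key. The key below is the placeholder format from above (substitute your real one), and the commented curl call is an illustrative direct request to Dify's /chat-messages chat endpoint:

```shell
# Values from Step 4 -- API_KEY is a placeholder, substitute your real key.
API_KEY="app-xxxxxxxxxxxxxxxxxxxxxxxx"
BASE_URL="https://api.dify.ai/v1"

# Dify app keys start with "app-"; any other prefix usually means a bad copy/paste.
case "$API_KEY" in
  app-*) KEY_FORMAT="ok" ;;
  *)     KEY_FORMAT="bad" ;;
esac
echo "key format: $KEY_FORMAT"

# With a real key, this returns a JSON body containing an "answer" field:
# curl -s -X POST "$BASE_URL/chat-messages" \
#   -H "Authorization: Bearer $API_KEY" \
#   -H "Content-Type: application/json" \
#   -d '{"inputs": {}, "query": "Hello", "response_mode": "blocking", "user": "demo"}'
```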

Step 5: Configure LangBot

Return to the LangBot management interface, go to the "Pipelines" page, and click "+" to create a new pipeline.

Enter pipeline information:

  • Name: Dify Chatflow Pipeline
  • Description: Pipeline using Dify Chatflow as the AI runner

Switch to the "AI" tab, select "Dify Service API" in the "Runner" dropdown, and configure parameters:

  • Base URL: https://api.dify.ai/v1
  • App Type: Chat (Important: Chatflow apps use Chat type, not Workflow)
  • API Key: Paste your API key

Configure Runner

Click "Save" to apply the configuration.

Step 6: Test It

In the pipeline edit dialog, click "Debug Chat" to enter the debug interface. Enter a test message (like "Hello! This is a test message.") and press Enter.

Test Success

If configured correctly, you'll see a reply from Dify Chatflow, indicating successful integration.

Common Issues

Error: not_workflow_app
This is due to incorrect App Type configuration. Chatflow apps must use "Chat" type, not "Workflow" type.

Invalid API Key
Ensure the key format is correct (starts with app-), has been created and activated in Dify, and Base URL is set to https://api.dify.ai/v1.

Connection Timeout
Check network connectivity, Dify service accessibility, and firewall settings.
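A minimal reachability probe can narrow this down (assuming curl is installed). Any three-digit HTTP status means the endpoint is reachable; 000 or no output points at DNS, proxy, or firewall problems:

```shell
BASE_URL="https://api.dify.ai/v1"
# curl's %{http_code} prints 000 when the connection itself fails;
# "|| true" keeps the script going even if curl errors out or is missing.
HTTP_CODE=$(curl -s -o /dev/null -m 10 -w "%{http_code}" "$BASE_URL" 2>/dev/null) || true
if [ -z "$HTTP_CODE" ] || [ "$HTTP_CODE" = "000" ]; then
  echo "cannot reach $BASE_URL -- check DNS, proxy, and firewall settings"
else
  echo "reached $BASE_URL (HTTP $HTTP_CODE)"
fi
```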

What's Next

Now you can bind this pipeline to your bot and let it process user messages through Dify Chatflow. In Dify, you can further:

  • Add knowledge bases for RAG (Retrieval Augmented Generation)
  • Integrate external tools and APIs
  • Use conditional nodes for complex logic
  • Add variable transformations and data processing

LangBot supports configuring multiple pipelines simultaneously, allowing you to configure different AI capabilities for different scenarios and platforms. It also supports other AI platforms like n8n, Langflow, FastGPT, Coze, as well as direct integration with OpenAI, Claude, Google Gemini, and other LLM services.
