DEV Community

linou518

Connecting LINE to OpenClaw: Building a Custom Bridge When the Official Plugin Failed

The official plugin didn't work. So I built my own. Simple story — but there were enough gotchas along the way that it's worth documenting.

Background

We're preparing to sell GMK AI Mini PCs with OpenClaw pre-installed. The requirement for Japanese users was support for three messaging channels: Telegram, Slack, and LINE. Telegram and Slack worked without issue. LINE was the problem.

Why the Official LINE Plugin Failed

OpenClaw ships with an official LINE plugin (I'd tried it on a Pi4 previously). I started there this time too.

The short version: the bindings configuration was wrong and caused a Gateway crash loop. The accounts.default field is apparently required, but the documentation is thin. After several attempts to fix openclaw.json and watching it restart into the same crash, I gave up.

Fighting the official plugin wasn't worth the time. Writing my own was faster.

The Custom LINE Bridge

The architecture is simple:

LINE servers → cloudflared tunnel → line_bridge.py (Flask) → OpenClaw chatCompletions API → AI reply

Key points in line_bridge.py

The LINE webhook receives a POST, passes the user's message to OpenClaw's chatCompletions API, and returns the response via the LINE Reply API.

from flask import Flask, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json()
    for event in body.get("events", []):
        # Only handle text messages; LINE also sends stickers, images, etc.
        if event["type"] == "message" and event["message"]["type"] == "text":
            user_msg = event["message"]["text"]
            reply_token = event["replyToken"]

            # Call OpenClaw chatCompletions API
            ai_resp = call_openclaw(user_msg)

            # Send the answer back via the LINE Reply API
            line_reply(reply_token, ai_resp)
    return "OK"
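For completeness, line_reply is just a POST to the LINE Messaging API reply endpoint (https://api.line.me/v2/bot/message/reply) with the channel access token as a Bearer header. A minimal sketch using only the standard library; the helper names and the environment variable are my choices, and note that reply tokens are single-use and expire quickly:

```python
import json
import os
import urllib.request

LINE_REPLY_URL = "https://api.line.me/v2/bot/message/reply"
# Channel access token from the LINE Developers console (env var name is my choice)
CHANNEL_ACCESS_TOKEN = os.environ.get("LINE_CHANNEL_ACCESS_TOKEN", "")


def build_reply_payload(reply_token: str, text: str) -> dict:
    # A reply carries the replyToken plus up to five message objects;
    # a single text message is enough here
    return {
        "replyToken": reply_token,
        "messages": [{"type": "text", "text": text}],
    }


def line_reply(reply_token: str, text: str) -> None:
    req = urllib.request.Request(
        LINE_REPLY_URL,
        data=json.dumps(build_reply_payload(reply_token, text)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {CHANNEL_ACCESS_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()  # LINE returns an empty JSON object on success
```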

The OpenClaw chatCompletions API is disabled by default. You need to add this to openclaw.json:

"gateway": {
  "http": {
    "endpoints": {
      "chatCompletions": {
        "enabled": true
      }
    }
  }
}

Gotcha: Writing gateway.chatCompletions: true gives you an "Unrecognized key" error and crashes the Gateway. The correct path is gateway.http.endpoints.chatCompletions.enabled.
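With the endpoint enabled, call_openclaw is a plain HTTP POST to the local Gateway. A sketch under assumptions: I'm treating the chatCompletions endpoint as OpenAI-compatible, so the URL, port, and field names below are illustrative, not confirmed against OpenClaw's docs:

```python
import json
import os
import urllib.request

# Assumed Gateway URL; adjust port and path to your actual setup
OPENCLAW_URL = os.environ.get(
    "OPENCLAW_CHAT_URL", "http://localhost:18789/v1/chat/completions"
)


def build_chat_request(user_msg: str) -> dict:
    # OpenAI-style chat payload; the field names are an assumption
    return {"messages": [{"role": "user", "content": user_msg}], "stream": False}


def extract_reply(response: dict) -> str:
    # OpenAI-style response shape: content of the first choice's message
    return response["choices"][0]["message"]["content"]


def call_openclaw(user_msg: str) -> str:
    req = urllib.request.Request(
        OPENCLAW_URL,
        data=json.dumps(build_chat_request(user_msg)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return extract_reply(json.load(resp))
```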

Receiving webhooks via cloudflared

LINE Bot webhooks require an HTTPS URL. To expose a local server without port forwarding, cloudflared is the fastest option:

cloudflared tunnel --url http://localhost:3900

This generates a random URL (e.g. https://substance-deeper-pharmacy-councils.trycloudflare.com). Set that as the Webhook URL in the LINE Developers console and you're done.
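One caveat: quick tunnels get a new random URL every restart, so you have to re-paste the Webhook URL into the console each time. If that gets old, a named tunnel keeps the URL stable (it requires a free Cloudflare account and a domain on Cloudflare). A sketch; the tunnel name and hostname are placeholders:

```shell
# One-time setup (tunnel name and hostname are placeholders)
cloudflared tunnel login
cloudflared tunnel create line-bridge
cloudflared tunnel route dns line-bridge line.example.com

# ~/.cloudflared/config.yml
# tunnel: line-bridge
# credentials-file: /home/openclaw/.cloudflared/<tunnel-id>.json
# ingress:
#   - hostname: line.example.com
#     service: http://localhost:3900
#   - service: http_status:404

cloudflared tunnel run line-bridge
```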

Running as a systemd service

Keep line_bridge.py running as a systemd user service:

[Unit]
Description=LINE Bridge for OpenClaw

[Service]
ExecStart=/usr/bin/python3 /home/openclaw/line_bridge.py
Restart=always

[Install]
WantedBy=default.target
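To actually enable it, assuming the unit is saved as ~/.config/systemd/user/line-bridge.service (the unit name is my choice):

```shell
# Pick up the new unit and start it now + on boot
systemctl --user daemon-reload
systemctl --user enable --now line-bridge.service

# User services normally stop at logout; lingering keeps them running
# on a headless box without an active login session
sudo loginctl enable-linger openclaw

# Follow the bridge's logs
journalctl --user -u line-bridge.service -f
```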

Results

Full flow verified: LINE → line_bridge → OpenClaw → AI reply. Webhook Verify passed. Japanese conversation worked without issues.

Lessons Learned

  1. When the official OpenClaw plugin fails, a custom bridge via the chatCompletions API is the pragmatic path
  2. Wrong chatCompletions API enable path crashes the Gateway — the correct path is gateway.http.endpoints.chatCompletions.enabled
  3. cloudflared is the fastest way to expose a local server over HTTPS — no certificates, no port forwarding required
  4. Don't use streaming: partial with LINE; replies arrive one character at a time, which is terrible UX. Use streaming: off.

When the official path is blocked, reconnect at the API level. That's usually the fastest route.


Tags: OpenClaw LINE Bot cloudflared Python Flask HomeServer AIAgent GMKMiniPC
