<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Tomáš Grasl</title>
    <description>The latest articles on DEV Community by Tomáš Grasl (@tom_grasl_798b66e85f7aa).</description>
    <link>https://dev.to/tom_grasl_798b66e85f7aa</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3731097%2F20611bd9-1a76-4b76-ab4a-d9bf24f36704.jpg</url>
      <title>DEV Community: Tomáš Grasl</title>
      <link>https://dev.to/tom_grasl_798b66e85f7aa</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tom_grasl_798b66e85f7aa"/>
    <language>en</language>
    <item>
      <title>I Built an MCP Server for OpenClaw — Now My AIs Talk to Each Other</title>
      <dc:creator>Tomáš Grasl</dc:creator>
      <pubDate>Mon, 16 Feb 2026 07:55:39 +0000</pubDate>
      <link>https://dev.to/tom_grasl_798b66e85f7aa/i-built-an-mcp-server-for-openclaw-now-my-ais-talk-to-each-other-2can</link>
      <guid>https://dev.to/tom_grasl_798b66e85f7aa/i-built-an-mcp-server-for-openclaw-now-my-ais-talk-to-each-other-2can</guid>
      <description>&lt;p&gt;There's been a lot of hype around OpenClaw (a.k.a. Claw bot) lately. At first, it seemed like a fun toy to play with — nothing more. But over time, I realized it could actually be useful, so I installed it on my own physical server and started experimenting.&lt;/p&gt;

&lt;h2&gt;The Problem with Chat Channels&lt;/h2&gt;

&lt;p&gt;OpenClaw supports communication through Discord and Telegram, but honestly, that felt limiting. Chatting with an AI agent through a messaging app takes away a lot of its potential. So I asked myself: &lt;strong&gt;what if I built an MCP server and connected it directly to Claude?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;So I Built It&lt;/h2&gt;

&lt;p&gt;I created an MCP (Model Context Protocol) server for OpenClaw — &lt;a href="https://github.com/freema/openclaw-mcp" rel="noopener noreferrer"&gt;check it out on GitHub&lt;/a&gt; — and I have to say, watching one AI communicate with another AI is both hilarious and surprisingly effective. What started as a fun experiment turned out to be incredibly powerful.&lt;/p&gt;

&lt;h2&gt;Real-World Use Cases&lt;/h2&gt;

&lt;p&gt;Here's where it gets interesting. Let me walk you through two workflows I'm actually using.&lt;/p&gt;

&lt;h3&gt;Auto-Fixing Bugs from Sentry&lt;/h3&gt;

&lt;p&gt;Imagine you have an n8n automation that reacts to bugs coming from Sentry. Here's the flow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A bug comes in from Sentry&lt;/li&gt;
&lt;li&gt;The n8n workflow picks it up and an AI node evaluates it — &lt;em&gt;yes, this should be fixed&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;Via MCP, it creates a prompt for OpenClaw&lt;/li&gt;
&lt;li&gt;Claw clones the project repository on the server&lt;/li&gt;
&lt;li&gt;It spins up Claude Code, which fixes the bug&lt;/li&gt;
&lt;li&gt;Claw creates a pull request on GitHub&lt;/li&gt;
&lt;li&gt;It reports back to the automation with the PR link&lt;/li&gt;
&lt;li&gt;You get a notification via Pushover or Slack&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;And yes, this actually works.&lt;/strong&gt;&lt;/p&gt;
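
&lt;p&gt;To make step 3 concrete, here's a minimal sketch of how such a prompt could be assembled before it goes into the OpenClaw MCP tool call. The struct fields and example values are illustrative, not Sentry's exact webhook schema:&lt;/p&gt;

```go
package main

import "fmt"

// SentryIssue holds the handful of fields the automation needs.
// The field names are illustrative, not Sentry's exact payload schema.
type SentryIssue struct {
	Title     string
	Culprit   string
	Permalink string
}

// buildPrompt turns a Sentry issue into a prompt for the OpenClaw MCP tool.
func buildPrompt(issue SentryIssue, repo string) string {
	return fmt.Sprintf(
		"Clone %s, reproduce and fix this bug, then open a PR.\nBug: %s\nLocation: %s\nDetails: %s",
		repo, issue.Title, issue.Culprit, issue.Permalink)
}

func main() {
	issue := SentryIssue{
		Title:     "TypeError: cannot read property 'id' of undefined",
		Culprit:   "src/auth/session.js",
		Permalink: "https://sentry.io/organizations/acme/issues/12345/",
	}
	fmt.Println(buildPrompt(issue, "github.com/acme/webapp"))
}
```

&lt;p&gt;In the real workflow, the n8n AI node decides whether to build this prompt at all, and the MCP call carries it over to Claw.&lt;/p&gt;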

&lt;h3&gt;Jira Task Management with Claude Code&lt;/h3&gt;

&lt;p&gt;Here's another one. You have the OpenClaw MCP connected to Claude Code on your localhost. You have a task in Jira, but you don't want to load Claude Code with a dozen different MCP servers — that eats up your context window. Instead, you only have the OpenClaw MCP.&lt;/p&gt;

&lt;p&gt;You ask Claw to look up the Jira task and send back what needs to be done. It returns the task as a ready-to-go prompt for Claude Code, and you can start working immediately. Once you're done, you notify Jira through Claw that the task is complete. Clean and simple.&lt;/p&gt;
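
&lt;p&gt;Hooking the OpenClaw MCP into Claude Code takes a project-scoped &lt;code&gt;.mcp.json&lt;/code&gt;. The &lt;code&gt;mcpServers&lt;/code&gt; shape below is Claude Code's standard config format; the command and args are placeholders, so check the openclaw-mcp README for the real invocation:&lt;/p&gt;

```json
{
  "mcpServers": {
    "openclaw": {
      "command": "node",
      "args": ["/path/to/openclaw-mcp/dist/index.js"]
    }
  }
}
```

&lt;p&gt;With that in place, Claude Code sees the OpenClaw tools without you loading a dozen other MCP servers.&lt;/p&gt;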

&lt;h2&gt;A Word of Caution&lt;/h2&gt;

&lt;p&gt;OpenClaw is powerful, but it can also be dangerous if you don't configure it properly. It's gone off the rails on me a few times already. Make sure you set clear boundaries and review what it's doing, especially when it has access to your repositories and external services.&lt;/p&gt;

&lt;p&gt;As an experiment and a productivity booster? Absolutely worth it. Just keep it on a leash. 🦞&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>automation</category>
      <category>opensource</category>
      <category>ai</category>
    </item>
    <item>
      <title>I Built a Small Self-Hosted Runner for AI Coding Tasks</title>
      <dc:creator>Tomáš Grasl</dc:creator>
      <pubDate>Sun, 15 Feb 2026 16:42:35 +0000</pubDate>
      <link>https://dev.to/tom_grasl_798b66e85f7aa/i-built-a-small-self-hosted-runner-for-ai-coding-tasks-3a37</link>
      <guid>https://dev.to/tom_grasl_798b66e85f7aa/i-built-a-small-self-hosted-runner-for-ai-coding-tasks-3a37</guid>
      <description>&lt;p&gt;I built a small runner for self-hosted AI coding. It's similar to Claude Code but isolated in Docker and fully self-hosted.&lt;/p&gt;

&lt;p&gt;I actually like what Anthropic did with &lt;a href="https://claude.ai" rel="noopener noreferrer"&gt;Claude Code on the web&lt;/a&gt; — for small tasks it works fine. But I needed something I could hook up not just to GitHub but also to GitLab. So I said to myself: I'll just write it. And for this kind of thing, Go is the best fit.&lt;/p&gt;

&lt;h2&gt;What It Does&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/freema/codeforge" rel="noopener noreferrer"&gt;CodeForge&lt;/a&gt; receives task requests via REST API, clones your repository, runs an AI CLI tool (Claude Code) against it, streams progress via Redis Pub/Sub, and optionally creates pull requests. It supports multi-turn conversations, webhook callbacks, and workspace lifecycle management.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Client                  CodeForge                          AI CLI
  │                        │                                 │
  │  POST /api/v1/tasks    │                                 │
  ├───────────────────────▶│                                 │
  │         201 {id}       │                                 │
  │◀───────────────────────┤                                 │
  │                        │  git clone ──▶ run CLI          │
  │                        ├────────────────────────────────▶│
  │    Redis Pub/Sub       │         stream-json events      │
  │◀ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─┤◀────────────────────────────────┤
  │                        │                                 │
  │  GET /tasks/{id}       │           result + diff         │
  ├───────────────────────▶│◀────────────────────────────────┤
  │     200 {result}       │                                 │
  │◀───────────────────────┤                                 │
  │                        │                                 │
  │  POST /tasks/{id}/     │                                 │
  │       create-pr        │  git push ──▶ GitHub/GitLab API │
  ├───────────────────────▶├─────────────────────────────────▶
  │     200 {pr_url}       │                                 │
  │◀───────────────────────┤                                 │
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You POST a task with a repo URL and a prompt. CodeForge clones the repo, spawns the AI CLI, streams progress events through Redis Pub/Sub, and when it's done, you can tell it to create a pull request. That's it.&lt;/p&gt;
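
&lt;p&gt;On the subscriber side, the progress stream is just line-delimited JSON, so a thin decoder is enough. The event fields and types below are invented for illustration; CodeForge's actual schema may differ:&lt;/p&gt;

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

// progressEvent mirrors the kind of event a subscriber might receive.
// The field names are invented for illustration, not CodeForge's schema.
type progressEvent struct {
	TaskID string `json:"task_id"`
	Type   string `json:"type"`
	Detail string `json:"detail"`
}

// decodeEvents reads a stream of concatenated JSON events and formats
// one log line per event.
func decodeEvents(stream string) ([]string, error) {
	dec := json.NewDecoder(strings.NewReader(stream))
	var lines []string
	for {
		var ev progressEvent
		if err := dec.Decode(&ev); err == io.EOF {
			break
		} else if err != nil {
			return nil, err
		}
		lines = append(lines, fmt.Sprintf("[%s] %s: %s", ev.TaskID, ev.Type, ev.Detail))
	}
	return lines, nil
}

func main() {
	sample := `{"task_id":"t1","type":"clone","detail":"repo cloned"}
{"task_id":"t1","type":"cli","detail":"running the AI CLI"}
{"task_id":"t1","type":"done","detail":"result ready"}`

	lines, err := decodeEvents(sample)
	if err != nil {
		fmt.Println("decode error:", err)
		return
	}
	for _, line := range lines {
		fmt.Println(line)
	}
}
```

&lt;p&gt;In practice you'd feed the decoder from the Redis subscription instead of a string, but the parsing step is the same.&lt;/p&gt;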

&lt;h2&gt;Quick Start&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Start the server (requires Docker + docker-compose)&lt;/span&gt;
docker compose &lt;span class="nt"&gt;-f&lt;/span&gt; deployments/docker-compose.yaml &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-f&lt;/span&gt; deployments/docker-compose.dev.yaml up &lt;span class="nt"&gt;--build&lt;/span&gt;

&lt;span class="c"&gt;# Create a task&lt;/span&gt;
curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST http://localhost:8080/api/v1/tasks &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Authorization: Bearer dev-token"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{
    "repo_url": "https://github.com/user/repo.git",
    "access_token": "ghp_your_token",
    "prompt": "Fix the failing tests in the auth module"
  }'&lt;/span&gt;

&lt;span class="c"&gt;# Check status&lt;/span&gt;
curl http://localhost:8080/api/v1/tasks/&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Authorization: Bearer dev-token"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you have &lt;a href="https://taskfile.dev/" rel="noopener noreferrer"&gt;Task&lt;/a&gt; installed, just run &lt;code&gt;task dev&lt;/code&gt; instead of the docker compose command.&lt;/p&gt;

&lt;h2&gt;Why Go?&lt;/h2&gt;

&lt;p&gt;I considered Node.js (my second language after PHP) and Rust, but Go hit the sweet spot:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Single binary deployment&lt;/strong&gt; — no runtime dependencies, no &lt;code&gt;node_modules&lt;/code&gt;, just copy and run&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Excellent concurrency&lt;/strong&gt; — goroutines handle multiple simultaneous tasks without callback gymnastics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Low memory footprint&lt;/strong&gt; — the service itself uses ~15MB RAM, leaving resources for the actual AI CLI&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fast startup&lt;/strong&gt; — critical for containerized environments where pods come and go&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Battle-tested HTTP/Redis libraries&lt;/strong&gt; — &lt;code&gt;net/http&lt;/code&gt;, &lt;code&gt;go-redis&lt;/code&gt;, &lt;code&gt;chi&lt;/code&gt; router — mature, stable, boring (in a good way)&lt;/li&gt;
&lt;/ul&gt;
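
&lt;p&gt;The concurrency bullet is worth a tiny illustration. This is not CodeForge's actual code, just the fan-out pattern it leans on: one goroutine per task, joined with a &lt;code&gt;sync.WaitGroup&lt;/code&gt;:&lt;/p&gt;

```go
package main

import (
	"fmt"
	"sync"
)

// runTask stands in for the real work: cloning a repo and driving the AI CLI.
func runTask(id string) string {
	return fmt.Sprintf("task %s done", id)
}

func main() {
	ids := []string{"a", "b", "c"}
	results := make([]string, len(ids))

	var wg sync.WaitGroup
	for i, id := range ids {
		wg.Add(1)
		go func(i int, id string) {
			defer wg.Done()
			// Each slot is written by exactly one goroutine, so no mutex needed.
			results[i] = runTask(id)
		}(i, id)
	}
	wg.Wait()

	for _, r := range results {
		fmt.Println(r)
	}
}
```

&lt;p&gt;No callback gymnastics, no promise chains; the runtime schedules the work and &lt;code&gt;Wait&lt;/code&gt; blocks until every task reports in.&lt;/p&gt;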

&lt;h2&gt;Real-World Use Cases&lt;/h2&gt;

&lt;p&gt;Here are some ways I'm using CodeForge (or planning to):&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Automated issue fixing&lt;/strong&gt; — GitHub/GitLab webhook fires when an issue gets the &lt;code&gt;ai-fix&lt;/code&gt; label. A small middleware translates it into a CodeForge task. Minutes later, a PR appears.&lt;/p&gt;
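
&lt;p&gt;A sketch of what that middleware's core could look like in Go. The webhook fields follow GitHub's issue-event payload, the task body mirrors the Quick Start request (minus the access token), and the whole thing is illustrative rather than a drop-in handler:&lt;/p&gt;

```go
package main

import (
	"encoding/json"
	"fmt"
)

// issueEvent holds the few webhook fields the middleware needs,
// a simplified slice of GitHub's issue event payload.
type issueEvent struct {
	Action string `json:"action"`
	Label  struct {
		Name string `json:"name"`
	} `json:"label"`
	Issue struct {
		Title string `json:"title"`
		Body  string `json:"body"`
	} `json:"issue"`
	Repository struct {
		CloneURL string `json:"clone_url"`
	} `json:"repository"`
}

// toTask converts a labeled issue into the JSON body for POST /api/v1/tasks.
// It returns ok=false when the event is not an "ai-fix" labeling.
func toTask(ev issueEvent) (string, bool) {
	if ev.Action != "labeled" || ev.Label.Name != "ai-fix" {
		return "", false
	}
	// Marshal of map[string]string cannot fail, so the error is ignored.
	body, _ := json.Marshal(map[string]string{
		"repo_url": ev.Repository.CloneURL,
		"prompt":   fmt.Sprintf("Fix this issue: %s\n\n%s", ev.Issue.Title, ev.Issue.Body),
	})
	return string(body), true
}

func main() {
	var ev issueEvent
	ev.Action = "labeled"
	ev.Label.Name = "ai-fix"
	ev.Issue.Title = "Login button unresponsive"
	ev.Issue.Body = "Clicking login does nothing on Safari."
	ev.Repository.CloneURL = "https://github.com/user/repo.git"
	if payload, ok := toTask(ev); ok {
		fmt.Println(payload)
	}
}
```

&lt;p&gt;POST the returned JSON to &lt;code&gt;/api/v1/tasks&lt;/code&gt; with an auth header and CodeForge takes it from there.&lt;/p&gt;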

&lt;p&gt;&lt;strong&gt;Nightly code maintenance&lt;/strong&gt; — Cron job that runs prompts like "update deprecated API calls" or "improve error messages" across multiple repos.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Non-developer requests&lt;/strong&gt; — Product managers submit tasks through a simple form: "Add a loading spinner to the dashboard page." CodeForge handles the rest.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Code review bot&lt;/strong&gt; — After a human creates a PR, trigger CodeForge to review it with a separate AI instance and post comments.&lt;/p&gt;

&lt;h2&gt;What's on the Roadmap&lt;/h2&gt;

&lt;p&gt;CodeForge currently uses Claude Code as its AI backend, but the architecture is designed to be CLI-agnostic. Here's what's coming:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multi-CLI support&lt;/strong&gt; — runners for OpenCode, Codex, and other AI coding CLIs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Task sessions&lt;/strong&gt; — cross-task memory so the AI remembers context from previous tasks on the same repo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automated code review&lt;/strong&gt; — a second AI pass reviews changes before creating a PR&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Subscription &amp;amp; multi-user auth&lt;/strong&gt; — per-user API keys and usage tracking&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Try It Out&lt;/h2&gt;

&lt;p&gt;The repo is at &lt;a href="https://github.com/freema/codeforge" rel="noopener noreferrer"&gt;github.com/freema/codeforge&lt;/a&gt;. I'd love feedback — especially on the API design and the multi-CLI architecture that's coming next.&lt;/p&gt;

&lt;p&gt;If you have questions or ideas, open an issue or find me on &lt;a href="https://github.com/freema" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>automation</category>
      <category>go</category>
      <category>showdev</category>
    </item>
  </channel>
</rss>
