Kim Namhyun
🔐 Why a GitHub-Based Store? — Security and Community Sharing for Local AI Agents

How Xoul Platform safely shares workflows, personas, and code snippets

👤 Why Local AI Agent Security Matters

Local AI Agents execute code directly on the user's machine. This is powerful — but it also carries serious security risks.

What if someone shares malicious code?

```python
# Looks like a "stock price checker" but...
import os
os.system("rm -rf /")  # 💀 System destroyed
```

When you import code from a community Store, that code runs directly on your local machine. File deletion, data theft, malware installation — all possible.

This is why every shared item must go through verification.
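What might that verification look for? Below is a minimal sketch of the kind of static check a reviewer (or a pre-review bot) could run on a submitted snippet. The blocked names are illustrative only, not Xoul's actual policy, and static scanning is trivially bypassable — it complements human PR review, it doesn't replace it.

```python
# Sketch: flag obviously dangerous calls in submitted Python code.
# DANGEROUS_CALLS is an illustrative allowlist-of-badness, not a real policy.
import ast

DANGEROUS_CALLS = {"os.system", "subprocess.call", "eval", "exec"}

def _dotted_name(node: ast.AST) -> str:
    """Reconstruct a dotted name like 'os.system' from a call target."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Attribute):
        return f"{_dotted_name(node.value)}.{node.attr}"
    return ""

def flag_dangerous_calls(source: str) -> list[str]:
    """Return the dotted names of suspicious calls found in the source."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and _dotted_name(node.func) in DANGEROUS_CALLS:
            hits.append(_dotted_name(node.func))
    return hits

snippet = 'import os\nos.system("rm -rf /")'
print(flag_dangerous_calls(snippet))  # → ['os.system']
```

A check like this catches the naive "stock price checker" above, which is exactly why a human review step is still the last line of defense.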


🛡️ GitHub PR-Based Sharing System

We solve this with a GitHub Pull Request based sharing system.

Core Principles

  1. All shares are submitted as PRs — review requests, not direct publishing
  2. Only approved items get published — malicious code blocked upfront
  3. Code is 100% transparent — every line visible for review
```
📤 Share Request → GitHub PR → 🔍 Admin Review → ✅ Merge → 🌐 Published to Store
```

Why GitHub?

| Criteria | GitHub PR | Direct Upload |
| --- | --- | --- |
| Code Transparency | ✅ Full diff review | ❌ Opaque contents |
| Review Process | ✅ Built-in code review | ❌ Must build separately |
| Version Control | ✅ Git history | ❌ None |
| Community Contribution | ✅ Fork/PR open-source pattern | ❌ Closed ecosystem |
| Cost | ✅ Free | 💰 Storage/DB costs |

🔄 Sharing Sequence Flow

Let's walk through the complete flow.

Step 1: User Initiates Share

Click the 📤 button in the desktop app's list view — a share request is sent via chat.

```
[ Workflow List ]
Name         Description          Actions
Test WF      Test workflow         ▶ ✏ 📤 🗑
                                       ↑ This button!
```

Step 2: LLM Calls the Tool

```
User → "Run share_to_store(share_type="workflow", name="Test WF")"
  ↓
LLM → 🔧 share_to_store tool call
```
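For the LLM to make that call, the runtime has to describe the tool to it. Here is a hypothetical schema for share_to_store in the common function-calling style; the post only shows the share_type and name arguments, so everything else (descriptions, the enum values) is an assumption.

```python
# Hypothetical tool schema for share_to_store (function-calling style).
# Only share_type and name appear in the post; the rest is illustrative.
SHARE_TOOL = {
    "name": "share_to_store",
    "description": "Submit a local item to the community Store as a GitHub PR.",
    "parameters": {
        "type": "object",
        "properties": {
            "share_type": {
                "type": "string",
                "enum": ["workflow", "persona", "code"],  # assumed item types
            },
            "name": {"type": "string"},
        },
        "required": ["share_type", "name"],
    },
}

# The LLM matches the user's chat message against this schema and emits a
# structured call that the VM-side runtime then executes.
print(SHARE_TOOL["parameters"]["required"])  # → ['share_type', 'name']
```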

Step 3: VM Reads DB → Calls API

```
[VM Server]
  ├── Query item data from SQLite DB
  ├── Build ShareRequest payload
  └── POST to web server /api/share
         ↓
[EC2 Web Server]
  ├── GitHub API: Get main branch SHA
  ├── Create branch: share/workflow/test_1234
  ├── Commit file (code/JSON/markdown)
  ├── Update manifest.json
  └── Create Pull Request
         ↓
[GitHub]
  └── PR: "Share workflow: Test WF"
       → Awaiting admin review
```
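The web server's GitHub leg of this flow can be sketched as a few payload builders. The branch and PR title formats come from the diagram above; the endpoint routes in the comments are GitHub's REST v3 paths, but the repo layout and helper names are assumptions. Pure builders are shown so the shapes are checkable without network access.

```python
# Sketch of the EC2 web server's GitHub calls in Step 3.
# Helper names and file layout are hypothetical; endpoint routes in the
# comments are the standard GitHub REST v3 paths.

def build_branch_name(share_type: str, slug: str) -> str:
    """Branch per share, e.g. 'share/workflow/test_1234'."""
    return f"share/{share_type}/{slug}"

def build_ref_payload(branch: str, main_sha: str) -> dict:
    """Body for POST /repos/{owner}/{repo}/git/refs (create the branch)."""
    return {"ref": f"refs/heads/{branch}", "sha": main_sha}

def build_pr_payload(share_type: str, name: str, branch: str) -> dict:
    """Body for POST /repos/{owner}/{repo}/pulls (open the review PR)."""
    return {
        "title": f"Share {share_type}: {name}",
        "head": branch,
        "base": "main",
    }

branch = build_branch_name("workflow", "test_1234")
print(build_pr_payload("workflow", "Test WF", branch)["title"])
# → Share workflow: Test WF
```

In a real implementation these payloads would be sent with an authenticated HTTP client, and the "Get main branch SHA" step would supply main_sha for the ref payload.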

Step 4: Review & Publish

```
[Admin]
  ├── Review PR code
  ├── Check for malicious content
  └── Approve & Merge
         ↓
[Store]
  └── ✅ Available for community import
```


🤖 Agent-Based Implementation — LLM Calls, Not Direct Code Calls

The most interesting design decision: the share function is not called directly from the desktop app, but through the LLM Agent.

Why Not Direct Calls?

Problem 1: No Direct Desktop ↔ DB Access

```
[Desktop App (Windows)] ←✗→ [DB (VM Linux)]
```

The desktop app runs on Windows, but workflow/persona/code data lives in the VM's SQLite database. Direct access is impossible.

Problem 2: Complex Cache Management

Direct calls would require caching data during list rendering, handling cache misses, pre-fetching lists... it gets complicated fast.

Agent-Based Solution

```
📤 Button Click
  → Chat message sent: "share_to_store(...) run this"
    → LLM calls share_to_store tool
      → Executes on VM (DB access available!)
        → Calls web server API
          → GitHub PR created
```

Everything goes through the Agent. This is Xoul's core philosophy:

🧠 "Every capability exists as an AI Agent tool. The UI is simply an interface that sends requests to the Agent."

Benefits of this approach:

| Benefit | Description |
| --- | --- |
| Unified Interface | Buttons, voice, or text — all trigger the same tool |
| Natural Language | "Share my test workflow" just works |
| Easy Extension | Add a tool = add a feature |
| Environment Independent | Works the same on VM or Windows |
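The "unified interface" row can be made concrete with a tiny tool registry: a button click and a typed chat message both reduce to the same dispatch call. All names here are illustrative, not Xoul's actual internals.

```python
# Sketch of "every capability is a tool": the UI never implements sharing
# itself, it only produces a tool call. Names are illustrative.
from typing import Callable

TOOLS: dict[str, Callable[..., str]] = {}

def tool(fn: Callable[..., str]) -> Callable[..., str]:
    """Register a function as an agent-callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def share_to_store(share_type: str, name: str) -> str:
    # Real implementation would query SQLite and POST to /api/share.
    return f"PR opened for {share_type} '{name}'"

def dispatch(tool_name: str, **kwargs) -> str:
    """Both a 📤 button click and a chat message end up here."""
    return TOOLS[tool_name](**kwargs)

print(dispatch("share_to_store", share_type="workflow", name="Test WF"))
# → PR opened for workflow 'Test WF'
```

Adding a new feature then really is just registering one more function — the UI, voice, and chat paths pick it up for free.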
