Building a Dedicated AI Assistant Environment with a Single Mini PC
When starting a new company, I hit a contradiction: "Don't want to install random software on the work laptop" versus "Want to use an AI assistant." The fix: get one Mini PC and put everything on it.
Background
Company security policy prohibits installing extra software on the work Mac. But I needed an AI assistant (OpenClaw) for daily work. Since I work from home and everything stays on the local network, I decided to add a Mini PC (GMK) as a dedicated environment.
What I needed:
- File sharing (exchange documents with the work Mac)
- Chat UI (an interface to talk to the AI assistant)
- The AI agent itself
Architecture
```
Work Mac (Finder / Browser)
 │
 ├── smb://192.168.x.x/share   →  Samba (file sharing)
 └── http://192.168.x.x:8065   →  Mattermost (chat UI)
                                       ↕
                              OpenClaw Agent ("Joy")
```
Everything runs on the Mini PC (Ubuntu 24.04, 16GB RAM). LLM inference happens in the cloud (Anthropic API), so local specs can be minimal.
Step 1: Samba — File Sharing
A simple file share, accessible from the Mac's Finder via Cmd+K → `smb://192.168.x.x/share`:

```shell
sudo apt-get install samba
sudo mkdir -p /home/openclaw/mdk-share
```
Add a share section to `/etc/samba/smb.conf`:

```ini
[mdk]
path = /home/openclaw/mdk-share
browseable = yes
read only = no
valid users = openclaw
```

Then set a Samba password for the user and restart the service:

```shell
sudo smbpasswd -a openclaw
sudo systemctl restart smbd
```
That's it. Drag and drop files from the Mac.
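Before switching to the Mac, the share can be sanity-checked from the Mini PC itself. A quick sketch, assuming the `smbclient` package is available and using the share name `mdk` and user `openclaw` from the config above:

```shell
# Install the Samba client tools
sudo apt-get install -y smbclient

# List the shares the local server exposes (prompts for the smbpasswd password)
smbclient -L localhost -U openclaw

# Connect to the share itself and list its contents
smbclient //localhost/mdk -U openclaw -c 'ls'
```

If the `mdk` share shows up and `ls` succeeds, the Finder connection should work too.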
Step 2: Mattermost — Chat UI (Docker)
Mattermost serves as the chat interface to the AI assistant. It's a Slack-like tool that you can self-host — that's the key.
```yaml
# docker-compose.yml
services:
  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: mattermost
      POSTGRES_USER: mmuser
      POSTGRES_PASSWORD: <your-password>
    volumes:
      - pgdata:/var/lib/postgresql/data

  mattermost:
    image: mattermost/mattermost-team-edition:latest
    depends_on:
      - db
    environment:
      MM_SQLSETTINGS_DRIVERNAME: postgres
      MM_SQLSETTINGS_DATASOURCE: "postgres://mmuser:<password>@db:5432/mattermost?sslmode=disable"
      MM_SERVICESETTINGS_SITEURL: "http://192.168.x.x:8065"
    ports:
      - "8065:8065"
    volumes:
      - mmdata:/mattermost/data

volumes:
  pgdata:
  mmdata:
```
```shell
docker compose up -d
```
Open http://192.168.x.x:8065 in a browser, complete the initial setup, create an admin account, then create a Bot Account (@assistant) and save the token.
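Before wiring up the agent, the bot token can be verified directly against Mattermost's REST API (the IP and token below are placeholders):

```shell
# /api/v4/users/me returns the account the token belongs to.
# A valid bot token returns JSON for the @assistant bot account;
# an invalid one returns a 401 error object.
curl -s -H "Authorization: Bearer <bot-token>" \
  http://192.168.x.x:8065/api/v4/users/me
```

This catches a revoked or mistyped token early, which is much easier to debug here than through the agent's logs.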
Step 3: OpenClaw — The AI Agent
Install OpenClaw and connect it to the Mattermost bot:
```shell
npm install -g openclaw
openclaw init
openclaw plugins install @openclaw/mattermost
```
Define the Mattermost connection in `openclaw.yaml`:

```yaml
channels:
  mattermost:
    baseUrl: http://localhost:8065   # ← "baseUrl", NOT "url"!
    botToken: <bot-token>
    dmPolicy: open
```
Gotcha: the config key is `baseUrl`, not `url`. The docs and the actual behavior don't quite match; that mismatch cost me 30 minutes.
Write the agent's identity file (SOUL.md) and start with openclaw gateway start. When "connected as @assistant ✅" appears in Mattermost, you're done.
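To survive reboots, the gateway can be wrapped in a systemd unit. This is a sketch under two assumptions: that `openclaw gateway start` runs in the foreground, and that the binary is on the `openclaw` user's PATH:

```ini
# /etc/systemd/system/openclaw-gateway.service
[Unit]
Description=OpenClaw gateway (Mattermost bridge)
# Wait for the network and the Mattermost containers
After=network-online.target docker.service

[Service]
User=openclaw
WorkingDirectory=/home/openclaw
ExecStart=/usr/bin/env openclaw gateway start
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now openclaw-gateway`.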
Step 4: Giving the Agent an Identity
OpenClaw agents define "who they are" through a SOUL.md file. I set this one up as a company tech assistant called "Joy":
- Expertise: Cloud architecture (AWS/GCP/Azure), data engineering, IaC
- Personality: Professional, results-driven, concise
- Model: Claude Sonnet
Type @joy hello in Mattermost and it responds in character.
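For reference, the SOUL.md behind Joy looks roughly like this. Treat it as an illustrative shape rather than a required schema; the exact headings are whatever OpenClaw expects:

```markdown
# SOUL.md — Joy

You are Joy, the in-house tech assistant.

## Expertise
- Cloud architecture: AWS, GCP, Azure
- Data engineering and infrastructure as code

## Personality
- Professional, results-driven, concise

## Constraints
- Never paste secrets or tokens into chat
```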
Time and Cost
- Build time: About 1.5 hours (Samba 10 min + Mattermost 30 min + OpenClaw 30 min + debugging 20 min)
- Hardware: GMK Mini PC (~$200, repurposed from existing inventory)
- Running cost: LLM API fees only (electricity is a few dollars/month)
Lessons Learned
- UI choice matters. Only developers can work in a terminal. Adding Mattermost lets non-engineers use the AI assistant through chat.
- Self-hosting feels secure. Company data doesn't flow to SaaS. The LLM API calls do go out, but chat history and files stay local.
- A Mini PC is enough. LLM inference is in the cloud, so local hardware just relays API calls and serves the chat UI. 16GB RAM is plenty.
- `baseUrl` is not `url`. Trust the docs, but read the source when things don't work.
Conclusion
The need was "use an AI assistant without polluting the work laptop." The solution: Mini PC + Samba + Mattermost + OpenClaw. Built in 1.5 hours, near-zero additional cost (repurposed hardware).
This setup works because it's remote work on a local network. Down the road, VPN + reverse proxy could enable remote access.
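If remote access ever becomes necessary, a minimal nginx reverse proxy in front of Mattermost might look like the sketch below. The hostname and certificate paths are placeholders; the parts that matter are TLS termination and the WebSocket upgrade headers, which Mattermost needs for live chat:

```nginx
server {
    listen 443 ssl;
    server_name chat.example.internal;

    ssl_certificate     /etc/ssl/certs/chat.pem;
    ssl_certificate_key /etc/ssl/private/chat.key;

    location / {
        proxy_pass http://127.0.0.1:8065;
        # WebSocket upgrade, required for Mattermost's real-time features
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Pairing this with a VPN (so the hostname only resolves inside the tunnel) keeps the "nothing exposed to the internet" property intact.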