Go-Mirofish: fast local Go AI swarm engine – predict anything with agents
I needed a fast offline tool to test reactions to documents. I upload an earnings report, a trading thesis, or a product announcement, and I want to see how hundreds of different AI agents respond in a social simulation. No data leaves my machine, and there are no slow cloud round trips. So I built go-mirofish.

You feed the tool any document. It builds a knowledge graph, creates hundreds of AI agents, each with its own personality, and runs a full social simulation. You get a prediction report, and you can chat with any agent to ask follow-up questions.

I rewrote the control plane in pure Go, so Python no longer runs the hot path. The system now delivers sub-2 ms p50 latency and handles 198 requests per second with zero errors. It runs on a standard laptop, and even on a Raspberry Pi 5.
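The document-to-report pipeline can be sketched roughly like this. All type and function names below are illustrative stand-ins, not go-mirofish's actual API:

```go
package main

import "fmt"

// Illustrative types; the real engine's data model is much richer.
type KnowledgeGraph struct {
	Entities []string
}

type Agent struct {
	Name        string
	Personality string
}

type Report struct {
	Positive, Negative int
}

// buildGraph extracts entities from the uploaded document (toy version).
func buildGraph(doc string) KnowledgeGraph {
	return KnowledgeGraph{Entities: []string{doc}}
}

// spawnAgents creates n agents, cycling through a few personalities.
func spawnAgents(n int) []Agent {
	personas := []string{"skeptic", "optimist", "contrarian"}
	agents := make([]Agent, n)
	for i := range agents {
		agents[i] = Agent{
			Name:        fmt.Sprintf("agent-%d", i),
			Personality: personas[i%len(personas)],
		}
	}
	return agents
}

// runSimulation lets each agent react to the graph and tallies sentiment.
func runSimulation(g KnowledgeGraph, agents []Agent) Report {
	_ = g // a real simulation would route graph facts to each agent
	var r Report
	for _, a := range agents {
		if a.Personality == "optimist" {
			r.Positive++
		} else {
			r.Negative++
		}
	}
	return r
}

func main() {
	g := buildGraph("Q3 earnings report")
	agents := spawnAgents(6)
	r := runSimulation(g, agents)
	fmt.Printf("%d agents: %d positive, %d negative\n", len(agents), r.Positive, r.Negative)
}
```

The real engine adds per-agent memory, the graph store, and multi-round interaction between agents, but the control flow follows this shape: graph in, agents out, report at the end.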
- Get started in minutes: https://go-mirofish.vercel.app/docs
Screenshots:
- Home / entry
- Simulation run
- Report generation
- Report timeline / tools
- Simulation history
- Deep interaction
- Split view: graph, workbench & system terminal
- Graph view & node details
Thanks to these projects: MiroFish created the original swarm engine; MiroFish-Offline added the local English version; OASIS powers the multi-agent simulation; Neo4j manages the graph memory; Ollama runs the local models.

I built go-mirofish for real work. Traders test market reactions and liquidation forecasts. Product teams run PR war rooms. Teams practice cyber drills. Writers explore alternate story endings.
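Because Ollama serves models over a local REST API, a Go client can talk to it with the standard library alone. Here is a minimal sketch assuming the default endpoint at `http://localhost:11434/api/generate`; it only builds and prints the request (sending it requires a running Ollama daemon), and go-mirofish's real integration may differ:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// generateRequest mirrors the JSON body of Ollama's /api/generate endpoint.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// newGenerateRequest builds the HTTP request for a one-shot
// (non-streaming) completion against a local Ollama server.
func newGenerateRequest(baseURL, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(generateRequest{Model: model, Prompt: prompt, Stream: false})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/api/generate", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := newGenerateRequest("http://localhost:11434", "llama3",
		"Summarize this earnings report in one line.")
	if err != nil {
		panic(err)
	}
	// To actually run it against a local Ollama daemon:
	//   resp, err := http.DefaultClient.Do(req)
	// then decode the "response" field from the JSON reply.
	fmt.Println(req.Method, req.URL)
}
```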
Try the demo: https://gomirofish.vercel.app

See the full source: https://github.com/go-mirofish/go-mirofish

What document will you test first? Tell me in the comments.