Most MCP servers I've looked at assume an LLM is somewhere in the pipeline, for entity extraction, embedding, or ranking. Some of them call an external API just to decide which node to fetch. I get why: it makes the server feel smarter.
I was trying to solve a different problem.
I spent a while trying to give Claude Desktop a reliable memory. Not conversation summaries. A way to ask whether a specific fact is in my data and get a binary answer back.
I tried semantic search, vector similarity, RAG. They find related content well. But "find me something similar to this claim" is not the same as "is this claim supported by what I ingested." I kept getting plausible results that weren't accurate. An 87% confidence score doesn't tell you whether Alice has a PhD.
So I stopped treating this as a retrieval problem. What I actually needed was grounding: checking a claim against stored facts, not finding content that resembles it.
What Kremis does
Kremis is a graph store I wrote in Rust. You feed it entity-attribute-value triples. It builds a deterministic graph. When you query it, you get back what's in the graph. Same input, same output, every time.
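The guarantee is easy to state in code. Here's a minimal sketch of a deterministic entity-attribute-value store; this assumes nothing about Kremis's real internals, it just illustrates the contract:

```rust
use std::collections::{BTreeMap, BTreeSet};

// Illustrative sketch only, not Kremis's actual data structures.
// BTree maps keep ordering stable, so identical inputs always
// produce identical outputs.
#[derive(Default)]
struct TripleStore {
    // (entity, attribute) -> set of values
    facts: BTreeMap<(String, String), BTreeSet<String>>,
}

impl TripleStore {
    fn ingest(&mut self, entity: &str, attribute: &str, value: &str) {
        self.facts
            .entry((entity.to_string(), attribute.to_string()))
            .or_default()
            .insert(value.to_string());
    }

    // Binary answer: is this exact fact in the graph?
    fn supported(&self, entity: &str, attribute: &str, value: &str) -> bool {
        self.facts
            .get(&(entity.to_string(), attribute.to_string()))
            .map_or(false, |vs| vs.contains(value))
    }
}
```

The point of the sketch is the return type: `supported` gives you a `bool`, not a similarity score. Either the triple is in the graph or it isn't.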
The MCP bridge (kremis-mcp) is a stdio process that proxies HTTP to a running Kremis server. It doesn't call any external API. It translates MCP tool calls into HTTP requests:
{
  "mcpServers": {
    "kremis": {
      "command": "/path/to/kremis-mcp",
      "env": {
        "KREMIS_URL": "http://localhost:8080"
      }
    }
  }
}
Nine tools: ingest, lookup, traverse, path, intersect, status, properties, retract, hash. Claude queries the graph directly instead of generating an answer from memory.
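Conceptually, the bridge is little more than a lookup from tool name to HTTP request. A hypothetical routing table might look like this; the real endpoints are defined in the repo, and these method/path pairs are made up for illustration:

```rust
// Hypothetical routing sketch: tool name -> (HTTP method, path).
// The actual kremis-mcp routes are in the repo; these are illustrative.
fn route(tool: &str) -> Option<(&'static str, &'static str)> {
    match tool {
        // writes
        "ingest" => Some(("POST", "/ingest")),
        "retract" => Some(("POST", "/retract")),
        // reads
        "lookup" => Some(("GET", "/lookup")),
        "traverse" => Some(("GET", "/traverse")),
        "path" => Some(("GET", "/path")),
        "intersect" => Some(("GET", "/intersect")),
        "properties" => Some(("GET", "/properties")),
        "status" => Some(("GET", "/status")),
        "hash" => Some(("GET", "/hash")),
        _ => None,
    }
}
```

Because the bridge is a pure translator, there's no place for a model to sit in the read path: a tool call either maps to a request against the graph or it fails.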
One binary for the server, one for the bridge. No Python runtime, no embedding model, no API key for the memory layer.
Tradeoff
Kremis doesn't extract entities automatically. You ingest structured triples yourself, or write something that does it. That's deliberate. Adding LLM-based extraction to the write path reintroduces the probabilistic layer you removed from the read path.
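"Write something that does it" can be very small if your data is already structured. A sketch that turns `entity,attribute,value` lines into triples ready to send to the server; the CSV-ish input format is my assumption here, not a Kremis requirement:

```rust
// Parse lines of "entity,attribute,value" into triples, skipping
// malformed lines. Input format is an assumption for this example,
// not something Kremis mandates.
fn parse_triples(input: &str) -> Vec<(String, String, String)> {
    input
        .lines()
        .filter_map(|line| {
            let mut parts = line.splitn(3, ',');
            Some((
                parts.next()?.trim().to_string(),
                parts.next()?.trim().to_string(),
                parts.next()?.trim().to_string(),
            ))
        })
        .collect()
}
```

The write path stays deterministic as long as the extractor is: the same file always yields the same triples, so the same graph.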
For automatic extraction from unstructured text, use a RAG system. Kremis is for when you know the structure of your data and want an auditable memory layer.
Try it
git clone https://github.com/TyKolt/kremis.git
cd kremis
cargo build --release
Docker:
docker build -t kremis .
docker run -i --rm -p 8080:8080 kremis
Repo: github.com/TyKolt/kremis
Docs: kremis.mintlify.app
Alpha (v0.18.1); expect breaking changes before v1.0.
Drafted with AI assistance, edited by hand. The technical claims are mine and verifiable in the repo.