How I built a Model Context Protocol server that lets AI agents search codebases semantically using Seroost
Why I Built This
I’ll be honest: I wasn’t a fan of “agentic coding” at first. The idea of letting AI agents run around my codebase sounded messy. That changed after I watched a NetworkChuck video on making MCP servers. It clicked — if agents could use my tools safely, I could actually make them useful.
That’s where Seroost comes in. I had already built it: a semantic code search engine in Rust that indexes and searches documents using TF-IDF. It’s fast, snippet-aware, and handles large directories well.
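If you haven’t run into TF-IDF before, here’s the idea in a few lines of TypeScript. This is just a sketch of the scoring formula, not Seroost’s actual Rust internals: a document ranks highly for a query term it mentions often while few other documents mention that term at all.

// Sketch of TF-IDF scoring -- not Seroost's Rust implementation, just the idea.
function tfIdfScore(
  queryTerms: string[],
  docTerms: string[],
  docFrequency: Map<string, number>, // how many documents contain each term
  totalDocs: number
): number {
  let score = 0;
  for (const term of queryTerms) {
    // Term frequency: how prominent the term is within this document.
    const tf = docTerms.filter((t) => t === term).length / docTerms.length;
    // Inverse document frequency: how rare the term is across the corpus.
    const df = docFrequency.get(term) ?? 0;
    const idf = df > 0 ? Math.log(totalDocs / df) : 0;
    score += tf * idf;
  }
  return score;
}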
The missing piece? Connecting it to AI agents. That’s where the Model Context Protocol (MCP) comes in.
The Problem with AI Agents Today
Working with AI assistants like Claude or GitHub Copilot on big projects often feels like this:
- You ask about “authentication logic”
- It says “show me your files”
- You copy-paste a dozen files
- It still misses context
Most AI agents either rely on keyword search (grep) or shallow semantic search. They don’t really “understand” the structure of your project.
Enter MCP + Seroost
MCP acts like a universal adapter for AI tools. Think of it as the USB port for AI: one standard protocol that lets any agent plug into external tools.
By turning Seroost into an MCP server, I gave AI agents the ability to search codebases semantically, return ranked results with snippets and line numbers, and work without constant manual uploads.
Building the MCP Server
I kept the setup lean, focusing on three tools:
- Set Index Path → Tell Seroost which directory to target
- Build Index → Run the indexing process
- Search → Query the indexed code with semantic/fuzzy matching
The server wraps Seroost’s CLI with Node’s child_process.spawn, parses the JSON results, and exposes them over MCP.
server.tool("seroost_search", "Search codebase semantically", {
query: z.string().describe("Search term or natural language description"),
}, async ({ query }) => {
const content = await runSearch(query);
return { content: [{ type: "text", text: JSON.stringify(content) }] };
});
Simple, but powerful.
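For a fuller picture, here’s a minimal sketch of how the rest of the server could fit together: the shared index path, the seroost_set_index and seroost_index tools, and the runSearch helper that shells out to the CLI. The imports are the standard MCP TypeScript SDK, but the Seroost subcommands and the --json flag are placeholders of mine, not necessarily the real CLI interface.

import { spawn } from "node:child_process";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "seroost-search", version: "0.1.0" });

// Directory Seroost should index; updated by seroost_set_index.
let indexPath = process.cwd();

// Run the seroost binary and collect its stdout. The subcommands and flags
// used below are placeholders -- adjust them to your Seroost build.
function runSeroost(args: string[]): Promise<string> {
  return new Promise((resolve, reject) => {
    const child = spawn("seroost", args);
    let stdout = "";
    let stderr = "";
    child.stdout.setEncoding("utf8");
    child.stderr.setEncoding("utf8");
    child.stdout.on("data", (chunk) => (stdout += chunk));
    child.stderr.on("data", (chunk) => (stderr += chunk));
    child.on("error", reject);
    child.on("close", (code) =>
      code === 0 ? resolve(stdout) : reject(new Error(stderr || `seroost exited with code ${code}`))
    );
  });
}

// Used by the seroost_search tool shown above.
async function runSearch(query: string): Promise<unknown> {
  const raw = await runSeroost(["search", query, "--json"]);
  return JSON.parse(raw);
}

server.tool(
  "seroost_set_index",
  "Set the directory Seroost should index",
  { path: z.string().describe("Absolute path to the project root") },
  async ({ path }) => {
    indexPath = path;
    return { content: [{ type: "text", text: `Index path set to ${path}` }] };
  }
);

server.tool("seroost_index", "Build the Seroost index for the configured directory", async () => {
  const output = await runSeroost(["index", indexPath]);
  return { content: [{ type: "text", text: output || "Index built." }] };
});

// Expose the tools over stdio so any MCP client can launch and talk to the server.
server.connect(new StdioServerTransport()).catch((err) => {
  console.error("Failed to start MCP server:", err);
});

Because the index path lives as state inside the Node process, an agent calls seroost_set_index first, then seroost_index, then searches, which is exactly the flow in the setup steps below.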
What Changes for AI Agents
Before
AI: Show me your authentication files.
Me: *uploads 15 files*
AI: Maybe this is it... but I might be missing context.
After
AI: Let me search your project for authentication logic.
→ seroost_search("user authentication functions")
Results:
- /src/auth/authenticate.js: function authenticateUser(credentials)
- /src/middleware/auth.js: const verifyAuthToken = (token) => { ... }
No more blind guessing. The agent can scan the entire project, find the right files, and jump directly to relevant lines.
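There’s no magic under the hood, either: MCP is JSON-RPC 2.0, so that seroost_search step is just the client sending a tools/call request to the server, roughly this shape (the id and arguments here are illustrative):

{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "seroost_search",
    "arguments": { "query": "user authentication functions" }
  }
}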
Why This Matters
- Speed: No manual uploads, instant discovery
- Scale: Works across thousands of files
- Semantics: Finds patterns (“error handling middleware”) not just keywords
- Composability: Works with any MCP-enabled agent, not just one platform
Setup
- Install Seroost (cargo build --release)
- Clone the MCP server and run npm install
- Add it to your AI client config:
{
  "mcpServers": {
    "seroost-search": {
      "command": "node",
      "args": ["/path/to/search-mcp/build/index.js"]
    }
  }
}
- Run seroost_set_index, then seroost_index, then search away.
What’s Next
This is just the start. MCP makes it possible to expose any tool to AI agents:
- Query databases
- Analyze logs
- Deploy cloud resources
- Run tests
I plan to keep expanding beyond code search — but for now, Seroost shows what’s possible when you give AI a real toolbelt.
Final Thoughts
Building this taught me a few things:
- MCP is still new, but incredibly promising
- Simple tools become powerful when standardized
- Semantic search makes AI agents much more usable
- I should’ve jumped on the “agentic coding” train earlier
You can try the full code here: semantic-search-mcp.
What tool would you plug into your AI agent if you could?
Tags: #ai #mcp #rustlang #tooling #typescript #copilot #vscode