This is a submission for the DEV Weekend Challenge: Community
The Community
This project was built to help the Muslim dev community build faster, more efficient AI-powered applications.
What I Built
While browsing, I found an open-source project that caught my eye: a Qur'an search engine written in TypeScript and published as an npm package. I decided to use it to solve a very common problem.
❌ Problem: when AI agents are asked religious questions, or questions about the Qur'an (the holy book of Islam), they either give out misinformation, which is especially problematic when the question concerns a person's religion and religious practices, or, in the better case, have to search the web, which spends more tokens and is slower. So I decided to take the existing engine and build an MCP adaptation layer on top of it.
✅ Result: when prompted, the AI agent uses the search tool to query an Arabic-optimized search engine locally, which yields more accurate answers, removes the need for online searches, and improves token usage and response speed.
Demo
Code
How I Built It
Building the Quran Search MCP was an exercise in transforming a high-performance TypeScript library into a standardized AI tool. Here is a concise breakdown of the development process:
Architectural Foundation:
I adopted a pnpm monorepo structure to decouple the core search logic from the MCP communication layer. This ensured that the quran-search-engine remained a pure dependency, while the mcp-server acted as the specialized interface for LLMs.

High-Performance Data Bootstrapping:
To eliminate search latency, I implemented a singleton data cache. Using Promise.all(), the server concurrently loads the Quranic text, morphological maps, and word indexes into memory during the initial handshake. This ensures sub-millisecond response times for the AI.

Protocol Integration:
Using the official MCP SDK, I mapped the engine's search capabilities to a structured toolset. I used Zod schemas to strictly define the search parameters (fuzzy, exact, prefix), ensuring the AI assistant provides valid, typed inputs that the engine can process without errors.

Dual-Purpose Distribution:
I configured the package for dual-mode usage:
CLI: optimized for immediate use via npx for AI desktop clients.
Library: used subpath exports in package.json so developers can import the engine or the server independently into their own TypeScript projects.

Validation & Stress Testing:
I bypassed the GUI and tested the integration using JSON-RPC piping. By feeding raw JSON strings directly into the server via the terminal, I verified protocol compliance and error handling before the first deployment.
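The monorepo layout from the first step could be declared like this (package paths are illustrative, not the project's actual layout):

```yaml
# pnpm-workspace.yaml — both packages live under packages/
packages:
  - "packages/quran-search-engine"
  - "packages/mcp-server"
```

With this in place, the mcp-server package can depend on the engine via `"quran-search-engine": "workspace:*"`.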
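Tool registration in the protocol-integration step might look roughly like this. The tool name, parameter names, and the `runSearch` helper are illustrative, and the sketch assumes `@modelcontextprotocol/sdk` and `zod` are installed:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stand-in for the real quran-search-engine call.
async function runSearch(query: string, mode: string): Promise<string[]> {
  return [];
}

const server = new McpServer({ name: "quran-search", version: "1.0.0" });

// The Zod shape constrains what the LLM may send: a non-empty query
// and one of the three supported match modes.
server.tool(
  "search",
  {
    query: z.string().min(1).describe("Arabic or transliterated search text"),
    mode: z.enum(["fuzzy", "exact", "prefix"]).default("fuzzy"),
  },
  async ({ query, mode }) => {
    const results = await runSearch(query, mode);
    return { content: [{ type: "text", text: JSON.stringify(results) }] };
  }
);

await server.connect(new StdioServerTransport());
```

Invalid inputs are rejected by the SDK before the handler runs, so the engine only ever sees typed, validated parameters.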
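The subpath exports mentioned under dual-purpose distribution could be wired like this (file paths and subpath names are illustrative):

```json
{
  "name": "quran-search-mcp",
  "type": "module",
  "bin": { "quran-search-mcp": "./dist/cli.js" },
  "exports": {
    ".": "./dist/index.js",
    "./engine": "./dist/engine/index.js",
    "./server": "./dist/server/index.js"
  }
}
```

A consumer can then `import { search } from "quran-search-mcp/engine"` without pulling in the MCP server code, while `npx quran-search-mcp` runs the CLI entry point.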
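The JSON-RPC piping described in the validation step can be reproduced from a terminal. The `initialize` request below follows the MCP stdio handshake; the server command is a placeholder for your actual build output:

```shell
# Pipe a raw initialize request into the server and inspect the JSON reply.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' \
  | node dist/server/index.js
```

A malformed request on the same pipe should come back as a JSON-RPC error object rather than a crash, which is exactly the error handling this step verifies.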
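The singleton data cache from the bootstrapping step can be sketched as follows. The loader functions here are stand-ins (the real engine reads these datasets from disk); the point is that all three loads start concurrently and later callers reuse the same promise:

```typescript
// Shape of the in-memory dataset (field names are illustrative).
type QuranData = {
  text: string[];
  morphology: Map<string, string>;
  index: Map<string, number[]>;
};

// Stand-ins for the real file loaders.
async function loadText(): Promise<string[]> {
  return ["بسم الله الرحمن الرحيم"];
}
async function loadMorphology(): Promise<Map<string, string>> {
  return new Map();
}
async function loadIndex(): Promise<Map<string, number[]>> {
  return new Map();
}

let cachePromise: Promise<QuranData> | null = null;

function getData(): Promise<QuranData> {
  // The first call kicks off all three loads concurrently; every later
  // call returns the same promise, so the data is only loaded once.
  if (!cachePromise) {
    cachePromise = Promise.all([loadText(), loadMorphology(), loadIndex()])
      .then(([text, morphology, index]) => ({ text, morphology, index }));
  }
  return cachePromise;
}
```

Because the cache stores the promise rather than the resolved value, concurrent first calls also share one load instead of racing.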