DEV Community

YedanYagami
9 Free MCP Servers for Claude, Cursor, and AI Agents -- Built on Cloudflare Workers

If you have used Claude Desktop, Cursor, or Windsurf in the last few months, you have probably seen MCP mentioned somewhere. It is quietly becoming the standard way AI assistants connect to external tools. But most MCP servers require you to run local processes, manage API keys, or spin up Docker containers.

I built 9 MCP servers that run entirely on Cloudflare Workers. No local setup. No API keys. No cost. They are deployed across 300+ edge locations and respond in under 50ms. This article covers what they do, how to install them, and why MCP matters for anyone building with AI.

What Is MCP and Why Should You Care?

The Model Context Protocol (MCP) is an open standard, originally developed by Anthropic and now under the Linux Foundation's AI & Data umbrella. It defines how AI applications discover and invoke external tools. Think of it as USB-C for AI: a single interface that lets any compatible client (Claude, Cursor, Windsurf, Cline, custom agents) connect to any compatible server without custom integration code.

Before MCP, every tool integration was bespoke. You wrote a function, registered it with your framework, handled serialization, and hoped the model understood the schema. MCP standardizes this into a JSON-RPC protocol with tool discovery, typed parameters, and structured responses. The result is that tool authors build once, and every MCP-compatible client can use it immediately. As of March 2026, the npm MCP SDK has crossed 97 million downloads, and the ecosystem includes thousands of community servers. If you are building AI tools and not supporting MCP, you are building for yesterday.
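To make the protocol concrete, here is a sketch of what a tool invocation looks like on the wire. The tool name (`convert_timestamp`) and its arguments are hypothetical examples, not taken from any real server; the message envelopes follow the JSON-RPC 2.0 shape MCP uses.

```javascript
// Sketch of the JSON-RPC 2.0 messages MCP uses for tool invocation.
// The tool name and arguments here are illustrative, not from a real server.
function buildToolsCallRequest(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const request = buildToolsCallRequest(1, "convert_timestamp", {
  timestamp: 1711584000,
  timezone: "Asia/Tokyo",
});

// A conforming server replies with a result whose `content` is a list of
// typed blocks (text, image, resource), echoing the request id.
const response = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "2024-03-28T09:00:00+09:00" }] },
};
```

Because every tool call is the same envelope with a different `params.name`, a client that can send one request like this can drive any MCP server it discovers.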

The 9 Servers

Every server below runs on the Cloudflare Workers free tier. No authentication required. Each one exposes MCP-compatible tool endpoints that any client can discover and invoke.

1. mcp-security-scanner

What it does: Scans GitHub repositories for security vulnerabilities, exposed secrets, and misconfigured dependencies.

Use case: Run it against your repo before a PR review or as part of your CI pipeline to catch hardcoded API keys, leaked .env files, or outdated packages with known CVEs.

Example prompt:

"Scan the repository yedanyagamiai/openclaw-mcp-servers for security issues and summarize the findings."


2. openclaw-intel-mcp

What it does: Gathers competitive intelligence on AI companies, tools, and market trends. Aggregates public data into structured summaries.

Use case: Market research for product positioning. Ask it to compare pricing, features, or adoption metrics across competing tools.

Example prompt:

"Give me a competitive analysis of MCP server hosting options comparing Smithery, Composio, and self-hosted Cloudflare Workers."


3. openclaw-fortune-mcp

What it does: Generates AI-themed fortune messages with configurable tone and context. A lightweight demonstration of MCP tool design.

Use case: Fun integrations, daily standup openers, or as a template for building your own MCP server.

Example prompt:

"Generate a fortune about the future of open-source AI agents."


4. prompt-enhancer-mcp

What it does: Takes a rough prompt and rewrites it with better structure, specificity, and guardrails. Applies prompt engineering best practices automatically.

Use case: Improve prompts before sending them to expensive models. Particularly useful when you know what you want but struggle to articulate it precisely.

Example prompt:

"Enhance this prompt: 'write me a python script that does web scraping'"


5. timestamp-converter-mcp

What it does: Converts timestamps between formats and timezones. Handles Unix epochs, ISO 8601, RFC 2822, and human-readable formats.

Use case: Debugging log files across distributed systems, converting between UTC and local time in incident reports, or formatting dates for APIs.

Example prompt:

"Convert 1711584000 to ISO 8601 in Asia/Tokyo timezone."
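For reference, the same conversion can be reproduced in plain JavaScript with the built-in `Intl` API. This is my own sketch of the logic, not the server's actual code:

```javascript
// Convert a Unix epoch (seconds) to an ISO-8601-style local timestamp
// in a given IANA timezone. Sketch only -- not the server's actual code.
function epochToIso(epochSeconds, timeZone) {
  const date = new Date(epochSeconds * 1000);
  // The en-CA locale yields YYYY-MM-DD ordering; hourCycle "h23" avoids
  // the "24:00" midnight edge case some hour cycles produce.
  const parts = new Intl.DateTimeFormat("en-CA", {
    timeZone,
    year: "numeric", month: "2-digit", day: "2-digit",
    hour: "2-digit", minute: "2-digit", second: "2-digit",
    hourCycle: "h23",
  }).formatToParts(date);
  const get = (type) => parts.find((p) => p.type === type).value;
  return `${get("year")}-${get("month")}-${get("day")}T${get("hour")}:${get("minute")}:${get("second")}`;
}

console.log(epochToIso(1711584000, "UTC"));        // 2024-03-28T00:00:00
console.log(epochToIso(1711584000, "Asia/Tokyo")); // 2024-03-28T09:00:00
```

The value of doing this through an MCP tool instead is that the model gets an exact, deterministic answer rather than computing timezone arithmetic itself.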


6. agentforge-compare-mcp

What it does: Compares AI models across dimensions like cost, speed, context window, and capability. Returns structured comparison tables.

Use case: Model selection for production deployments. Compare Claude vs GPT vs Gemini vs open-source options across the metrics that matter to your use case.

Example prompt:

"Compare Claude Opus 4, GPT-5, and Llama 3.3 70B for code generation tasks. Include pricing and context window."


7. moltbook-publisher-mcp

What it does: Publishes and formats content for distribution. Handles markdown rendering, metadata extraction, and content structuring.

Use case: Automate content pipelines. Draft an article in your AI assistant and publish it in a single workflow.

Example prompt:

"Format this markdown article for publication with proper metadata and SEO tags."


8. regex-engine-mcp

What it does: Tests regular expressions against input strings. Returns matches, capture groups, and explanations of what the pattern does.

Use case: Debugging complex regex patterns without leaving your AI conversation. Especially useful when writing validation rules or parsing log formats.

Example prompt:

"Test the regex (\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2}) against '2026-03-27T14:30:00' and explain each capture group."
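The equivalent check in plain JavaScript looks like this, which is roughly the work the example prompt delegates to the server:

```javascript
// Test an ISO-8601-style timestamp pattern and inspect its capture groups.
const pattern = /(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})/;
const match = "2026-03-27T14:30:00".match(pattern);

// match[0] is the full match; groups 1-6 capture
// year, month, day, hour, minute, and second respectively.
console.log(match[0]);       // "2026-03-27T14:30:00"
console.log(match.slice(1)); // ["2026", "03", "27", "14", "30", "00"]
```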


9. openclaw-kg-mcp

What it does: Queries a knowledge graph containing entities, relationships, and structured data. Supports entity lookup, relationship traversal, and pattern matching.

Use case: Research workflows that need structured knowledge. Ask about relationships between concepts, technologies, or organizations stored in the graph.

Example prompt:

"Find all entities related to 'Cloudflare Workers' in the knowledge graph and show their relationships."


Installation

Option A: Claude Code CLI

The fastest way to add any server to Claude Code:

npx @anthropic-ai/claude-code mcp add --transport sse mcp-security-scanner https://mcp-security-scanner.yedanyagamiai.workers.dev/sse

Replace mcp-security-scanner with any of the 9 server names above. The URL pattern is consistent:

https://{server-name}.yedanyagamiai.workers.dev/sse

Option B: Smithery

All 9 servers are published on Smithery. Search for yedanyagamiai to find them. Smithery handles the configuration automatically for Claude Desktop, Cursor, Windsurf, and other supported clients.

Option C: Manual Configuration

Add to your Claude Desktop claude_desktop_config.json or Cursor MCP settings:

{
  "mcpServers": {
    "mcp-security-scanner": {
      "transport": "sse",
      "url": "https://mcp-security-scanner.yedanyagamiai.workers.dev/sse"
    }
  }
}

Technical Details

Each server is a single Cloudflare Worker running on the free tier (100K requests/day, 10ms CPU per invocation). The MCP protocol runs over Server-Sent Events (SSE) for streaming responses. All servers are stateless -- no databases, no persistent storage, no user tracking.

The architecture is deliberately minimal. Each worker is a single JavaScript file under 200 lines. No frameworks, no build steps, no bundlers. This keeps cold start times under 5ms and makes the code easy to audit.

Source code is available at github.com/yedanyagamiai-cmd/openclaw-mcp-servers.

Why Edge-Deployed MCP Servers Matter

Most MCP servers today run locally or on a single cloud region. Running on Cloudflare's edge network means:

  1. Sub-50ms latency from anywhere in the world. The server runs in the data center closest to the user.
  2. No cold start penalty for Workers that receive regular traffic. Cloudflare keeps frequently used Workers warm.
  3. Zero ops burden. No servers to maintain, no SSL certificates to rotate, no scaling to configure.
  4. Free at scale. 100K requests per day per worker is generous for tool servers that handle sporadic AI assistant requests.

For MCP server authors, Cloudflare Workers is the easiest path from "I have a tool" to "anyone in the world can use it in their AI workflow."

What's Next

These 9 servers are part of a larger system called OpenClaw -- a multi-brain AGI cluster running across VMs, Cloudflare Workers, and local infrastructure. The MCP servers are the public-facing layer of that system.

If you want to build your own MCP server on Cloudflare Workers, the pattern is simple: handle the MCP JSON-RPC protocol, implement tools/list and tools/call, and deploy with wrangler deploy. The source code for all 9 servers is a good starting template.
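As a rough sketch of that pattern (not the actual source, and with a hypothetical `echo` tool), the JSON-RPC core of such a Worker might look like this. For brevity it answers plain JSON POSTs rather than the SSE transport the deployed servers use:

```javascript
// Minimal JSON-RPC handler sketch for an MCP-style Worker.
// The "echo" tool and its behavior are illustrative, not from the real servers.
const TOOLS = [{
  name: "echo",
  description: "Echo back the provided text.",
  inputSchema: {
    type: "object",
    properties: { text: { type: "string" } },
    required: ["text"],
  },
}];

function handleRpc(req) {
  if (req.method === "tools/list") {
    // Tool discovery: advertise the tools and their typed parameters.
    return { jsonrpc: "2.0", id: req.id, result: { tools: TOOLS } };
  }
  if (req.method === "tools/call" && req.params.name === "echo") {
    // Tool invocation: return a structured content block.
    return {
      jsonrpc: "2.0",
      id: req.id,
      result: { content: [{ type: "text", text: req.params.arguments.text }] },
    };
  }
  return {
    jsonrpc: "2.0",
    id: req.id,
    error: { code: -32601, message: "Method not found" },
  };
}

// A Workers entry point would wire handleRpc into fetch, roughly:
// export default {
//   async fetch(request) {
//     const rpc = await request.json();
//     return Response.json(handleRpc(rpc));
//   },
// };
```

Everything else -- schemas, more tools, SSE streaming -- layers on top of these two methods.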


Support the project:

Built by Yedan Yagami. All servers are free to use with no rate limiting beyond Cloudflare's default quotas.
