GitHub: https://github.com/YOUR_USERNAME/chatgpt-custom-mcp-for-local-files
Stop uploading files to ChatGPT. This MCP server lets ChatGPT read files directly from your machine via Cloudflare Tunnel.
What It Does
- ChatGPT can list, search, and read files from a folder on your computer
- Files stay local - fetched on-demand, not uploaded
- Always current - no stale copies
- Complete file access for complete context, not RAG chunks
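Under the hood this boils down to three file tools. Here is a minimal sketch of what they could look like as plain FastAPI routes; the actual server wraps them in the MCP protocol and OAuth, and the folder path and route names below are placeholders of mine, not the repo's exact schema:

```python
# Minimal sketch of the three tools (list, search, read) as plain FastAPI routes.
# ROOT and the route names are placeholders; the real server speaks MCP + OAuth.
from pathlib import Path

from fastapi import FastAPI, HTTPException

app = FastAPI()
ROOT = Path("~/shared-with-chatgpt").expanduser().resolve()  # folder exposed to ChatGPT


def resolve_safe(path: str) -> Path:
    """Resolve a relative path and refuse anything outside ROOT."""
    p = (ROOT / path).resolve()
    try:
        p.relative_to(ROOT)
    except ValueError:
        raise HTTPException(status_code=400, detail="path escapes shared folder")
    return p


@app.get("/files")
def list_files(pattern: str = "**/*"):
    """List files under ROOT matching a glob pattern."""
    return sorted(str(p.relative_to(ROOT)) for p in ROOT.glob(pattern) if p.is_file())


@app.get("/search")
def search_files(q: str):
    """Return files whose text contains the query (case-insensitive)."""
    hits = []
    for p in ROOT.rglob("*"):
        if p.is_file():
            try:
                if q.lower() in p.read_text(errors="replace").lower():
                    hits.append(str(p.relative_to(ROOT)))
            except OSError:
                continue
    return hits


@app.get("/read/{path:path}")
def read_file(path: str):
    """Return the full contents of one file - whole files, not chunks."""
    p = resolve_safe(path)
    if not p.is_file():
        raise HTTPException(status_code=404, detail="not found")
    return {"path": path, "content": p.read_text(errors="replace")}
```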
How It's Different
vs Manual Upload:
- Files fetched on-demand, not stored in ChatGPT Projects
- Automatic updates when files change (e.g. fswatch mirroring the working folder into the served folder)
- No size limits or re-uploads
vs RAG/Vector Search:
- Complete files, not chunks
- Direct file system access
- Slightly longer search times, but context is still king
- Simple tool-call tracking: you can see exactly which files were read (see the sketch after this list)
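That tracking amounts to one log line per request. A minimal sketch, assuming the FastAPI `app` from the earlier sketch; the logger name and format are my own choices:

```python
# One log line per tool call, so you can see exactly which files ChatGPT
# listed, searched, or read. Assumes the `app` from the earlier sketch;
# the logger name and format are arbitrary.
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("mcp.tools")


@app.middleware("http")
async def log_tool_calls(request, call_next):
    start = time.monotonic()
    response = await call_next(request)
    elapsed_ms = (time.monotonic() - start) * 1000
    log.info("%s %s -> %d (%.0f ms)", request.method, request.url.path,
             response.status_code, elapsed_ms)
    return response
```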
Why I Built This
Tired of:
- Re-uploading files every time they change
- Copy-pasting code snippets constantly
- ChatGPT working with outdated versions
- ChatGPT lacking full-file context
Now ChatGPT queries my local folder directly. When code updates, ChatGPT sees it immediately.
Tech Stack
- Python + FastAPI (MCP server)
- OAuth 2.0 with dynamic client registration (sketched below)
- Cloudflare Tunnel (free tier)
- systemd for background services
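Dynamic client registration is what lets ChatGPT register itself as an OAuth client before it ever reaches the authorize/token flow. A minimal sketch of that one endpoint, loosely following RFC 7591; the in-memory store and the /register path are assumptions, and a real deployment also needs /authorize and /token endpoints plus persistent storage:

```python
# Sketch of an OAuth 2.0 dynamic client registration endpoint (RFC 7591-style).
# ChatGPT POSTs its metadata here and gets back a client_id/client_secret it
# then uses in the normal authorize/token flow. In-memory storage and the
# /register path are assumptions, not necessarily what the repo does.
import secrets
import time

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
CLIENTS = {}  # client_id -> registered metadata (use real storage in practice)


class RegistrationRequest(BaseModel):
    client_name: str = "unknown"
    redirect_uris: list = []
    token_endpoint_auth_method: str = "client_secret_basic"


@app.post("/register", status_code=201)
def register_client(req: RegistrationRequest):
    client_id = secrets.token_urlsafe(16)
    client_secret = secrets.token_urlsafe(32)
    CLIENTS[client_id] = {
        "client_secret": client_secret,
        "client_name": req.client_name,
        "redirect_uris": req.redirect_uris,
    }
    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "client_id_issued_at": int(time.time()),
        "redirect_uris": req.redirect_uris,
        "token_endpoint_auth_method": req.token_endpoint_auth_method,
    }
```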
Setup Time
~30 minutes. You need:
- Python 3.8+
- A domain (managed by Cloudflare)
- ChatGPT Plus/Pro
Full docs in the repo: installation, troubleshooting, security guidelines.
Example Usage
"List all Python files in my project"
"Read auth.py and check for security issues"
"Search for files containing 'database connection'"
ChatGPT explores your codebase like a developer would.
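Behind a prompt like the second one, the model chains tools on its own. A hypothetical trace, with tool names and arguments that are illustrative only:

```python
# Hypothetical tool-call trace for "Read auth.py and check for security issues".
# Tool names and arguments are illustrative, not the server's exact schema.
trace = [
    {"tool": "search_files", "arguments": {"q": "auth.py"}},
    {"tool": "read_file", "arguments": {"path": "src/auth.py"}},
]
```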
Note on Data
File contents are sent to OpenAI for processing (same as manual upload). Difference: files are fetched on-demand, not pre-uploaded or stored in Projects.
Open source (MIT). No support provided - it's a side project, but docs are comprehensive.
Tags: #chatgpt #mcp #python #ai #devtools