
It's Meee!

I got tired of 2GB Docker images for simple AI chats, so I built a single-file PHP interface.

Hi everyone,
I’ve been getting into local AI (Ollama) recently, but I was frustrated that every suggested UI seems to require a complex stack (Node.js, React, Docker, 50 containers). I just wanted something I could drop onto my existing cheap shared hosting or a Raspberry Pi without setting up a build pipeline.
So I built Single-File PHP AI.
It’s exactly what it sounds like: One index.php file.
No Database: Saves chat history to your browser's LocalStorage.
No Build: Just git clone or upload the file.
Streaming: Uses Server-Sent Events (SSE) so the text types out in real-time (no buffering); there's a rough sketch of how that works below this list.
Backend: Supports both Ollama (Local) and OpenAI (API).
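For anyone curious how streaming works without a Node/React layer: the PHP file can proxy the model's output to the browser as Server-Sent Events while it arrives. The snippet below is a minimal sketch of that idea against a local Ollama instance, not the actual code from the repo; the model name, query parameter, and chunk handling are simplified placeholders.

```php
<?php
// Hypothetical sketch: stream an Ollama chat reply to the browser as SSE.
// Simplified; real code should buffer partial NDJSON lines and handle errors.

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('X-Accel-Buffering: no'); // stop nginx/proxies from buffering the stream

$payload = json_encode([
    'model'    => 'llama3',                                     // placeholder model
    'stream'   => true,
    'messages' => [
        ['role' => 'user', 'content' => $_GET['q'] ?? 'Hello'], // placeholder input
    ],
]);

$ch = curl_init('http://localhost:11434/api/chat'); // default local Ollama endpoint
curl_setopt_array($ch, [
    CURLOPT_POST          => true,
    CURLOPT_POSTFIELDS    => $payload,
    CURLOPT_HTTPHEADER    => ['Content-Type: application/json'],
    // Forward each chunk to the client as it arrives instead of waiting for the end.
    CURLOPT_WRITEFUNCTION => function ($ch, $chunk) {
        // Ollama streams newline-delimited JSON; relay each line as one SSE event.
        foreach (array_filter(explode("\n", $chunk)) as $line) {
            echo "data: {$line}\n\n";
        }
        @ob_flush();
        flush();
        return strlen($chunk);
    },
]);
curl_exec($ch);
curl_close($ch);
```

On the browser side, the page just appends the streamed text to the chat window and keeps the transcript in localStorage, which is how the no-database setup works.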
I know PHP isn't the "coolest" language right now, but for "drop-in and works forever" utility, it's hard to beat.
It’s MIT licensed. I’d love to hear if this is useful to anyone else who prefers lightweight setups.

Repo here: https://github.com/mariorazo97/single-file-php-ai
