This was originally meant to be a submission for the DEV Weekend Challenge: Community. It's my first DEV post and the first coding challenge I ever tried to enter, and I missed the deadline by 30 minutes because I thought it was EST lol. I learned the hard way to pay attention to deadline time zones. I gotta laugh at my own mistakes.
The Community
This project is built for hobbyists, students, and anyone on a tight budget who wants to experiment with AI chatbots. It’s beginner-friendly and designed for people who want to test different AI models for free, explore prompts, tweak parameters, and track token usage—all without setting up complex frameworks.
What I Built
I built a local-first AI chat UI that works with OpenRouter and Venice AI. Users can:
- Add, edit, and toggle system prompts.
- Keep conversation history stored locally.
- Track tokens and context window usage.
- Switch themes (light/dark) with a sliding drawer UI.
- Adjust generation parameters like temperature and max tokens.
The interface is intentionally simple—no frontend frameworks, no build steps, just a single HTML file for the frontend and a FastAPI backend (main.py) for handling API requests.
Demo
Here’s a link to the repo and live demo instructions:
- GitHub Repo: Local AI Chat Studio
- Run locally by following the Installation & Running the Code instructions in the README.md.
Note: You’ll need an OpenRouter API key (Sign up here) and/or a Venice AI API key (Sign up here) to use the chatbot.
Code
You can view, fork, or `git clone` the project directly from GitHub. Here are the commands I used to clone and run it when I tested it; the code ran without any errors.

Clone the repo:

```
git clone https://github.com/emmyacuff9-sys/Local-AI-Chat-Studio.git
```

After cloning, install the dependencies from `requirements.txt` (for example, with `pip install -r requirements.txt`), then run the backend:

```
python3 main.py
```
Local AI Chat Studio Repository
The repo includes:
- `index.html` – Chat UI, prompts, token tracking, theme toggle
- `main.py` – FastAPI backend to proxy API requests and handle chat
- `requirements.txt` – Dependencies for running the backend
How I Built It
- Frontend: Pure HTML, CSS (with variables), minimal JavaScript (all in one file for simplicity)
- Backend: FastAPI for routing, CORS handling, and proxying API calls to OpenRouter and Venice AI
- APIs: OpenRouter and Venice AI for AI completions
- Extras: LocalStorage for saving chat history, system prompts, and API keys
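The UI does its token tracking in JavaScript, but the idea is easy to sketch in Python. This is a rough illustration assuming the common ~4-characters-per-token heuristic (real counts come from the provider's `usage` field in each response), with a made-up context window size:

```python
# Rough sketch of token / context-window tracking.
# CONTEXT_WINDOW is a hypothetical model limit, not from the project.
CONTEXT_WINDOW = 8192

def estimate_tokens(text: str) -> int:
    # Common heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def context_usage(messages: list[dict], window: int = CONTEXT_WINDOW) -> dict:
    # Sum estimated tokens across the whole conversation history.
    used = sum(estimate_tokens(m["content"]) for m in messages)
    return {
        "used": used,
        "remaining": max(0, window - used),
        "percent": round(100 * used / window, 1),
    }
```

A tracker like this is what lets the UI warn you before a long conversation silently falls out of the model's context window.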
I used AI (Gemini) to help write and debug this project's code, which let me focus on learning prompt engineering and improving at AI-assisted development.
Why This Project
I wanted a lightweight, beginner-friendly way to experiment with AI chatbots without getting overwhelmed by thousands of lines of code. Beginners can run it locally, tweak parameters, add prompts, and learn about token usage and multi-provider AI without spending money. Advanced coders can extend the project easily.