🚀 The Problem
If you’ve used AI tools like ChatGPT, you’ve probably noticed something frustrating:
Sometimes you get amazing results…
and sometimes the output is completely off.
I faced the same issue. At first, I thought it was the AI.
But after experimenting more, I realized something important:
👉 The quality of output depends heavily on how you write your prompts.
💡 The Idea
Writing good prompts consistently is hard.
So I thought:
What if I could build a tool that improves prompts automatically?
That’s how Promet was born.
🛠️ What is Promet?
Promet is an open-source AI prompt optimizer.
You give it a rough idea, and it transforms it into a clear, structured, and effective prompt.
⚙️ Key Features
- Multiple modes (Quick, Balanced, Auto, Expert)
- Domain-specific tuning (tech, marketing, writing, etc.)
- Real-time streaming responses
- Iterative prompt refinement (conversation-style)
- Prompt history and organization
🧠 How It Works
Promet uses an LLM (Llama 3, served via the Groq API) to analyze and refine your input.
It:
- Detects prompt complexity
- Enhances structure
- Adds missing context
- Improves clarity
All responses are streamed in real time for a smooth experience.
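To make the first step concrete, here's a minimal sketch of what a complexity check could look like. This is a hypothetical heuristic in JavaScript, not Promet's actual logic (which is LLM-driven); the function name `detectComplexity` and the thresholds are my own illustration.

```javascript
// Hypothetical heuristic: classify a rough prompt as simple, moderate, or complex.
// Promet's real analysis is LLM-driven; this only illustrates the idea.
function detectComplexity(prompt) {
  const words = prompt.trim().split(/\s+/).filter(Boolean);
  // Constraint keywords suggest the user wants structure, which raises complexity.
  const hasConstraints = /\b(must|should|exactly|format|step[- ]by[- ]step)\b/i.test(prompt);
  if (words.length < 8 && !hasConstraints) return "simple";
  if (words.length < 40) return "moderate";
  return "complex";
}
```

A "simple" result might map to Quick mode, while "complex" prompts would get the fuller Expert treatment.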
🧰 Tech Stack
- Frontend: React + Vite
- Backend: Node.js + Express
- Database: MongoDB
- AI Engine: Groq API (Llama 3 models)
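The real-time streaming piece of this stack can be sketched with server-sent events (SSE), a common pattern for token-by-token output. This is an assumption about the wire format, not Promet's actual route code; `sseFrame` and `streamTokens` are hypothetical names, and in the real app each token would come from the Groq completion stream rather than an array.

```javascript
// One SSE event: a "data:" line followed by a blank line.
function sseFrame(token) {
  return `data: ${JSON.stringify({ token })}\n\n`;
}

// Hypothetical Express-style handler body: flush tokens to the client as they arrive.
// In Promet, `tokens` would be chunks from the Groq completion stream.
function streamTokens(res, tokens) {
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
  });
  for (const t of tokens) res.write(sseFrame(t));
  res.end("data: [DONE]\n\n");
}
```

On the React side, the frontend would read these events and append each `token` to the displayed output, which is what makes the response feel live.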
🔗 Try It Out
🌐 Live: https://promet.indevs.in/
💻 GitHub: https://github.com/Tripadh/Promet
🙌 Feedback
I’m still improving this project, so I’d love your feedback:
- Do you think prompt optimization tools are useful?
- What features would you like to see next?
⭐ Final Thoughts
AI is powerful, but prompt quality is often overlooked.
Promet is my attempt to make prompt engineering easier and more accessible.
If you find it useful, feel free to star the repo or contribute!