I didn’t go into the MeDo hackathon with some big, polished idea. I just wanted to build something I’d actually use.
So I made Exam AI.
The problem is simple: studying for exams is chaotic. You read notes, search things, forget half of it, and then try to cram everything at the end. I wanted something that helps you actively think, not just passively read.
How it actually works ⚙️
You give Exam AI a topic — anything you’re studying.
From there, it:
- generates exam-style questions 🧠
- lets you try to answer them yourself ✍️
- then gives you explanations, not just the “correct answer” 💡
So instead of rereading notes, you’re constantly testing yourself and filling in gaps as you go.
The useful part is that it adapts to how you interact with it. If something isn’t clear, you can push further, ask again, or go deeper into a specific concept. It’s less like a static quiz and more like a back-and-forth.
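The loop above can be sketched in a few lines. This is my own minimal illustration, not the actual implementation: the `Question` shape, `generate_questions`, and `quiz` are names I made up, and `generate_questions` stubs out where the real tool would presumably call an LLM.

```python
from dataclasses import dataclass

@dataclass
class Question:
    prompt: str
    answer: str
    explanation: str

def generate_questions(topic: str) -> list[Question]:
    # Placeholder: the real tool would generate exam-style
    # questions for the topic here (e.g. via an LLM call).
    return [
        Question(
            prompt=f"Explain the key idea behind {topic}.",
            answer="...",
            explanation="...",
        )
    ]

def quiz(questions: list[Question], get_answer) -> list[dict]:
    """One pass of the loop: ask, collect the attempt, return feedback."""
    results = []
    for q in questions:
        attempt = get_answer(q.prompt)  # the user answers first
        results.append({
            "question": q.prompt,
            "your_answer": attempt,
            "explanation": q.explanation,  # explanation, not just the answer
        })
    return results
```

The point of the structure is the ordering: you attempt the question *before* seeing the explanation, which is what makes it active recall rather than rereading.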
What surprised me 😄
The biggest surprise wasn’t even the idea — it was how quickly it came together. I didn’t get stuck in setup or overengineering. I could just iterate and focus on whether it was actually helpful.
That said, it wasn’t effortless:
- the quality depends a lot on how you phrase things 🤔
- it’s easy to overbuild instead of keeping it simple
- making explanations actually useful is harder than it sounds
I had to keep asking: would this help someone who’s stressed before an exam?
What’s next 🔮
If I keep working on it, I’d make it more personal — adapting to what you’re weak at instead of treating every topic the same.
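One simple way to do that kind of personalization would be to track miss rates per concept and weight future questions toward the weakest ones. A hypothetical sketch (the `WeaknessTracker` class is my own illustration, not something in Exam AI today):

```python
from collections import defaultdict

class WeaknessTracker:
    """Track misses per concept so future sessions can focus on weak areas."""

    def __init__(self):
        self.misses = defaultdict(int)
        self.attempts = defaultdict(int)

    def record(self, concept: str, correct: bool) -> None:
        # Log one attempt; count it as a miss if the answer was wrong.
        self.attempts[concept] += 1
        if not correct:
            self.misses[concept] += 1

    def weakest(self, n: int = 3) -> list[str]:
        # Rank concepts by miss rate, highest first.
        rates = {c: self.misses[c] / self.attempts[c] for c in self.attempts}
        return sorted(rates, key=rates.get, reverse=True)[:n]
```

Feeding `weakest()` back into question generation would let the tool drill what you actually get wrong instead of treating every topic the same.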
Right now it’s simple, but that’s kind of the point.
It’s a small tool, but it solves a real problem: turning studying into something active instead of passive.