This is a submission for the GitHub Copilot CLI Challenge
Echoflow - Active Recall Powered by AI
Taking notes is good. Being tested on them intelligently is even better.
A Personal Introduction (first post)
Hi! I'm Adam, a young self-taught student who spent months deliberately avoiding AI coding tools.
Why? Because I wanted to "earn" my skills first: to understand what I was building without shortcuts, to struggle through bugs, and to truly grasp the fundamentals. I was convinced that using AI would somehow make me... less of a developer.
But then this GitHub community challenge came along, and I thought:
"Maybe I just needed an excuse."
So here we are. Echoflow is my first real project built almost entirely with GitHub Copilot CLI, and honestly? It's been transformative. Not because it wrote perfect code, but because it let me stay in the flow: thinking, iterating, and building without constantly context-switching.
What is Echoflow?
Echoflow is an AI-powered Active Recall tool designed to solve a problem I face constantly:
I write tons of notes... and I read them. That's it.
The idea is simple: instead of passively rereading notes, Echoflow uses AI to quiz you on what you've written. It's like having a study buddy who remembers everything you learned and asks you the right questions at the right time.
Core Features
- 📝 Markdown-powered note-taking (because plain text is so 2010)
- 🤖 AI-generated questions based on your notes (powered by OpenRouter)
- 💬 Conversational quizzing - chat with the AI to deepen understanding
- 📊 Progress tracking - the AI analyzes your strengths and weaknesses (not perfectly implemented yet)
- 🎨 Dark mode UI - because your eyes deserve better
- 🔒 Privacy-first - your notes stay yours (Supabase RLS)
Here's a demo
The app is pretty intuitive.
After authenticating, you'll find some pre-filled categories:

Inside each category, you'll find some pre-filled notes (you can access them by clicking on a category):

If you're in a hurry (or feel comfortable with your notes), you can quiz yourself directly by clicking the "Quiz" button. A chat modal will appear with a relevant question about your note:

The app is now more feature-rich than ever! It includes full CRUD operations for both categories and notes, along with powerful tools like a multi-select mode for targeted LLM quizzing. A contextual feedback loop ensures the LLM references its previous conclusions, maintaining consistency across conversations. I invite you to explore the project yourself; you can find a deep dive into the mechanics and capabilities on the "Learn More" and "Features" pages (available before authentication).
Try It Out (password for the demo account: N7!qA3@Zp#L9mE$R)
You can try it out here without signing up. Just click "Demo Account" with the provided password and start creating notes and quizzing yourself (using the single or multiple-selection feature).
Why Active Recall?
Active recall is one of the most effective learning techniques backed by cognitive science. Instead of rereading notes (which feels productive but isn't), you actively retrieve information from memory. Echoflow automates this process with AI.
The Process: From Idea to Deployment
The BMade Method
I started with the BMade methodology, which breaks the development process down into three phases:
- Brainstorm - Define the problem, target audience, and core features.
- Map - Design the architecture, tech stack, and data flow.
- Make - Implement the features iteratively, testing and refining as you go.
GitHub Copilot CLI strictly followed this plan: no feature creep, no random refactors. Just the PRD, executed step by step.
The Tech Stack
```
┌──────────────────────────────────────┐
│ 📥 Data Acquisition (Input)          │
│ ├─ Next.js Web Application           │
│ └─ Markdown/Rich-text Editor         │
└──────────────────────────────────────┘
                   │
                   ▼
┌──────────────────────────────────────┐
│ 🔐 Auth & Persistence                │
│ ├─ Supabase (PostgreSQL + Auth)      │
│ ├─ Row Level Security (RLS)          │
│ └─ Vector Storage                    │
└──────────────────────────────────────┘
                   │
                   ▼
┌──────────────────────────────────────┐
│ 🤖 AI Logic (Echoflow)               │
│ ├─ OpenRouter API Gateway            │
│ ├─ Prompt Engineering (Context)      │
│ └─ Streaming Responses (SSE)         │
└──────────────────────────────────────┘
                   │
                   ▼
┌──────────────────────────────────────┐
│ 📤 Active Recall (Output)            │
│ ├─ Dynamic Quiz Interface            │
│ ├─ Smart Search Tags (JSON)          │
│ └─ Token/Energy Quota Tracking       │ <- I haven't implemented this yet.
└──────────────────────────────────────┘
                   │
                   ▼
┌──────────────────────────────────────┐
│ ☁️ Infrastructure                    │
│ ├─ Vercel (Edge Functions)           │
│ └─ Monorepo (Next.js + Scripts)      │
└──────────────────────────────────────┘
```
The "Orchestrator" Strategy: Controlling the AI Flow
To ensure Echoflow remained consistent, scalable, and professional, I didn't just "chat" with the AI; I engineered its environment.
I implemented a strict instruction framework via the .github/copilot-instructions.md file. This file acts as the project's source of truth and enforces the following:
- Strict Tech Stack Enforcement: ensures the project follows the chosen stack (Next.js 15 App Router, Tailwind CSS 4.0, and pnpm).
- Architectural Integrity: defaults to Server Components where appropriate and enforces Zod validation for API routes.
- The BMade Workflow: integrates the BMade methodology into the AI prompts to preserve a "think-before-code" approach.
- Commit Discipline: ensures each change is well-documented and traceable.
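To give you an idea of the shape of such a file, here's a hypothetical sketch (not the actual Echoflow instructions file) of what these rules can look like:

```markdown
<!-- .github/copilot-instructions.md — hypothetical sketch -->
# Echoflow Project Instructions

## Tech Stack (do not deviate)
- Next.js 15 (App Router only), Tailwind CSS 4.0, pnpm

## Architecture
- Default to Server Components; add "use client" only when interactivity requires it.
- Every API route must validate its input with Zod before touching the database.

## Workflow
- Follow the BMade phases: Brainstorm → Map → Make. No feature creep.

## Commits
- One logical change per commit, with a descriptive message.
```

The point is less the exact wording than that the rules exist before any code does: the CLI reads this file on every session, so it never "forgets" the stack.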
MCP: Context7 Integration for Real-Time Documentation
I've also set up the Context7 MCP (Model Context Protocol) with the /mcp command. Context7 injects real-time, version-specific documentation into the AI's context whenever I mention a library or framework. This provides several benefits:
- Real-time documentation access: the AI can fetch official docs for 60,000+ libraries instantly.
- Reduced hallucinations: Context7 supplies the exact API syntax for the target version, avoiding outdated or incorrect guidance.
- Version-accurate examples: it returns code examples that match the versions I'm using (e.g., Next.js 14+), preventing deprecated patterns.
- Smoother workflow: no manual copy/paste. The AI retrieves the relevant documentation automatically when prompted (for example with `use context7`).
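For reference, Context7's standard MCP server registration looks roughly like this (the exact config file name and location depend on your Copilot CLI version, so treat this as a sketch rather than a recipe):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```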
By defining the rules of the game before the first line of code was written, I turned the AI from a simple assistant into a specialized senior engineer tailored for Echoflow.
SKILL.md: A Living Document of AI Feedback Analysis
I recently learned about the concept of a SKILL.md file from an insightful thread. The idea is to have a dedicated file where I analyze the AI's feedback on my code, identify patterns in its suggestions, and reflect on how to improve my prompts and code structure.
I've always wanted to try something like this, so I installed some relevant skills:

A Humble Disclaimer
I'm a learner. This is my first real full-stack project, and I'm sure there are "creative architectural choices" in here that would make senior devs cringe 😅
Some things I'm still figuring out:
- Could my database schema be better normalized? Maybe.
- Could my analysis of AI feedback be more sophisticated? Definitely.
- Are there edge cases I haven't considered? Absolutely.
- Is security airtight? Probably not (but I did implement RLS and follow best practices 🔒).
I probably made some "questionable" decisions in the codebase, but for a first try, I'm pretty proud of the result and the process I went through. The goal was to learn, experiment, and create something useful, not to produce a perfect codebase.
If you find bugs or have suggestions, I'm all ears: teaching me is teaching the next generation of devs (but teach me first, please).
The Power of "Vibe-Coding"
Here's what I learned: GitHub Copilot CLI enables 100% "vibe-coding" when used correctly.
What does that mean?
Instead of:
- Thinking of a feature
- Opening an editor
- Writing code
- Switching to terminal to test
- Switching back to fix bugs
- Repeat 47 times
I could just:
- Think of a feature
- Describe it to Copilot CLI
- Watch it implement, test, and iterate
- Stay in the flow
To be honest, I did fix a few bugs manually. Sometimes hopping into the code editor, often with a quick assist from Copilot Chat, was faster than explaining the context to the CLI. So while I did switch contexts a few times, the time lost was negligible compared to the massive speed boost the CLI provided.
The "Flow State" Advantage
When you stay in the terminal:
- No context switching between IDE and browser
- No manual file navigation
- No "where was I?" moments
- Just pure creation mode
It's like pair programming with someone who never gets tired, never judges your questions, and always has the docs ready (but you still have to understand the docs yourself).
By the Numbers (a GitHub Copilot CLI analysis of DEVELOPMENT_LOG.md)
- Development Time: ~12 hours (spread over 5 days, plus one last day to finalize)
- Lines of Code: ~5,000 (TypeScript + SQL)
- Copilot CLI Features Used: 30+
- Coffee Consumed: ☕☕☕ ... (too many)
- Stack Overflow Visits: 2 (down from ~50 per project)
- "Aha!" Moments: Countless
What I Built
| Feature | Status | Notes |
|---|---|---|
| 🔐 Authentication (Google OAuth) | ✅ | Supabase Auth with RLS |
| 📝 Note Management (CRUD) | ✅ | Markdown support + syntax highlighting |
| 🗂️ Category Organization | ✅ | Color-coded, filterable |
| 🤖 Single-Note Quizzing | ✅ | AI asks questions, you answer |
| 🎯 Multi-Note Quizzing | ✅ | AI quizzes across multiple notes |
| 💬 Conversational AI | ✅ | Chat-based learning |
| 📊 Progress Tracking | ✅ | AI analyzes strengths/weaknesses |
| 🎨 Dark Mode UI | ✅ | Glassmorphism design |
| ⚡ Streaming Responses | ✅ | Progressive AI rendering |
| 🎭 Demo Account | ✅ | Try without signing up |
Technical Highlights
1. AI Model Rotation System
```typescript
// For freemium accounts (default): automatically tries free models until one succeeds
const FREE_MODELS = [
  "meta-llama/llama-3.2-3b-instruct:free",
  "z-ai/glm-4.5-air:free",
  "stepfun/step-3.5-flash:free",
  "meta-llama/llama-3.3-70b-instruct:free",
  "qwen/qwen-3-235b-a22b:free",
  "mistralai/mistral-small-3.1-24b:free",
  "google/gemma-3-4b-instruct:free",
];

// For premium accounts (including the demo account)
const PREMIUM_MODELS = [
  "openai/gpt-4o-mini:paid", // generally this one is used
  "mistralai/mistral-7b-instruct:paid",
];
```
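The rotation itself can be sketched as a simple fallback loop. This is a hypothetical illustration of the idea, not the actual Echoflow code: `askWithFallback` and `ModelCaller` are names I made up for the example.

```typescript
// Try each model in order; the first one that responds wins.
type ModelCaller = (model: string) => Promise<string>;

async function askWithFallback(
  models: string[],
  call: ModelCaller
): Promise<{ model: string; answer: string }> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return { model, answer: await call(model) };
    } catch (err) {
      // Rate limit or outage: remember the error and fall through to the next model
      lastError = err;
    }
  }
  throw new Error(`All models failed: ${String(lastError)}`);
}
```

With this shape, freemium users just get `askWithFallback(FREE_MODELS, call)` while premium users get the paid list first.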
2. Streaming SSE Implementation
```typescript
// Progressive AI response rendering
// (`reader` is the upstream body reader: response.body.getReader())
const stream = new ReadableStream({
  start(controller) {
    reader.read().then(function push({ done, value }) {
      if (done) return controller.close();
      controller.enqueue(value);
      return reader.read().then(push);
    });
  },
});
```
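On the client side, the chunks then have to be split back into SSE payloads. Here's a hypothetical helper (again, not the actual Echoflow code; the `[DONE]` sentinel is the OpenAI/OpenRouter-style end-of-stream marker):

```typescript
// Extract the data payloads from a raw Server-Sent Events text chunk.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((payload) => payload !== "[DONE]"); // drop the end-of-stream marker
}
```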
3. Row Level Security (RLS)
```sql
-- Users can only see their own notes
CREATE POLICY "Users can view own notes"
  ON notes FOR SELECT
  USING (auth.uid() = user_id);
```
4. JSONB Feedback Storage
```typescript
// Flexible AI feedback structure (example)
ai_feedback: {
  analysis: "The user struggles with recall of key concepts, but shows strength in understanding relationships between ideas.",
  weaknesses: "Really stupid at X topic.",
  conclusion: "I need to review X topic more often, maybe with a different approach.",
}
```
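Since JSONB is schemaless, it's worth guarding the shape at runtime before trusting it. A minimal sketch, assuming the three fields from the example above (the interface and guard names are mine, not from the actual schema):

```typescript
interface AiFeedback {
  analysis: string;
  weaknesses: string;
  conclusion: string;
}

// Runtime type guard: narrows `unknown` (e.g. a parsed JSONB column) to AiFeedback.
function isAiFeedback(value: unknown): value is AiFeedback {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.analysis === "string" &&
    typeof v.weaknesses === "string" &&
    typeof v.conclusion === "string"
  );
}
```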
What I Learned
About AI Coding Tools
- They don't make you "lazy"βthey make you productive
- The quality of output depends on the quality of input (prompt engineering matters)
- Staying in the terminal is way more efficient than I expected
I've also learned how to prompt correctly! With the right context, the CLI enables 100% vibe-coding.
For a skilled developer (or an aspiring one), the real limit isn't how fast you can type but how clearly you can communicate your intent.
Here's an example of part of a prompt I used:

It's hard to see in this screenshot, but my method is:
- Explain the idea (or the issue) precisely
- Share the relevant file using the "@" symbol
- Add details if needed (like links)

Yeah, I'm not a professional 😅, but that was my "prompt cycle".
But it's important to have a good understanding of the underlying code and architecture. It's not about blindly accepting AI suggestions, but about guiding the AI with clear prompts and making informed decisions based on its output.
About programming in general
This challenge was a deep dive back into the frontend ecosystem. After focusing heavily on Backend architecture during my studies, this project was the perfect opportunity to bridge the gap and sharpen my React skills.
Modern Stack Discovery: it was my first time getting my hands dirty with Next.js 15 and shadcn/ui. The speed of iteration and the developer experience they provide are truly game-changing for a solo builder.
Infrastructure & Auth: Integrating Supabase felt seamless, though setting up Google OAuth provided a rewarding challenge in managing secure authentication flows.
AI Orchestration: finally using OpenRouter gave me the model flexibility I've always wanted. I had to learn prompt engineering on the fly; it was fascinating to see how subtle tweaks in instructions could drastically shift the AI's behavior (you can see my logic in `api/ai/generate-questions/route.ts`). There is something almost artistic about controlling AI output through code, and it made me deeply appreciate the importance of clear, structured communication with LLMs.
I can already see the immense potential for creating highly customized AI interactions in the future.
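To make "structured communication with LLMs" concrete, here is a hypothetical sketch of what a quiz-prompt builder can look like. The real logic lives in `api/ai/generate-questions/route.ts` and certainly differs; every name below is invented for illustration.

```typescript
// Assemble a quiz prompt from a note, optionally feeding back the AI's
// earlier analysis so it stays consistent across conversations.
function buildQuizPrompt(
  noteTitle: string,
  noteBody: string,
  previousFeedback?: string
): string {
  const parts = [
    "You are a study coach using active recall.",
    `Ask one focused question about the note "${noteTitle}" below.`,
    previousFeedback ? `Earlier analysis of this learner: ${previousFeedback}` : "",
    "---",
    noteBody,
  ];
  return parts.filter(Boolean).join("\n");
}
```

The optional `previousFeedback` parameter is the "contextual feedback loop" idea in miniature: prior conclusions are injected back into the context instead of being lost between sessions.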
What's Next?
After a long break, I will continue improving Echoflow based on user feedback, and I plan to add more features built around best practices for learning faster and more effectively with AI.
Final Thoughts
The ultimate irony: I spent my time learning development without AI, only to build a tool that helps me learn even more with it 🤖
Made with ❤️ (and lots of ☕) by Adam
Powered by GitHub Copilot CLI 🚀