pizidavi
Jurnal: A Voice-First Journal App with On-Device Whisper + LLM Note

GitHub Copilot CLI Challenge Submission

This is a submission for the GitHub Copilot CLI Challenge

What I Built

Jurnal is a simple mobile journaling app that makes it easy to capture "what happened today" using your voice. Instead of starting from a blank page, you record a short voice note, and the app turns it into a clean, structured journal entry you can skim later.

Jurnal is a Romanian term that translates into English as diary or journal.

The core flow is:

  • Record audio in-app
  • Transcribe locally on-device using a Whisper model
  • Send the transcription to a larger LLM that turns it into a polished Markdown note, with sections like summary, highlights, and intentions

What I like about this approach is the combination of privacy-first transcription with the "writing assistant" part happening after: I still speak naturally, and the app handles organizing the content into something readable.
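To make the enrichment step concrete, here is a minimal sketch of how a transcript could be turned into a prompt for the LLM. The function name and section wording are illustrative assumptions, not the app's actual implementation:

```typescript
// Hypothetical helper: wrap a raw voice transcript in instructions that ask
// the LLM for a structured Markdown journal entry. The section names mirror
// the summary/highlights/intentions layout described above.
function buildNotePrompt(transcript: string): string {
  return [
    "Rewrite the following voice transcript as a Markdown journal entry",
    "with these sections: ## Summary, ## Highlights, ## Intentions.",
    "Keep the author's voice and do not invent events.",
    "",
    "Transcript:",
    transcript,
  ].join("\n");
}
```

The transcript stays on-device until this step; only the finished prompt is sent to the online model.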

Tech stack:

  • React Native with Expo
  • react-native-executorch to run the local transcription model
  • @tanstack/ai for the online LLM integration
  • TailwindCSS by Uniwind for styling

Future improvements I'm excited about:

  1. Custom templates for different styles
  2. Choosing custom models for transcription and enrichment
  3. Manual editing of the final note in-app
  4. Smarter search with local RAG
  5. A background queue for long-running processing
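For item 5, one way a background queue could look is a simple sequential worker that drains jobs one at a time. This is a minimal sketch with illustrative names, assuming nothing about the app's real architecture:

```typescript
// A tiny in-memory queue: jobs (e.g. transcription + enrichment) are enqueued
// and processed strictly one after another, so a long-running job never
// blocks the UI thread from adding more work.
type Job = () => Promise<void>;

class ProcessingQueue {
  private jobs: Job[] = [];
  private running = false;

  enqueue(job: Job): void {
    this.jobs.push(job);
    if (!this.running) void this.drain();
  }

  private async drain(): Promise<void> {
    this.running = true;
    while (this.jobs.length > 0) {
      const job = this.jobs.shift()!;
      try {
        await job(); // run jobs sequentially, in enqueue order
      } catch (err) {
        console.warn("job failed", err); // a failed job must not stall the queue
      }
    }
    this.running = false;
  }
}
```

A real version would also persist pending jobs so they survive an app restart.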

Demo

πŸ”— Repo: https://github.com/pizidavi/jurnal-app

Screenshots: Home screen · Recording a new note · Note screen · Settings screen

My Experience with GitHub Copilot CLI

Copilot CLI was most helpful when I treated planning as a first-class step, not a formality. The biggest productivity boost came from writing a tight plan, implementing in small slices, and then letting the AI validate (and correct) itself.

To support that workflow, I made the project intentionally strict: ESLint rules are restrictive, TypeScript runs in strict mode with additional custom strict types, and I pushed the assistant to run the lint and typecheck commands before considering anything done. That created fast feedback loops and prevented subtle regressions from piling up.

I also built a custom Orchestrator agent that follows a plan β†’ implement β†’ review cycle and repeats until the plan is completed. It was really fun to see the agents correct themselves when they missed a step or introduced a bug.
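The Orchestrator's cycle can be sketched as a small loop. All names here are hypothetical, just to illustrate the plan → implement → review shape described above:

```typescript
// Illustrative orchestrator loop: for each plan step, keep implementing and
// reviewing until the reviewer accepts it, then move to the next step.
type Step = { description: string; done: boolean };

async function orchestrate(
  plan: Step[],
  implement: (step: Step) => Promise<void>,
  review: (step: Step) => Promise<boolean>,
): Promise<void> {
  for (const step of plan) {
    while (!step.done) {
      await implement(step);          // agent attempts the step
      step.done = await review(step); // reviewer accepts or sends it back
    }
  }
}
```

The self-correction the post describes falls out of the inner loop: a rejected review simply triggers another implementation pass.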

I'm extremely positive about the future of Copilot CLI: the UI is good and the interaction model fits real development.
One thing I'd love to see next is a persistent allowlist for safe commands (like lint/typecheck/format), so that I don't have to approve them every time.

Have fun!

Top comments (1)

Ryan Lay

I think it's pretty awesome that you created a custom orchestrator for this project! A nice transcription formatter is a pretty rad use case too! Can't wait to see where it goes next!