Charlotte Towell

Desktop Music Player Using Qt for Python - Built with GitHub Copilot CLI

GitHub Copilot CLI Challenge Submission

This is a submission for the GitHub Copilot CLI Challenge

What I Built

I have been really into media ownership lately. I started digitising my vinyl collection this year (using a Raspberry Pi!) with the goal of abandoning Spotify. Through this, I discovered foobar2000 - a freeware audio player for local music files.

While it's good, I decided I wanted to customise my own player to have full control over the style and keep the functionality limited to my needs. Yes, I could have made a custom skin, but I've never ventured into desktop development before, and in the current age of AI, fully customised software is more accessible than ever - so why not make my own from scratch!

For this challenge, I built a customised audio player desktop application - specifically with an audio visualiser feature so that I can play it on a mini screen in my PC set-up.

Demo

Some of the key features of the application are:

  • viewing your library by album, artist, year or folder
  • integrated media controls using keyboard shortcuts, like the space bar for play & pause (see the sketch just after this list)
  • queue management, including the ability to go back to previously played tracks
  • detecting most supported audio file formats in a chosen destination folder
  • the audio visualiser + mini-player, which are my personal favs
  • both Windows & Linux support (or at least Raspberry Pi OS, which is what I tested myself)
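To give a flavour of the shortcut handling, here's a minimal sketch of how a space-bar play/pause binding can be wired up in PySide6. This is purely illustrative - the toggle_playback function is a placeholder standing in for the app's real playback engine:

```python
import sys
from PySide6.QtWidgets import QApplication, QMainWindow
from PySide6.QtGui import QKeySequence, QShortcut
from PySide6.QtCore import Qt

app = QApplication(sys.argv)
window = QMainWindow()

def toggle_playback():
    # Placeholder: the real app would pause/resume the audio engine here
    print("play/pause toggled")

# Bind the space bar to play/pause while the window has focus
shortcut = QShortcut(QKeySequence(Qt.Key_Space), window)
shortcut.activated.connect(toggle_playback)

window.show()
sys.exit(app.exec())
```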

You can check out the GitHub repo here:

charlottetowell / desktop-music-player

A desktop music player for local audio files - For the GitHub Copilot CLI Challenge

Desktop Music Player

A desktop audio player for Windows & Linux with real-time visualization, built with Python and Qt.

Built for the GitHub Copilot CLI Challenge hosted by dev.to. See my entry blog + demo here: Desktop Music Player Using Qt for Python - Built with GitHub Copilot CLI


Features 🎶

  • Cross-Platform: Windows & Linux support
  • OS Media Key Integration: Control playback with keyboard media keys
  • Mini player: A pop-out mini player window with audio waveform visualisation
  • Audio Playback: Full playback engine with controls and playback history
  • Queue Management: Add, reorder, and remove tracks with drag-and-drop
  • Library Scanner: Auto-discover audio files with metadata extraction (MP3, FLAC, WAV, OGG, M4A, AAC)
  • Real-Time Audio Visualizer: Waveform display with smooth animations

Tech Stack 💻

  • UI: PySide6 (Qt for Python)
  • Audio…
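To give an idea of how a real-time waveform display like this can work in PySide6, here's a minimal sketch - an illustrative stand-in rather than the repo's actual implementation. It assumes the playback engine pushes mono samples in the range [-1, 1], and simply repaints the latest samples on a timer:

```python
import numpy as np
from PySide6.QtWidgets import QWidget
from PySide6.QtGui import QPainter, QPen
from PySide6.QtCore import Qt, QTimer

class WaveformWidget(QWidget):
    """Draws the most recent audio samples as a mirrored-bar waveform."""

    def __init__(self, parent=None):
        super().__init__(parent)
        self._samples = np.zeros(512)
        # Repaint roughly 30 times a second for smooth animation
        self._timer = QTimer(self)
        self._timer.timeout.connect(self.update)
        self._timer.start(33)

    def push_samples(self, samples):
        # Called by the playback engine with the latest mono samples
        self._samples = np.asarray(samples)[-512:]

    def paintEvent(self, event):
        painter = QPainter(self)
        painter.setPen(QPen(Qt.white, 2))
        w, h = self.width(), self.height()
        mid = h / 2
        n = len(self._samples)
        for x in range(w):
            # Map each pixel column to a sample and draw a mirrored bar
            i = min(int(x * n / max(w, 1)), n - 1)
            y = abs(float(self._samples[i])) * mid
            painter.drawLine(x, int(mid - y), x, int(mid + y))
```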

My Experience with GitHub Copilot CLI

tl;dr: lots of planning for context + a very autonomous Copilot == fast iteration to usability!

My usual experience with AI-assisted coding is a hands-on, keep-it-on-a-short-leash kind of methodology: iterating fast on very small, tightly scoped micro-tasks, back and forth with the models.

For this project, to give Github Copilot CLI a real test, I approached it very differently by essentially staying in planning & design mode and letting it do all the heavy lifting with programming itself.

I have also never built a desktop application before. Most of my work is in web development or backend scripts, so it was exciting to see how fast something in a brand new (to me) stack could come together with the help of AI.

I approached my workflow with Github Copilot CLI in four stages:

  1. Planning & Design + Initial Project Setup
  2. Rapidly Building Out Features & Functionality (Ignoring UI)
  3. Design Refinement - Layout & Styling Polish
  4. Finalisation - Build + Documentation

Day One 🗓️

On day one, I spent my time drafting the features I wanted to include and working on a rough (to-be-replaced) design in Figma. I used different AI models such as Gemini to give myself an intro to desktop application development - which frameworks or libraries to use, and common gotchas to consider such as performance - all to inform my context for the project scope.

From this, I wrote my initial system prompt, chucked it back at the AI once more, and ended up with my final copilot-instructions.md file.
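For anyone curious what goes in such a file, here's an illustrative sketch of the kind of content - not my actual copilot-instructions.md, and the conventions listed are hypothetical examples:

```markdown
## Project
A cross-platform (Windows/Linux) desktop music player for local
audio files, built with Python and PySide6 (Qt for Python).

## Conventions (hypothetical examples)
- Keep UI code separate from the playback engine
- Prefer Qt signals/slots over polling
- Never block the UI thread; the waveform must render in real time
```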

This initial work set the stage well, as I then gave Copilot CLI a simple prompt to use its instructions and spin up the skeleton of the project:

As per the repository's copilot instructions, create the initial skeleton of the project including folder structures and a README explaining how to run locally

Empty repository with GitHub Copilot

Day Two 🗓️

Day two was all about building out the core functionality and my feature wish-lists. I ignored design for the moment here and focused purely on functionality. This consisted of giving Copilot more in-depth prompts about each specific function and then letting it roll to build it out in full.

This was similar to my usual back-and-forth approach with AI, but I found Copilot was competent enough to complete features in full a lot of the time when given a detailed prompt, often with only one follow-up needed to address any bugs found in a short test.
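To illustrate the level of detail (this is a made-up example, not one of my actual prompts), a feature prompt looked something like:

Implement the queue management feature: a queue panel listing upcoming tracks with drag-and-drop reordering, the ability to remove tracks, and a playback history so the previous-track control returns to recently played songs. Follow the existing project structure and conventions.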

Day 2 functionality screenshot

Day Three 🗓️

Day three was when I decided I was done with my initial Figma design and wanted to do it over entirely - which was fine, considering I'd spent the prior session ignoring design anyway. I did, however, keep a peach element in the logo as a nod to my original idea.

After creating a mock-up of my new design, I then uploaded screenshots to my repo and instructed Copilot to update the styling to match.

My finding here is that Copilot is not fantastic at taking a large overall design brief and applying it globally to the application. However, when briefed on smaller components one at a time (e.g. left panel, middle panel, right panel) and given some complementary written context, it does a passable job of scaffolding out the desired layout to match the design - and certainly a better one than prompting a model with no image input.

Screenshot of Figma designs

Finally 🏆

At last, once the app was in a good-enough-looking state for me, I used Copilot to help with the final project tasks, such as compiling the application for download and creating clear documentation. It even managed to debug some installation issues, including one where the librosa library's lazy loading wasn't compatible with our built application.

For installation, Copilot generated a build.spec file and Windows + Linux scripts to build the application using PyInstaller.
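For context, lazily loaded imports like librosa's are invisible to PyInstaller's static analysis, so the bundled app can fail at runtime with missing modules. A common fix is to collect the submodules explicitly as hidden imports. Here's a minimal sketch of what a spec file handling this might look like - not the exact build.spec Copilot generated, and the entry point and app name are assumptions:

```python
# build.spec (illustrative sketch; 'main.py' and the app name are assumptions)
from PyInstaller.utils.hooks import collect_submodules

# librosa loads its submodules lazily, so PyInstaller's static analysis
# misses them; declare them explicitly as hidden imports.
hidden = collect_submodules('librosa')

a = Analysis(
    ['main.py'],
    hiddenimports=hidden,
)
pyz = PYZ(a.pure)
exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.datas,
    name='desktop-music-player',
)
```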

Summary

This was a fun little project to make, and not something I would have attempted without AI in a format I'm unfamiliar with. Although there are a million features & improvements I could add, I am quite satisfied with it meeting the use case I wanted, which was ultimately to display a cute waveform on a screen in my PC set-up. I think the power of AI is not just in commercial use cases, but in enabling personal side projects like this to rapidly reach a point of completion, where you can create fully customised software for everyday life.

Image of my desk with desktop music player on mini screen
