It all started at 7:30 in the morning. Sunday. I made my breakfast, an egg 🥚 in a butty and a cup ☕ of cardamom Ahmed tea. Peak comfort. Peak genius mode. I was lurking around the internet like a bored raccoon trying to catch what's new, scrolling, scrolling, scrolling, until my brain suddenly went "yo... build something". And of course I listened to that voice, because that voice has gotten me into some pretty fun trouble before. I wanted to code something new, something silly, something cool, something that would make me go "🤣🤣 look at this weird thing I made".
And then the idea 💡 came back to me, like an old meme you forgot but still love. Ride inside my music. I have tons of unreleased tracks 😎 just sitting on my hard drive collecting dust like ancient artifacts, so why not use them? And I already knew Expo and React Native, so boom, decision made. Web, Android, iOS, full cross-platform madness. I also wanted tilt controls. Tilt on phone. Mouse on web. Tilt in the mobile browser too, because why not. Full chaos.
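Just to give an idea of what "tilt everywhere" means in practice, here's roughly what the input handling boils down to. This is a minimal sketch, assuming expo-sensors on native and device-orientation / mouse events on web; it skips the permission prompt iOS Safari needs for orientation events, and none of it is the actual game code.

```ts
// Minimal cross-platform tilt/mouse input sketch (illustrative, not the real game code)
import { Platform } from 'react-native';
import { Accelerometer } from 'expo-sensors';

// onMove receives a horizontal steering value, roughly in -1..1
export function listenForTilt(onMove: (x: number) => void) {
  if (Platform.OS === 'web') {
    // Mobile browsers: tilt via device orientation (gamma = left/right tilt in degrees)
    window.addEventListener('deviceorientation', (e) => onMove((e.gamma ?? 0) / 45));
    // Desktop browsers: just follow the mouse
    window.addEventListener('mousemove', (e) => onMove((e.clientX / window.innerWidth) * 2 - 1));
  } else {
    // Native (Expo): accelerometer x axis is roughly -1..1 when tilting the phone
    Accelerometer.setUpdateInterval(16);
    Accelerometer.addListener(({ x }) => onMove(x));
  }
}
```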
So I opened bolt.new because it supports Expo and gives you a nice base to work with, like a boilerplate but with vibes. I asked it to help me set up the project, and even in the very first step I added collision detection because I knew I was going to need it later anyway. Felt good to have that in place early instead of trying to duct-tape it on at the end like I usually do 😭.
🔊 Audio Analysis: Scan First, Upload Later
The plan was very clear:
- Analyze my tracks locally
- Extract all the frequency bands, envelopes, beats, and BPM (rough shape of that data sketched after this list)
- Save that analysis into a database so the game never has to analyze the audio again
- Instant geometry loading everywhere
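To make that concrete, this is roughly the shape of data the plan implies per track. Field names here are illustrative, not the exact schema I ended up with:

```ts
// Illustrative shape of one analyzed track (not the exact schema)
interface TrackAnalysis {
  trackId: string;     // matches the audio file uploaded later
  bpm: number;         // detected tempo
  beats: number[];     // beat timestamps in seconds
  frames: {
    time: number;      // frame timestamp in seconds
    bass: number;      // 0..1 energy in the low band
    mid: number;       // 0..1 energy in the mids
    treble: number;    // 0..1 energy in the highs
  }[];
  envelope: number[];  // overall loudness per frame, 0..1
}
```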
Bolt.new helped me configure Supabase. When everything was ready, I simply changed the database owner to my own Supabase project and it migrated automatically. Honestly one of the smoothest “move to my account plz” moments I’ve ever had.
Then my AI helper gremlins and I wrote a local script to:
- analyze each track
- generate geometry data
- upload that data to Supabase
Only after the analysis finished did the script upload the actual audio files to Supabase Storage. Correct order. Everything clean and organized (for once 😭).
This whole pipeline took maybe 2 hours and suddenly I had a full library of levels ready.
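For flavor, a minimal sketch of what that script does, assuming a Node script with supabase-js. The table and bucket names ('track_analysis', 'tracks') and the analyzeAudio helper are placeholders, not the actual code:

```ts
import { createClient } from '@supabase/supabase-js';
import { readFileSync } from 'fs';

// Stand-in for the actual local analysis (bands, beats, BPM, geometry data)
declare function analyzeAudio(path: string): Record<string, unknown>;

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!);

async function processTrack(trackId: string, audioPath: string) {
  // 1. Analyze locally and store the result first...
  const analysis = analyzeAudio(audioPath);
  const { error: dbError } = await supabase
    .from('track_analysis')
    .insert({ track_id: trackId, data: analysis });
  if (dbError) throw dbError;

  // 2. ...and only then upload the audio file itself to Storage
  const { error: uploadError } = await supabase.storage
    .from('tracks')
    .upload(`${trackId}.mp3`, readFileSync(audioPath), { contentType: 'audio/mpeg' });
  if (uploadError) throw uploadError;
}
```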
🌀 Drawing the Level: Frequency Torture Testing
Next I had to use the analyzed data to actually shape the tunnel. This part took a lot of testing:
- some frequencies made the tunnel too chaotic
- some were too soft
- some made everything look like melted spaghetti
But bolt.new made testing super fast: push code, it rebuilds, and I could test instantly via the Expo QR code or the web preview. After tweaking values across different tracks, I finally found combinations I liked (rough sketch after this list):
- bass = big movement
- kicks = tunnel squeeze
- treble = tiny flicks
- quiet moments = wide open space
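In code, the mapping felt roughly like this. The numbers are made up for illustration; the real ones came from all that per-track tweaking:

```ts
// Illustrative mapping from one audio frame to tunnel geometry (not the real values)
function tunnelShapeAt(frame: { bass: number; kick: number; treble: number; loudness: number }) {
  const baseRadius = 10;
  return {
    // quiet moments = wide open space, kicks = tunnel squeeze
    radius: baseRadius * (1.4 - 0.4 * frame.loudness) * (1 - 0.35 * frame.kick),
    // bass = big, slow movement of the whole tunnel path
    centerOffset: 3 * Math.sin(frame.bass * Math.PI),
    // treble = tiny flicks on the walls
    wallJitter: 0.3 * frame.treble,
  };
}
```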
When it started to look and feel fun, I knew I was on the right path.
🎮 Game Logic, Pages, UI, The Gameplay Skeleton
After the tunnel felt good, I moved on to building:
- main menu
- game screen
- scoring
- movement
- collision updates (rough check sketched after this list)
- restart logic
- loading screen
- small UI fixes
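For the collision updates, the core check is basically "is the ship still inside the tunnel?". A simplified sketch, assuming the player moves on the 2D cross-section of the tunnel (names are illustrative):

```ts
// Crash when the ship pokes outside the tunnel wall at the current moment
function hitsWall(player: { x: number; y: number; radius: number }, tunnelRadiusNow: number): boolean {
  const distFromCenter = Math.hypot(player.x, player.y);
  return distFromCenter + player.radius > tunnelRadiusNow;
}
```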
Eventually I switched from bolt.new to VS Code when I needed fine control over spacing, padding, margins, and a preloader.
🧠 Using Multi Agents in VS Code for the First Time
This was the first time I really used multi-agent workflows in VS Code, and honestly it clicked for me in a cool way.
I learned:
👉 One agent can “hold the context” for a task
👉 If I needed something totally different, I could open a new agent
👉 That way I didn’t confuse a single agent by jumping topics
So I had different agents working on:
- Agent 1: handling gameplay logic
- Agent 2: adjusting projectile speeds and physics
- Agent 3: fixing UI spacing
- Agent 4: generating helper components
- Agent 5: working on enemy patterns
It ended up feeling like I had a tiny chaotic dev team living inside my editor. Except nobody complained. And nobody asked for coffee ☕🤣
🎨 Getting Graphics: Enter Gemini Nanobanana 🍌✨
Time for visuals. I can’t draw anything, so I used Gemini Nanobanana to:
- generate a color palette
- generate assets using that palette
- export everything
- clean up in Photoshop (crop, resize, remove backgrounds)
- ask VS Code agents to replace the placeholder SVGs with the new PNGs (roughly like the snippet below)
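The swap itself is tiny; in React Native it's basically "render the PNG instead of the inline SVG". A sketch with placeholder paths and component names:

```tsx
import { Image } from 'react-native';

// Placeholder ship component after the swap: PNG asset instead of an inline SVG
export function PlayerShip() {
  return (
    <Image
      source={require('../assets/ship.png')} // path is illustrative
      style={{ width: 64, height: 64 }}
      resizeMode="contain"
    />
  );
}
```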
Suddenly the whole game came alive. Colorful. A little too neon. Just how I like it.
⚡ Adding Enemies, Powerups, Shields and Other Fun Stuff
Once the graphics looked good, it was time for more gameplay elements:
- enemies
- powerups
- shields
- visual effects
- hit reactions
- tiny details everywhere
Every time I thought “ok done”, my brain quietly said
“hmmm… but what if… more stuff?”
and then I added more stuff.
🏆 Leaderboard Time
Since levels are endless (the music loops), I wanted high scores and user profiles.
I went back to bolt.new, asked it to set up Supabase Auth, made a few policy changes, and soon:
- users got auto-created
- they could update their display name
- leaderboard worked great
One of the easiest setups of the whole day, thanks to AI helping me configure everything.
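Client-side, the whole leaderboard boils down to a couple of supabase-js calls. A rough sketch, assuming a 'scores' table with user_id, track_id, display_name and score columns (all names are placeholders, and the actual access rules live in the RLS policies on the Supabase side):

```ts
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.EXPO_PUBLIC_SUPABASE_URL!,
  process.env.EXPO_PUBLIC_SUPABASE_ANON_KEY!
);

// Submit a run: one row per user per track, simply overwriting the previous one
async function submitScore(trackId: string, displayName: string, score: number) {
  const { data: { user } } = await supabase.auth.getUser();
  if (!user) return;
  await supabase
    .from('scores')
    .upsert(
      { user_id: user.id, track_id: trackId, display_name: displayName, score },
      { onConflict: 'user_id,track_id' }
    );
}

// Top 10 for the leaderboard screen
async function topScores(trackId: string) {
  const { data } = await supabase
    .from('scores')
    .select('display_name, score')
    .eq('track_id', trackId)
    .order('score', { ascending: false })
    .limit(10);
  return data ?? [];
}
```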
🚀 Final Build, Deploy and Bedtime
Once everything worked:
- built the app
- pushed to GitHub
- deployed the web version on GitHub Pages
- tested on phone
- fixed small bugs
By the time I finished, it was already 22:00. I was tired, happy, and a bit confused about how fast the day went. I tried the game a few times in bed, smiled at my own weird creation, and fell asleep 😴
Final thoughts
Could I spend more time polishing? Yes.
Fix rendering and loading issues? Sure.
Add more ships, enemies, power-ups? Absolutely.
Turn it into a full commercial game? Maybe one day.
But for a single Sunday coding session, this is insane.
And honestly, I’m super happy with what I built. AI didn’t replace me, it amplified me. Gave me momentum. Removed boring stuff. Let me stay in “flow mode.”
This Sunday was one of my most fun coding days in a long time.
And now I can literally ride inside my own music.
Life = good ❤️🎧🚀
