In 1979 Factory Records released Unknown Pleasures, the debut album by Joy Division (catalogue number FACT 10). The artwork is credited to both the band & Peter Saville. The cover uses an image of stacked radio pulses from the pulsar CP 1919. The background is black (instead of white) because, as Saville put it, "I was convinced that it was just sexier in black". It is widely considered one of the best albums of all time.
I'm obsessed with this album. So much so that for some time I've been trying to build an application that would use the album's songs as the data for the waveform. Recently I started working on it again &, with some help from Claude, a path was made.
The journey here has been filled with obstacles spanning more than a decade. I first tried using Flash to process the sound & draw SVGs for the graph. I got ~25% of the structure working before I hit a processor wall. Flash was such a processor hog that the combination of processing audio & drawing SVGs crashed the browser. It was so bad that I stopped coding & re-read Steve Jobs's open letter "Thoughts on Flash".
I tried again when the Web Audio API was first released. While it solved some problems, I still could not get past the processor wall. In my frustration, I decided to let this one go.
Until recently…
I'm a big fan of Claude AI. The developer experience is amazing. So amazing that I dusted myself off from the previous failures and typed:
"Create an app that takes an audio file and interprets the sound to look like Joy Division's Unknown Pleasures album art"
Within seconds it gave me a working structure. As I ran the initial code, I realized that Claude AI had put me back exactly where I left off. I was so excited that I remember screaming "Holy S#*t!". It was the first time in a long time that the computer made me scream with joy instead of frustration.
Day 1
Claude AI guided me through the transformation of a standard audio visualizer into a stylized waveform effect. We started with a basic frequency visualization and methodically reshaped it through a series of targeted modifications. The process involved implementing stacked waveforms with proper perspective, adding time-based history effects, and fine-tuning the symmetry of each line to create the distinctive ridge-like patterns characteristic of the album art.
The collaboration exemplified an iterative design approach, with each adjustment building upon previous changes to refine the visualization. We focused on technical aspects like frame rate control, line opacity, and spacing while simultaneously addressing aesthetic considerations. The final result balanced visual fidelity to the inspiration with practical performance considerations, creating a functional audio visualizer with distinctive artistic flair.
Key modifications included:
- Implemented stacked waveform patterns with perspective scaling
- Created a time-history buffer to display waveforms from different moments
- Added symmetry to ensure waveforms begin and end at the same position
- Tuned frame rate to 30fps for smooth animation
- Adjusted line thickness, opacity, and spacing for visual clarity
- Fine-tuned dimensions to 300px × 600px with proper padding
- Applied smooth curve interpolation for natural-looking waveforms
- Optimized the number of lines (63) to create the ideal density
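The core of the list above can be sketched in plain JavaScript. This is my own reconstruction, not the project's actual code; names like `shapeLine` and `pushHistory` are hypothetical. `shapeLine()` applies a sine window so each waveform begins and ends at the baseline (the symmetry item), `perspectiveScale()` shrinks older lines to fake depth, and `pushHistory()` keeps a fixed-size buffer of past snapshots for the time-history effect:

```javascript
const NUM_LINES = 63; // the line count the post settled on

// Taper a raw frequency snapshot with a sine window so the line
// starts and ends flat at the baseline, like the album art's ridges.
function shapeLine(samples) {
  const n = samples.length;
  return samples.map((v, i) => v * Math.sin((Math.PI * i) / (n - 1)));
}

// Older lines (higher index) are drawn smaller to suggest perspective.
function perspectiveScale(lineIndex, totalLines = NUM_LINES) {
  return 1 - 0.5 * (lineIndex / (totalLines - 1));
}

// Fixed-size buffer of past snapshots: newest first, oldest dropped.
function pushHistory(history, snapshot, maxLines = NUM_LINES) {
  history.unshift(shapeLine(snapshot));
  if (history.length > maxLines) history.pop();
  return history;
}
```

In the browser, a `requestAnimationFrame` loop (throttled to 30fps) would pull fresh data from a Web Audio `AnalyserNode`, push it into the history, and draw each buffered line scaled by `perspectiveScale()`.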
Day 2
We began this day with the file structure from the previous day, which Claude AI itself labeled "suboptimal JavaScript mixed into an HTML file."
Through collaboration, we transformed the project into a well-structured application with properly separated HTML, CSS, and JavaScript files. We improved the user experience by adding play/pause controls, loading indicators, and a smooth fade-out animation effect when stopping playback. Additionally, we optimized font rendering with modern techniques, adjusted the visualization dimensions, removed unused code, and provided options for customizing the line appearance. The day's final product maintained the album's visual aesthetic while being more maintainable, user-friendly, and performant.
Key improvements:
- Externalized JavaScript code into a separate index.js file
- Added proper audio handling with user-initiated playback (addressing autoplay policies)
- Implemented elegant fade-out animation for waveforms on stop
- Optimized font rendering with advanced CSS techniques
- Cleaned up unused variables and added customization options for line width
- Improved the responsive design and overall visual aesthetics (Claude AI excelled here)
- Enhanced error handling and user feedback throughout the application
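The fade-out on stop can be sketched as a pure easing function. This is an assumption about how it might work, not the post's actual implementation; `makeFader` is a name I made up. Each animation frame asks the fader for the current global alpha until it reaches zero:

```javascript
// Returns a function mapping a timestamp to a global alpha in [0, 1].
// The first call records the fade's start time; alpha then eases from
// 1 down to 0 over durationMs and clamps at 0.
function makeFader(durationMs = 600) {
  let start = null;
  return function alphaAt(nowMs) {
    if (start === null) start = nowMs;
    const t = Math.min((nowMs - start) / durationMs, 1);
    return 1 - t * t; // quadratic ease: the fade accelerates at the end
  };
}
```

In the browser this would be driven by `requestAnimationFrame`: set `ctx.globalAlpha = fade(timestamp)` before redrawing the lines, and cancel the loop (and clear the canvas) once alpha hits zero.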
Day 3
We started with an audio visualizer that loaded audio on initialization and displayed a standard "Play" button once loading completed. The original code showed the button immediately on page load with a simple text label, without any special animations or transitions. When clicked, the visualizer would display animated waveforms synced to the audio, with a basic toggle between "Play" and "Stop" button states.
Through our modifications, we enhanced the user experience by:
- Making the button initially invisible and fading it in only after audio loading completes
- Adding a sequential text animation where each letter of "Play" or "Stop" appears one at a time
- Optimizing the animation timing for a smoother effect
- Improving the code structure with proper async/await patterns
- Moving DOM initialization to after content loads for better reliability
- Fixing JSX compatibility issues for modern bundlers like Vite
The end result is a polished, engaging interface where the button appearance and text reveal are choreographed animations that enhance the visual appeal of the audio player, while maintaining all the functionality of the waveform visualization.
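The sequential text animation can be sketched as a staggered schedule. Again this is my own reconstruction under assumptions; `letterSchedule`, `revealText`, and the 80ms stagger are hypothetical, not the project's real values:

```javascript
const LETTER_STAGGER_MS = 80; // assumed per-letter delay

// Assign each letter of the label a delay offset from the start.
function letterSchedule(label, stagger = LETTER_STAGGER_MS) {
  return [...label].map((ch, i) => ({ ch, delayMs: i * stagger }));
}

// Reveal the label one letter at a time using the async/await pattern
// the post mentions. In the browser, show() would append a <span>
// for each letter to the button element.
async function revealText(label, show, stagger = LETTER_STAGGER_MS) {
  for (const { ch } of letterSchedule(label, stagger)) {
    show(ch);
    await new Promise((resolve) => setTimeout(resolve, stagger));
  }
}
```

Wiring this to run only after the audio has loaded (and after `DOMContentLoaded`) gives the fade-in-then-spell-out choreography described above.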
Through our iterations, we were able to transform my idea into a polished application.
Claude AI was a huge help, without a doubt. In the first two days I had many moments of pure joy & amazement. I was in a state of creative flow that I rarely find in software development; I liken it to painting, composing, or creative writing. I didn't feel any walls closing in. Instead I felt a vast, open space where anything was possible. All I needed was my imagination.
On the third day the walls started closing in. The creative flow was off & the code was error-filled. This is when I stopped the collaboration and finished it on my own. Lucky for me, I was ~80% done & the remaining work was mostly testing, fine-tuning, and UI tweaks!
Conclusion
I was extremely happy with Claude's developer experience. As my prompts grew in complexity, Claude kept learning and gave some amazing recommendations, some of which I had not even considered until Claude suggested them. The gradual opacity shift of the waveform, for example, was a small tweak, but it had a major impact on the UX.
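For the curious, that opacity shift could look something like this minimal sketch (my own reconstruction; `lineOpacity` and its constants are assumptions, not the suggestion Claude actually produced):

```javascript
// Fade older lines in the stack linearly, so the newest waveform is
// fully opaque and the oldest is nearly invisible.
function lineOpacity(lineIndex, totalLines = 63, minAlpha = 0.05) {
  const t = lineIndex / (totalLines - 1); // 0 = newest, 1 = oldest
  return 1 - t * (1 - minAlpha);
}
```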
The one note I can share about this exercise is that you can't expect AI to "get it right" from a single prompt. It took me ~3 days, and a willingness to rework my prompt strategies, before I found some amazing moments.
I highly recommend following the approach I've outlined if you need to do any rapid prototyping, or vetting of creative ideas where integrating various technologies is part of the challenge. Especially now that Anthropic has released Claude 3.7 Sonnet & Claude Code. The developer experience (at the time of writing) is amazing, and I believe these tools are a must for any creative technologist or software engineer with a big imagination.
Thank you & I hope you find my use case helpful.
Antonio Almena | https://antonio.almena.io