DEV Community

Kokis Jorge

Stop Clicking MIDI Notes: Automating Creative Block with Python & AI

As a developer and music producer, I’ve always found it ironic that while I automate my deployment pipelines, I still spend hours manually clicking MIDI notes in my DAW (Digital Audio Workstation).
We often talk about "flow state" in coding. In music, it's the same. But nothing kills that flow faster than spending 45 minutes drawing hi-hat velocities or trying to come up with a chord progression when your brain is tired.
I didn't want AI to write the music for me or act as an AI Rap Generator. I wanted to build a stack that handles the "boilerplate code" of music production so I could focus on the creative logic.
Here is how I combined an AI MIDI Editor with a custom Python script to reduce my production friction by ~30%.

The Problem: Repetitive "Boilerplate" in Music

According to a survey by the MIDI Association, creators spend massive amounts of time on editing rather than composition. In developer terms: we are spending too much time writing configuration files and not enough time writing the core application logic.
I needed a way to:

  1. Generate Scaffolding: Get a basic rhythm or chord structure instantly.
  2. Humanize Programmatically: Apply "groove" without doing it manually for every note.
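The scaffolding side is easier to encode than it sounds: a chord is just a root note plus fixed semitone offsets. Here is a minimal pure-Python sketch (no libraries; the chord shapes and the Am-F-C-G progression are my own illustrative choices, not part of any tool's API):

```python
# Semitone offsets for basic triads.
CHORD_SHAPES = {
    "maj": [0, 4, 7],   # root, major third, perfect fifth
    "min": [0, 3, 7],   # root, minor third, perfect fifth
}

def chord_notes(root, quality):
    """Return the MIDI note numbers for a triad built on `root`."""
    return [root + offset for offset in CHORD_SHAPES[quality]]

def progression(chords):
    """Expand a list of (root, quality) pairs into lists of note numbers."""
    return [chord_notes(root, quality) for root, quality in chords]

# Am - F - C - G, a common pop/trap progression (A3 = MIDI note 57).
loop = progression([(57, "min"), (53, "maj"), (48, "maj"), (55, "maj")])
```

Once chords are plain lists of integers like this, generating variations (inversions, arpeggios) is list manipulation, not mouse work.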

Step 1: Generating the "Raw Data" (The AI Part)

I started looking for APIs or tools that could generate clean MIDI data. I needed something lightweight—I didn't want a heavy audio file, just the instruction set (MIDI).
I settled on OpenMusic AI for this part of the stack.
(Disclaimer: I’ve been using this tool heavily and integrated it into my workflow because it exports clean MIDI files).
Unlike tools that give you a finished audio loop (which is hard to edit), this tool generates the MIDI patterns. Think of it as create-react-app but for a rap beat or a melody. It gives me the structure, but I still own the code.
The Workflow:

  • Input parameters (Tempo: 140bpm, Mood: Dark, Genre: Trap).
  • Generate a 4-bar loop.
  • Export the .mid file.

Step 2: Processing the Data with Python (The Fun Part)

AI-generated MIDI is often "too perfect." Every note hits exactly on the grid at maximum velocity (127). It sounds robotic.
Instead of dragging velocity bars manually in Ableton or FL Studio, I wrote a simple Python script using the mido library to "humanize" the AI output before importing it.
Here is the logic:

  1. Load the AI-generated MIDI file.
  2. Iterate through note events.
  3. Apply a randomization function to velocity (loudness) and time (micro-timing).

The Script
You'll need to install mido: `pip install mido`

```python
import mido
import random

def humanize_midi(input_file, output_file, vel_variance=10, time_variance=5):
    mid = mido.MidiFile(input_file)
    # Preserve the source resolution, otherwise all timing math shifts.
    new_mid = mido.MidiFile(ticks_per_beat=mid.ticks_per_beat)

    for track in mid.tracks:
        new_track = mido.MidiTrack()
        new_mid.tracks.append(new_track)

        for msg in track:
            if msg.type in ('note_on', 'note_off'):
                # 1. Randomize velocity (dynamics), clamped to the
                #    legal MIDI range (0-127).
                variance = random.randint(-vel_variance, vel_variance)
                new_vel = max(0, min(127, msg.velocity + variance))
                msg = msg.copy(velocity=new_vel)

                # 2. Randomize time (groove): add a few ticks to
                #    simulate human error.
                # Note: this is a simplified example. Real timing
                # requires handling delta times carefully.
                if msg.time > 0:
                    time_jitter = random.randint(0, time_variance)
                    msg = msg.copy(time=msg.time + time_jitter)

            new_track.append(msg)

    new_mid.save(output_file)
    print(f"Humanized MIDI saved to {output_file}")
```

Usage

humanize_midi('ai_generated_beat.mid', 'humanized_beat.mid')
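The one property worth checking is that the jitter-and-clamp rule can never emit an illegal velocity. Since that rule is plain arithmetic, it can be verified without touching any MIDI files (the helper below just restates the script's clamp logic, with a seeded `random.Random` so the check is reproducible):

```python
import random

def clamp_velocity(velocity, variance, rng):
    """Apply the same jitter-and-clamp rule the humanizer uses."""
    return max(0, min(127, velocity + rng.randint(-variance, variance)))

rng = random.Random(42)  # fixed seed: same "humanization" every run
jittered = [clamp_velocity(v, 10, rng)
            for v in (0, 64, 127)   # edge cases: floor, middle, ceiling
            for _ in range(100)]
```

Seeding the generator is also how the "consistent album feel" works later: the same seed plus the same input gives the same groove.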

The Results: Optimization Metrics

By combining the AI generator for the "idea spark" and Python for the "cleanup," I turned a subjective process into a repeatable workflow.

  • Latency Reduced: Time from "blank project" to "working loop" dropped from ~45 mins to <10 mins.
  • Consistency: I can apply the exact same "humanization algorithm" to different tracks to keep a consistent album feel.

Conclusion

We often fear AI will replace creativity. But in my experience, using AI tools combined with your own scripting capabilities is just like using Copilot or ChatGPT for coding.
It doesn't write the symphony for you, but it handles the boring parts so you can focus on the music.
If you are a dev who makes music, I highly recommend trying to treat your MIDI files like data structures. It opens up a whole new world of production.
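To make that concrete: once a track is parsed, it is just a list of records, and all the usual data tooling applies. A tiny sketch with a made-up event list (the `(note, velocity)` pairs below are illustrative, not from any real file):

```python
from collections import Counter

# A parsed drum track as plain (note, velocity) records.
events = [(36, 110), (42, 60), (36, 104), (42, 58), (38, 96), (42, 55)]

pitch_counts = Counter(note for note, _ in events)
most_played, hits = pitch_counts.most_common(1)[0]
avg_velocity = sum(vel for _, vel in events) / len(events)
```

From here, "analyze my hi-hat density" or "find my quietest kick" is a one-liner, not a scrolling session in the piano roll.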
