
Vinicius Senger

Building Your Own AI-Powered StreamDeck with re:Button

Physical Computing Meets AI!

Imagine creating your own keyboard, but instead of regular keys, it triggers smart shortcuts that help you run your digital life. That's the re:Button project: it transforms any physical MIDI controller into an intelligent command center, similar to a StreamDeck, but open-source, DIY, and both simpler and more powerful.

In this post I'll show how I built my own re:Button "StreamDeck" device using a PreSonus ATOM (US$136), but you can use any other MIDI device, including a regular musical keyboard.

What You Can Do with re:Button

Here are some real examples from my setup:

  • Press a button to hear your bank balance via text-to-speech
  • Turn a knob to control system volume
  • Launch projects in Kiro with dedicated buttons for each workspace
  • Control Spotify (play, pause, next track) without leaving your keyboard
  • Trigger workflows like taking a photo, checking calendar, or sending MQTT commands to smart home devices
  • Type common responses ("yes", "no", "thanks") with a single button press

The possibilities are endless because you can map any button to any shell command, AppleScript, or Python script.

Check out the demo in this video:

Step-by-Step: Building Your re:Button

1. Get Your MIDI Controller

I chose the Presonus ATOM because it has:

  • 16 velocity-sensitive pads
  • 8 knobs for continuous control
  • 20 additional buttons
  • USB connectivity (no drivers needed)
  • Solid build quality

But you can use any MIDI controller: a $30 USB keyboard, a DJ controller, or even build your own with Arduino.

2. Plug in the USB Device

Connect your MIDI controller via USB. On macOS and Linux, it should work immediately. On Windows, you might need to install drivers (check your device manufacturer's website).

To verify it's working, you can list MIDI devices:

python3 -c "import mido; print(mido.get_input_names())"

You should see your device listed.

3. Using Kiro CLI to Create the First App

I used Kiro CLI, an AI-powered development assistant, to help build the initial version. Here's how:

kiro-cli chat

Then I asked:

"Help me create a Python script that listens to MIDI input and publishes the events to MQTT. I want to see the note number, velocity, and event type for each button press."

Kiro generated the first version of rebutton-mqtt.py:

#!/usr/bin/env python3
import mido
import paho.mqtt.client as mqtt
import json

MQTT_BROKER = "localhost"
MQTT_PORT = 1883
MQTT_TOPIC = "reos/key/atom"

def on_connect(client, userdata, flags, rc):
    print(f"Connected to MQTT broker: {rc}")

client = mqtt.Client()  # paho-mqtt 1.x callback API; on 2.x, pass mqtt.CallbackAPIVersion.VERSION1
client.on_connect = on_connect
client.connect(MQTT_BROKER, MQTT_PORT)
client.loop_start()

midi_port = mido.open_input(mido.get_input_names()[0])
print(f"Listening on {midi_port.name}...")

for msg in midi_port:
    if msg.type in ("aftertouch", "polytouch"):  # skip pressure messages
        continue
    payload = json.dumps({
        "type": msg.type,
        "note": getattr(msg, 'note', None),
        "velocity": getattr(msg, 'velocity', None),
        "control": getattr(msg, 'control', None),
        "value": getattr(msg, 'value', None)
    })
    client.publish(MQTT_TOPIC, payload)
    print(f"Sent: {payload}")

4. The MQTT Version Came First

Why MQTT? It gave me two benefits:

  1. Discovery: I could see exactly what MIDI events my controller was sending
  2. Integration: I could trigger actions from other systems (Node-RED, Home Assistant, etc.)

Running this script, I pressed each button and noted the MIDI note numbers:

{"type": "note_on", "note": 96, "velocity": 127}
{"type": "control_change", "control": 14, "value": 64}

This told me that pad #1 sends note 96, and knob #1 sends control change 14.
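To turn those discovery payloads into the keys used for mapping, you just need the event number and type. Here's a small helper I'm adding for illustration (it's not part of the original scripts; the sample payloads are the ones printed above):

```python
import json

# Discovery payloads as printed by rebutton-mqtt.py
payloads = [
    '{"type": "note_on", "note": 96, "velocity": 127}',
    '{"type": "control_change", "control": 14, "value": 64}',
]

def event_key(event):
    """Reduce a MIDI event dict to the (number, event_type) pair used for mapping."""
    number = event.get("note")
    if number is None:
        number = event.get("control")
    return (number, event["type"])

for raw in payloads:
    print(event_key(json.loads(raw)))
# (96, 'note_on')
# (14, 'control_change')
```

These pairs are exactly what the config file maps to commands in the next step.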

5. Mapping Buttons and Actions with a Markdown File

Instead of hardcoding mappings in Python, I created a simple markdown configuration file: rebutton-config.md

# re:button Config

## Quick Responses
- 109[control_change]=osascript -e 'tell application "System Events" to keystroke "y"'
- 107[control_change]=osascript -e 'tell application "System Events" to keystroke "n"'

## System Control
- 14[control_change]=osascript -e "set volume output volume $value"

## Launch Apps
- 96[note_on]=/Applications/Kiro.app/Contents/MacOS/Electron /Users/vsenger/code/renascence/reos-core
- 97[note_on]=/Applications/Kiro.app/Contents/MacOS/Electron /Users/vsenger/code/renascence/reos-money
- 92[note_on]=open -a "Microsoft PowerPoint"
- 93[note_on]=open -a Firefox

## Media Control
- 86[note_on]=open -a Spotify && osascript -e 'tell application "Spotify" to play'
- 87[note_on]=open -a Spotify && osascript -e 'tell application "Spotify" to pause'

## Smart Shortcuts
- 112[note_on]=balance=$(curl -s http://localhost:8080/entryResource/balance/BofA | jq -r '.balance'); python3 text_to_speech.py "Your balance is $balance"

The format is simple: note_number[event_type]=command

Variables like $value and $velocity get replaced with actual MIDI values.
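One detail worth noting: MIDI knobs send values from 0 to 127, while macOS's `set volume output volume` expects 0 to 100, so you may want to scale `$value` before substituting it. A sketch of such a substitution helper (my own addition, not part of the original scripts):

```python
def substitute(command, msg_attrs):
    """Replace $value/$velocity placeholders; $value is scaled from MIDI 0-127 to 0-100."""
    value = msg_attrs.get("value")
    if value is not None:
        command = command.replace("$value", str(round(value * 100 / 127)))
    velocity = msg_attrs.get("velocity")
    if velocity is not None:
        command = command.replace("$velocity", str(velocity))
    return command

# A knob at its midpoint (64) becomes volume 50:
print(substitute('osascript -e "set volume output volume $value"', {"value": 64}))
# osascript -e "set volume output volume 50"
```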

6. The Main Script: rebutton.py

Again, I used Kiro to generate the parser and executor:

#!/usr/bin/env python3
import mido
import re
import subprocess

def parse_config(filename):
    config = {}
    with open(filename) as f:
        for line in f:
            match = re.match(r'^-\s*(\d+)\[(\w+)\]=(.+)$', line.strip())
            if match:
                note, event_type, command = match.groups()
                config[(int(note), event_type)] = command.strip()
    return config

config = parse_config("rebutton-config.md")
midi_port = mido.open_input(mido.get_input_names()[0])
print(f"Listening on {midi_port.name}...")

for msg in midi_port:
    key = (getattr(msg, 'note', None) or getattr(msg, 'control', None), msg.type)
    if key in config and config[key]:
        command = config[key]
        command = command.replace('$value', str(getattr(msg, 'value', '')))
        command = command.replace('$velocity', str(getattr(msg, 'velocity', '')))
        print(f"Executing: {command}")
        subprocess.Popen(command, shell=True)

That's it! Less than 30 lines of Python.
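If you're curious how the parser treats individual lines, here's the same regex pulled out into a standalone function you can experiment with (an illustrative sketch, equivalent to the loop body in `parse_config` above):

```python
import re

def parse_line(line):
    """Parse one '- NOTE[event_type]=command' mapping line; return None for non-mapping lines."""
    match = re.match(r'^-\s*(\d+)\[(\w+)\]=(.+)$', line.strip())
    if not match:
        return None
    note, event_type, command = match.groups()
    return (int(note), event_type), command.strip()

key, command = parse_line('- 93[note_on]=open -a Firefox')
print(key, command)  # (93, 'note_on') open -a Firefox

# Headings and blank lines simply don't match, so they're ignored:
print(parse_line('## Launch Apps'))  # None
```

This is why the config can double as a readable markdown document: anything that isn't a mapping line is skipped.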

7. Using the Kiro CLI Agent to Improve Your Setup

The real power comes from using AI to enhance your setup. Here's how I use Kiro:

Discovering new commands:

> "Create a command that takes a screenshot and saves it to my Desktop"
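On macOS, the resulting mapping might look like the line below in rebutton-config.md. This is my own illustrative sketch: pad number 95 is hypothetical, and `screencapture` is the built-in macOS screenshot tool (`-x` suppresses the shutter sound).

```markdown
## Screenshots
- 95[note_on]=screencapture -x ~/Desktop/screenshot.png
```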

Building complex workflows:

> "Help me create a script that checks my calendar for the next meeting 
   and speaks the meeting title and time"

Optimizing button layouts:

> "I have 16 pads. Suggest an ergonomic layout for: 
   4 project launchers, 4 media controls, 4 quick responses, 
   and 4 smart shortcuts"

Generating API integrations:

> "Write a curl command to get my bank balance from my re:Money API 
   and format it for text-to-speech"

Kiro understands context, so you can iterate quickly:

> "Make that command shorter"
> "Add error handling"
> "Now create a version for my credit card balance"

Installation & Setup

Requirements

pip install mido python-rtmidi paho-mqtt

On macOS, you might also need:

brew install portmidi

Quick Start

  1. Clone or download the re:Button files
  2. Connect your MIDI controller
  3. Run the MQTT version to discover your button numbers:
   python3 rebutton-mqtt.py
  4. Create your rebutton-config.md file
  5. Run the main script:
   python3 rebutton.py
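One caveat with the quick start: both scripts open `mido.get_input_names()[0]`, i.e. the first MIDI device found. If you have more than one device connected, a small helper can pick a port by name instead. This is a sketch of my own; the substring "ATOM" is an assumption about how the PreSonus device reports its port name:

```python
def pick_port(names, keyword):
    """Return the first port name containing keyword (case-insensitive); fall back to the first port."""
    for name in names:
        if keyword.lower() in name.lower():
            return name
    return names[0]

# Usage with mido (requires a connected controller):
#   import mido
#   port = mido.open_input(pick_port(mido.get_input_names(), "ATOM"))

print(pick_port(["IAC Driver Bus 1", "ATOM", "Launchpad"], "atom"))  # ATOM
```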

Real-World Usage

After using re:Button for a few weeks, here's what I've learned:

Most-used buttons:

  • Project launchers (I have 4 different workspaces)
  • Volume control knob (so much better than keyboard shortcuts)
  • "Yes/No/Thanks" quick responses (saves typing in meetings)

Surprising use cases:

  • Taking quick photos with Photo Booth for video calls
  • Checking bank balance without opening apps
  • Controlling smart home devices via MQTT

What I'd improve:

  • Add visual feedback (LEDs on the pads)
  • Create profiles for different contexts (work, home, creative)
  • Build a web UI for easier configuration

Why This Matters

re:Button is part of a larger vision called Renascence Computing—rethinking how we interact with computers to serve our lives, not the other way around.

Physical buttons create a tangible interface to your digital life. Instead of context-switching between apps and memorizing shortcuts, you have dedicated controls for what matters most.

Combined with AI (via Kiro), you can rapidly prototype and evolve your interface. The barrier between "I wish I could..." and "Done!" shrinks to minutes.

Get Started

What Will You Build?

The beauty of re:Button is that it's infinitely customizable. Your button layout will be completely different from mine because your life and workflows are different.

What would you put on your buttons? Share in the comments!


This is part of the Renascence Computing project, a movement to rebuild our relationship with technology. Learn more at the Renascence GitHub Repository.
