monkeymore studio

A MIDI Player That Shows You the Sheet Music and Lets You Play Along on a Virtual Piano

Most MIDI players are just glorified MP3 players with a progress bar. You hit play, you hear music, and that's it. But what if you actually want to see what's being played? What notes, what chords, what key signature? And what if you want to play along?

I built a browser-based MIDI and MusicXML player that does all of that. It renders proper sheet music from your file, highlights each note as it's played, displays a virtual piano keyboard that lights up in real time, and lets you switch between 28 different instruments — from grand piano to shanai. All without uploading a single byte to a server.

Try it out on our free online MIDI player.

Why Build This in the Browser?

MIDI files are small, but they're not audio. They're instructions. To hear them, you need a synthesizer. To see them, you need a notation renderer. Most tools only do one or the other. Desktop apps that do both (like MuseScore) are powerful but heavyweight. Online sequencers force you to create accounts and upload your files.

A browser-based player solves all of this:

  • No uploads — your MIDI files stay on your device
  • No installation — works on any device with a modern browser
  • Instant playback — the synthesizer loads in seconds
  • Visual feedback — see the notes on a staff and a piano keyboard simultaneously

The Full Pipeline

Here's what happens from file drop to playback:

  • Parse: @tonejs/midi decodes MIDI; DOMParser reads MusicXML
  • Normalize: MIDI files are converted to MusicXML for notation
  • Render: OpenSheetMusicDisplay draws the score as SVG
  • Schedule: Tone.js triggers SoundFont samples on a transport timeline
  • Sync: the cursor, note highlights, and virtual keyboard follow the audio

Dual Format Support: MIDI and MusicXML

The player accepts two formats. MIDI is the universal language of digital music — compact, widely supported, but not human-readable. MusicXML is the standard for sheet music exchange — verbose, but carries notation semantics that MIDI simply doesn't have (like ties, beams, and articulations).

Parsing MIDI

For MIDI files, we use @tonejs/midi to decode the binary format:

const { Midi } = await import('@tonejs/midi');
const arrayBuffer = await selectedFile.arrayBuffer();
const midi = new Midi(arrayBuffer);
const track = midi.tracks.find((t) => t.notes.length > 0);
if (!track) throw new Error('No notes found in this MIDI file');

const parsedNotes = track.notes.map((n) => ({
  time: n.time,
  duration: n.duration,
  note: n.name,
  velocity: n.velocity,
}));

This gives us a clean array of note events with timing in seconds, note names like "C4" or "F#5", and velocity values for dynamics.

Parsing MusicXML

For MusicXML, we use the browser's built-in DOMParser:

function parseMusicXML(xmlText: string): NoteEvent[] {
  const parser = new DOMParser();
  const doc = parser.parseFromString(xmlText, 'application/xml');
  const notes: NoteEvent[] = [];
  const parts = doc.querySelectorAll('part');
  let globalBpm = 120;

  for (const part of parts) {
    const measures = part.querySelectorAll('measure');
    let divisions = 4;
    let measureTime = 0;

    for (const measure of measures) {
      const attr = measure.querySelector('attributes');
      if (attr) {
        const divElem = attr.querySelector('divisions');
        if (divElem) divisions = parseInt(divElem.textContent || '4');
      }
      const soundTempo = measure.querySelector('sound[tempo]');
      if (soundTempo) {
        globalBpm = parseInt(soundTempo.getAttribute('tempo') || '120');
      }

      const measureNotes = measure.querySelectorAll('note');
      let currentTick = 0;
      let lastNoteTime = 0;

      for (const note of measureNotes) {
        const durationTicks = parseInt(note.querySelector('duration')?.textContent || '0');
        const isChord = note.querySelector('chord') !== null;
        const isRest = note.querySelector('rest') !== null;

        if (isRest) { currentTick += durationTicks; continue; }

        // Build a note name like "C4" or "F#5" from the <pitch> element
        const pitch = note.querySelector('pitch');
        const step = pitch?.querySelector('step')?.textContent || 'C';
        const alter = parseInt(pitch?.querySelector('alter')?.textContent || '0');
        const octave = pitch?.querySelector('octave')?.textContent || '4';
        const midiNoteName = step + (alter === 1 ? '#' : alter === -1 ? 'b' : '') + octave;

        const noteDuration = (durationTicks / divisions) * (60 / globalBpm);
        if (isChord) {
          // Same onset time as the previous note, different pitch
          notes.push({ time: lastNoteTime, duration: noteDuration, note: midiNoteName, velocity: 0.8 });
        } else {
          lastNoteTime = measureTime + (currentTick / divisions) * (60 / globalBpm);
          notes.push({ time: lastNoteTime, duration: noteDuration, note: midiNoteName, velocity: 0.8 });
          currentTick += durationTicks;
        }
      }
      }
      measureTime += (currentTick / divisions) * (60 / globalBpm);
    }
  }
  return notes;
}

The parser walks through each <part>, then each <measure>, extracting <note> elements. It handles <chord> tags (simultaneous notes), <rest> tags (silence), and <sound tempo> directives (BPM changes). Duration values are converted from MusicXML's division-based ticks into absolute seconds using the formula: (durationTicks / divisions) * (60 / BPM).
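To make the conversion concrete, here's that formula as a standalone helper (the function name is illustrative, not from the player's source):

```typescript
// Convert a MusicXML duration in ticks to seconds.
// divisions = ticks per quarter note, bpm = quarter notes per minute.
function ticksToSeconds(durationTicks: number, divisions: number, bpm: number): number {
  return (durationTicks / divisions) * (60 / bpm);
}

// With divisions = 4, a quarter note is 4 ticks:
ticksToSeconds(4, 4, 120); // 0.5 — a quarter note at 120 BPM lasts half a second
ticksToSeconds(2, 4, 120); // 0.25 — an eighth note lasts a quarter second
```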

Converting MIDI to MusicXML for Rendering

MIDI files don't contain notation semantics. To render sheet music from a MIDI file, we need to generate MusicXML ourselves. The logic is similar to what we use in the MIDI-to-sheet tool:

function generateMusicXMLFromMidi(midi: any): string {
  const ppq = midi.header.ppq || 480;
  const timeSig = midi.header.timeSignatures[0] || { timeSignature: [4, 4] };
  const [beats, beatValue] = timeSig.timeSignature;
  const ticksPerMeasure = beats * (ppq * 4 / beatValue);

  // Gather notes from every track, ordered by start tick
  const sortedNotes = midi.tracks
    .flatMap((t: any) => t.notes)
    .sort((a: any, b: any) => a.ticks - b.ticks);

  // Group near-simultaneous notes into chords
  const chordTolerance = Math.max(1, Math.floor(ppq / 64));
  const events = [];
  for (const note of sortedNotes) {
    const existing = events.find(e => Math.abs(e.ticks - note.ticks) <= chordTolerance);
    if (existing) existing.notes.push(note);
    else events.push({ ticks: note.ticks, durationTicks: note.durationTicks, notes: [note] });
  }

  // Split into measures
  const measures = [];
  // ...measure splitting logic...

  // Build XML string
  return `<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<score-partwise version="3.1">
  <part id="P1">
    ${measuresXml}
  </part>
</score-partwise>`;
}

This ensures that whether you upload a MIDI or a MusicXML file, the sheet music viewer always has valid MusicXML to work with.

Rendering Sheet Music with OpenSheetMusicDisplay

Once we have MusicXML (either parsed directly or generated from MIDI), we render it using opensheetmusicdisplay (OSMD):

const osmd = new OpenSheetMusicDisplay(container, {
  autoResize: false,
  backend: 'svg',
  pageFormat: 'A4_P',
  cursorsOptions: [{ type: 0, color: '#3b82f6', alpha: 0.25, follow: true }],
});
await osmd.load(musicXml);
osmd.render();

OSMD produces beautiful SVG sheet music. We configure it for A4 portrait format with SVG backend for crisp scaling. The cursorsOptions sets up a blue semi-transparent cursor that follows playback — we'll drive this cursor programmatically.

Pagination

Long pieces don't fit on one screen. After rendering, we detect how many pages OSMD generated and show only one at a time:

const pageDivs = container.querySelectorAll('[id^="osmdCanvasPage"]');
const totalPages = pageDivs.length || 1;
pageDivs.forEach((div, idx) => {
  (div as HTMLElement).style.display = idx === 0 ? 'block' : 'none';
});

Navigation buttons let users flip between pages manually, but during playback, the page flips automatically as the cursor moves.
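The page-flip helper itself isn't shown in the article; a minimal sketch of the logic, using a plain structural type in place of DOM element types so it stands alone, might look like this:

```typescript
type PageDiv = { style: { display: string } };

// Compute which OSMD page div should be visible. pageNum is 1-based
// (matching OSMD's cursor.currentPageNumber) and clamped so bad input
// can never hide every page.
function pageVisibility(totalPages: number, pageNum: number): boolean[] {
  const target = Math.min(Math.max(pageNum, 1), totalPages);
  return Array.from({ length: totalPages }, (_, idx) => idx === target - 1);
}

// Apply the visibility to the page containers.
function showSheetPage(pageDivs: PageDiv[], pageNum: number): void {
  const visible = pageVisibility(pageDivs.length, pageNum);
  pageDivs.forEach((div, idx) => {
    div.style.display = visible[idx] ? 'block' : 'none';
  });
}
```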

The Sound Engine: Real Instruments in Your Browser

Most browser-based MIDI players use cheap synthesized tones. We wanted real instrument sounds. The solution is Tone.js with SoundFont samples.

The Instrument System

We define 28 instruments, each mapping to a General MIDI SoundFont:

const SOUND_FONT_BASE_URL =
  "https://cdn.jsdelivr.net/gh/gleitz/midi-js-soundfonts@master/FluidR3_GM";

const SAMPLER_URLS = {
  C2: "C2.mp3", C3: "C3.mp3", C4: "C4.mp3",
  C5: "C5.mp3", C6: "C6.mp3", C7: "C7.mp3",
};

function createSampler(soundFontName: string, volumeDb = -6) {
  return (Tone: any) => {
    const sampler = new Tone.Sampler({
      urls: SAMPLER_URLS,
      baseUrl: `${SOUND_FONT_BASE_URL}/${soundFontName}-mp3/`,
    }).toDestination();
    sampler.volume.value = volumeDb;
    return sampler;
  };
}

Here's the clever part: we only load 6 samples per instrument — C2, C3, C4, C5, C6, C7. Tone.js's Sampler repitches these to cover the full 88-key range. This keeps load times fast (under a second per instrument) while still sounding realistic. The samples come from the public-domain FluidR3_GM SoundFont, hosted on jsDelivr CDN.
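To see why six samples suffice, here's a rough sketch of the repitching arithmetic a sampler performs: pick the nearest sample, then shift its playback rate by the semitone distance. This is a simplified illustration, not Tone.js's actual implementation.

```typescript
// MIDI numbers of the six loaded samples: C2=36, C3=48, ..., C7=96.
const SAMPLE_MIDI = [36, 48, 60, 72, 84, 96];

// Shifting a recording by n semitones multiplies its playback rate by 2^(n/12).
function repitch(targetMidi: number): { sample: number; rate: number } {
  const sample = SAMPLE_MIDI.reduce((best, s) =>
    Math.abs(s - targetMidi) < Math.abs(best - targetMidi) ? s : best
  );
  return { sample, rate: Math.pow(2, (targetMidi - sample) / 12) };
}

repitch(64); // E4 → the C4 sample, sped up by four semitones
```

Keeping every target note within six semitones of a real sample is what keeps the repitching artifacts inaudible.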

Playback Scheduling with Tone.js

Tone.js handles the timing. We group notes into chords and schedule them on a Tone.Part:

const chords = groupNotesToChords(notes);
const partData = chords.map((c) => [c.time, c]);

const part = new Tone.Part((time: number, chord: ChordEvent) => {
  synthRef.current?.triggerAttackRelease(
    chord.notes, chord.duration, time, chord.velocity
  );
  Tone.Draw.schedule(() => {
    advanceCursor();
  }, time);
}, partData);
part.start(0);
Tone.Transport.start();

Tone.Draw.schedule is the bridge between audio time and DOM updates. It ensures the sheet music cursor advances in sync with the audio, even during tempo changes or buffer underruns.

The Virtual Piano Keyboard

The player includes a fully interactive piano keyboard that shows which notes are currently sounding:

<PianoKeyboard
  activeNotes={activePianoNotes}
  instrument={instrument}
  onInstrumentChange={setInstrument}
/>

The keyboard has 36 white keys (C2 to C7) and 25 black keys. It's built with pure CSS — no images, no Canvas, just carefully positioned div elements with gradient backgrounds and box shadows to simulate the 3D depth of real piano keys.

Responsive Sizing

The keyboard adapts to its container width using a ResizeObserver:

const computeSize = useCallback(() => {
  if (!wrapRef.current) return;
  const width = wrapRef.current.clientWidth;
  const wkeyHeight = (width / 36) * 7;  // 36 white keys, each with a 1:7 width-to-height ratio
  const bkeyHeight = wkeyHeight * 0.7;  // black keys are 70% as tall (applied to the black-key elements)
  wrapRef.current.style.height = `${wkeyHeight}px`;
}, []);

Each white key is 100/36 ≈ 2.78% of the keyboard width. Black keys are absolutely positioned within octave groups at fixed percentage offsets (9%, 23%, 50%, 65%, 79%).
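Those offsets can be turned into absolute positions with a small helper (illustrative, assuming the octave groups simply tile left to right):

```typescript
// An octave group spans 7 white keys; each white key is 100/36 % of the board.
const OCTAVE_WIDTH = 7 * (100 / 36);

// Per-octave left offsets of the five black keys, as a % of the octave width.
const BLACK_KEY_OFFSETS: Record<string, number> = {
  'C#': 9, 'D#': 23, 'F#': 50, 'G#': 65, 'A#': 79,
};

// Absolute left position (in % of total width) of a black key on the C2–C7 board.
function blackKeyLeft(name: string, octave: number): number {
  const octaveIndex = octave - 2; // octaves 2 through 6 carry the black keys
  return octaveIndex * OCTAVE_WIDTH + (BLACK_KEY_OFFSETS[name] / 100) * OCTAVE_WIDTH;
}
```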

Keyboard Shortcuts

You can actually play the piano with your computer keyboard:

const handleKeyDown = (e: KeyboardEvent) => {
  const keyCode = e.keyCode;
  // Shift (keyCode 16) toggles black-key mode; it's reset in the keyup handler
  if (keyCode === 16) { enableBlackKeyRef.current = true; return; }
  // Prefix "b" routes the same physical key to its black-key mapping
  let mappedKeyCode = String(keyCode);
  if (enableBlackKeyRef.current) mappedKeyCode = "b" + keyCode;
  playNoteByKeyCode(mappedKeyCode);
};
  • White keys: 1 through M (QWERTY row)
  • Black keys: Hold Shift + same keys
  • Example: T plays C4, Shift+T plays C#4
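The lookup table behind playNoteByKeyCode isn't shown; based on the mapping described above (keyCode 84 is T), it's presumably a plain dictionary along these lines, with the non-T entries here being hypothetical:

```typescript
// Hypothetical fragment of the keyCode → note lookup. Plain keyCodes map to
// white keys; the "b" prefix (set while Shift is held) maps the same physical
// key to the black key a semitone up.
const KEY_TO_NOTE: Record<string, string> = {
  '84': 'C4',   // T
  'b84': 'C#4', // Shift+T
  '89': 'D4',   // Y (illustrative; the actual layout may differ)
  'b89': 'D#4', // Shift+Y
};

function noteForKey(keyCode: number, shiftHeld: boolean): string | undefined {
  return KEY_TO_NOTE[(shiftHeld ? 'b' : '') + keyCode];
}
```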

Active Note Highlighting

During playback, the activePianoNotes array drives the visual state:

const update = () => {
  const current = Tone.Transport.seconds;
  const active: string[] = [];
  pianoNoteEndTimesRef.current.forEach((endTime, note) => {
    if (current < endTime) active.push(note);
    else pianoNoteEndTimesRef.current.delete(note);
  });
  setActivePianoNotes(active);
};

Active keys turn yellow (#facc15), giving you immediate visual feedback of what's being played.

Sheet Music Synchronization: Following the Cursor

The most visually satisfying feature is the sheet music following playback. OSMD provides a cursor API, but we enhance it with note highlighting:

const advanceCursor = () => {
  const cursor = osmdRef.current.cursor;
  cursor.next();
  const notes = cursor.GNotesUnderCursor();

  // Auto-flip pages
  cursor.updateCurrentPage();
  const pageNum = cursor.currentPageNumber;
  if (pageNum && pageNum !== currentSheetPage) {
    showSheetPage(pageNum);
  }

  // Clear previous highlights
  clearNoteHighlights();

  // Highlight current notes in blue
  for (const note of notes) {
    const svgGroup = note.getSVGGElement?.();
    if (svgGroup) {
      svgGroup.querySelectorAll('path, ellipse, rect').forEach((el) => {
        // Save original styles
        el.dataset.osmdOrigStyleFill = el.style.fill || '';
        el.dataset.osmdOrigStyleStroke = el.style.stroke || '';
        // Apply highlight
        el.style.fill = '#3b82f6';
        el.style.stroke = '#2563eb';
      });
      highlightedNotesRef.current.push(note);
    }
  }
};

The highlight system is careful to preserve and restore original SVG styles. When playback moves to the next note group, clearNoteHighlights() walks through all previously highlighted elements and restores their original fill and stroke attributes. This prevents permanent color changes if playback is stopped mid-piece.
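The restore half can be sketched as follows (plain structural types stand in for the SVG element types so the save/restore contract is explicit):

```typescript
type HighlightableEl = {
  style: { fill: string; stroke: string };
  dataset: Record<string, string | undefined>;
};

// Restore every previously highlighted element to its saved fill/stroke,
// then drop the saved values so a second clear is a no-op.
function clearNoteHighlights(els: HighlightableEl[]): void {
  for (const el of els) {
    el.style.fill = el.dataset.osmdOrigStyleFill ?? '';
    el.style.stroke = el.dataset.osmdOrigStyleStroke ?? '';
    delete el.dataset.osmdOrigStyleFill;
    delete el.dataset.osmdOrigStyleStroke;
  }
  els.length = 0; // the highlight list is reused across cursor steps
}
```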

Instrument Switching on the Fly

Changing instruments mid-playback is handled gracefully. When the user selects a new instrument from the dropdown:

useEffect(() => {
  const oldSynth = synthRef.current;
  const config = getInstrumentConfig(instrument);
  const newSynth = config.createSynth(Tone);

  const swapWhenReady = () => {
    if (newSynth.loaded) synthRef.current = newSynth;
    else requestAnimationFrame(swapWhenReady);
  };
  swapWhenReady();

  // Delay disposing old synth so active notes can finish
  setTimeout(() => {
    try { oldSynth.dispose(); } catch {}
  }, 3000);
}, [instrument]);

The new sampler loads asynchronously. We swap the reference only when loaded is true, preventing "buffer not loaded" errors. The old synth hangs around for 3 seconds to let any releasing notes finish their decay envelope.

Edge Cases and Robustness

Real-world files are messy. The player handles several common issues:

Chord Detection

MIDI files often have slightly offset note-on times for chords due to quantization or performance timing. We group notes within a 20ms tolerance:

function groupNotesToChords(noteEvents: NoteEvent[]): ChordEvent[] {
  const timeTolerance = 0.02; // 20ms
  const sorted = [...noteEvents].sort((a, b) => a.time - b.time);
  const chords: ChordEvent[] = [];
  for (const note of sorted) {
    const existing = chords.find(
      (c) => Math.abs(c.time - note.time) < timeTolerance &&
            Math.abs(c.duration - note.duration) < timeTolerance
    );
    if (existing) existing.notes.push(note.note);
    else chords.push({ time: note.time, duration: note.duration, notes: [note.note], velocity: note.velocity });
  }
  return chords;
}

Multi-Part MusicXML

The parser walks through all <part> elements, so multi-instrument scores don't lose notes. Everything gets flattened into a single playback timeline.

Empty or Corrupted Files

Both MIDI and MusicXML parsers have validation. If @tonejs/midi throws or DOMParser returns a parser error, the user sees a clear message instead of a silent failure.
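For MusicXML, the failure mode is subtle: DOMParser never throws on malformed XML; instead it returns a document containing a &lt;parsererror&gt; element. The check reduces to a query, sketched here against a minimal structural type so it's framework-free:

```typescript
// DOMParser reports XML failure by embedding a <parsererror> element
// rather than throwing, so validation is a query after parsing.
function xmlParseFailed(doc: { querySelector(sel: string): unknown }): boolean {
  return doc.querySelector('parsererror') != null;
}
```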

Why This Stack?

The architecture is intentionally pragmatic:

  • @tonejs/midi handles MIDI binary parsing — no need to reimplement SMF format
  • opensheetmusicdisplay provides professional notation rendering — no need to draw beams and stems by hand
  • Tone.js + SoundFont delivers realistic audio — no Web Audio oscillator noodling
  • Pure CSS piano keeps the bundle light — no Canvas or WebGL needed for the keyboard

Everything is loaded on demand via dynamic imports. The core page bundle is small; Tone.js, OSMD, and instrument samples only load when the user actually uploads a file.

Try It Yourself

Got a MIDI file from an old game soundtrack? A MusicXML export from Sibelius? Or just want to see what your composition looks like on a staff while hearing it played back with real piano samples?

Upload it to our free online MIDI player. Watch the notes scroll across the sheet music, light up on the virtual keyboard, and hear them played by a real sampled grand piano. Switch to guitar, flute, or music box if you're feeling whimsical. It's all happening right in your browser.
