
Stefanos Kouroupis

A simple audio sequencer using the Web Audio API & Angular

I decided to move my posts from Medium to here, for various personal reasons. As I move them, I'm also going to delete them from there; there's no reason to keep both. I am starting with the least popular one. :D

Like most developers, I spend my free time developing… or gaming (ahem), or, since I became a parent, more seeking out opportunities for sleep and less developing.

Nevertheless, this time I decided to create a simple, minimal audio sequencer.

So let’s pretend that I actually gave a bit of thought to this project and wrote down my requirements before I started developing.

  • Visualise a sequence of n elements that represent different sounds
  • The elements can be turned on and off (muted/change color)
  • A start button that will execute the sequence
  • Change color as each element is executed
  • When done reset the state and colors of each element

Why Angular? Why not. It’s just a personal preference.

My app has one component and one service.

> sequencer.component.ts
> sound.service.ts

My models are the following:

interface Block {
    color: string; // hex color
    state: boolean; // true = sound, false = no sound
    note: Note;
}

interface Note {
    name: string; // name of the note
    frequency: number; // frequency (hertz) of the note
    position: number; // I don't use it, but it's useful to know
                      // the octave this frequency corresponds to
}
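For instance, a block holding middle C would be built like this (a quick sketch that restates the interfaces so it is self-contained, with `state` typed as `boolean`):

```typescript
// a hypothetical instance of the two models, holding middle C
interface Note {
  name: string;
  frequency: number; // hertz
  position: number;  // octave
}

interface Block {
  color: string;  // named or hex color
  state: boolean; // true = sound, false = no sound
  note: Note;
}

const middleC: Note = { name: 'C', position: 4, frequency: 261.63 };
const firstBlock: Block = { color: 'limegreen', state: true, note: middleC };
```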

…and now that the hard work is done, we can move on to the easy part: implementing the idea.

First, our service.

export class SoundService {
  // initially I had something like 6 octaves, but it was pretty pointless,
  // so I trimmed it down to 2 (octaves 4 and 5)
  public notes: Note[] = [{
    name: 'C',
    position: 4,
    frequency: 261.63
  }, {
    name: 'C#',
    position: 4,
    frequency: 277.18
  }, {
    name: 'D',
    position: 4,
    frequency: 293.66
  }]; // I am not going to list all the notes

  private audioCtx = new (window['AudioContext'] || window['webkitAudioContext'])();
  private gainNode = this.audioCtx.createGain();

  public play(freq: number, time: number, delay: number) {
    const oscillator = this.audioCtx.createOscillator();
    oscillator.type = 'sine';
    oscillator.frequency.value = freq;
    oscillator.connect(this.gainNode);
    this.gainNode.connect(this.audioCtx.destination);
    oscillator.start(this.audioCtx.currentTime + delay);
    oscillator.stop(this.audioCtx.currentTime + delay + time);
  }
}

Nothing fancy; we define the following variables.

  • a note 🎶 object with the corresponding frequencies
  • the audio context 🎹 which processes the signals
  • the gainNode which controls the volume 🔊

And finally our play function. I am using it in a bit of an odd way: instead of sending a note at the moment I want it to be executed, I send the entire sequence up front, each note with its own delay. The downside of doing it this way is that the sequence cannot be stopped (unless, of course, you kept references to the oscillator objects).
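If you did want a stoppable sequence, keeping those references could look like the sketch below. This is not the article's code; `OscillatorLike` is a hypothetical minimal interface that mirrors the `start`/`stop` methods of a real `OscillatorNode`, so the logic can run (and be tested) outside a browser.

```typescript
// minimal shape of what we need from an oscillator
interface OscillatorLike {
  start(when: number): void;
  stop(when?: number): void;
}

class StoppableSequence {
  private scheduled: OscillatorLike[] = [];

  // schedule a note and keep a reference to its oscillator
  schedule(osc: OscillatorLike, startAt: number, duration: number): void {
    osc.start(startAt);
    osc.stop(startAt + duration); // pre-arm the scheduled stop
    this.scheduled.push(osc);
  }

  // cancel everything still pending; in Web Audio, calling stop again
  // with an earlier time (0 = "now") replaces the scheduled stop
  stopAll(): void {
    this.scheduled.forEach(osc => osc.stop(0));
    this.scheduled = [];
  }
}
```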

Our template (sequencer.component.html) is really simple…to the point of being silly 🔥

<div class="pad">
  <div *ngFor="let block of blocks; let i = index" class="block">
    <div class="single-block"
         [ngStyle]="{'background-color': block.color}"
         (click)="changeState(i)">
    </div>
  </div>
</div>
<button class="btn" (click)="play()">start</button>

And now to our main bit, the sequencer.component.ts!

We define some variables

@Component({
  selector: 'sequencer-pad',
  templateUrl: './sequencer.component.html',
  styleUrls: ['./sequencer.component.css']
})
export class PadComponent implements OnInit {
  public blocks: Block[] = [];
  private blockSize = 13; // sequencer will use 13 notes
  private noteLength = 1; // duration of the note (1 second)

  constructor(private soundService: SoundService) { }
  • blocks are our building blocks 😃
  • blockSize is how many of the notes we defined in the service we wish to use in the sequence. Bear in mind I am doing it this way to sound less boring. If I wanted to create a more realistic sequencer, I would probably have a collection of Block arrays, each array having a unique sound.
  • noteLength, this is basically the duration of the sound 🎵 produced. 1 second should be fine in this case.
  • On ngOnInit() I create my block array
ngOnInit() {
  // add default values to the blocks array
  for (let index = 0; index < this.blockSize; index++) {
    this.blocks.push({
      color: 'limegreen',
      state: true,
      note: this.soundService.notes[index]
    });
  }
}
  • when you click a note, you need to change its color and state
/**
 * change the color of the div and switch its state (on/off)
 * @param index
 */
public changeState(index: number) {
  this.blocks[index] = (this.blocks[index].color === 'limegreen') ? {
    color: 'tomato',
    state: false,
    note: this.blocks[index].note
  } : {
    color: 'limegreen',
    state: true,
    note: this.blocks[index].note
  };
}
  • when the sequence finishes we need to reset the colors and state
/**
 * when the sequence ends, this resets the colors back to limegreen
 */
private resetColor() {
  this.blocks.forEach(element => {
    element.color = 'limegreen';
    element.state = true;
  });
}
  • play the sequence and color it appropriately
/**
 * play the notes that have a true state
 */
public play() {
  this.blocks.forEach((element, index) => {
    if (element.state) {
      const note = this.soundService.notes[index];
      this.soundService.play(note.frequency, this.noteLength, index * this.noteLength);
    }
    // this is to emulate the progress
    setTimeout(() => {
      element.color = 'lightpink';
      if (index + 1 === this.blocks.length) {
        setTimeout(() => {
          this.resetColor();
        }, this.noteLength * 1000);
      }
    }, this.noteLength * 1000 * index);
  });
}

So, like I said before, I am iterating over the block array and sending each note to the soundService with the appropriate delay. Because of the way I designed it, I get no feedback from the soundService when a tone starts or ends, so I had to use setTimeout (twice) to emulate the progress of the sequence in the UI.
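The timing math those two setTimeouts emulate can be written down as two small helpers (a sketch, not part of the article's component): block i is highlighted i * noteLength seconds in, and the reset fires one noteLength after the last block.

```typescript
// when each block gets highlighted, in milliseconds from the start
const highlightTimesMs = (blockCount: number, noteLengthSec: number): number[] =>
  Array.from({ length: blockCount }, (_, i) => i * noteLengthSec * 1000);

// when the reset fires: one noteLength after the last highlight
const resetTimeMs = (blockCount: number, noteLengthSec: number): number =>
  blockCount * noteLengthSec * 1000;
```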

Bonus: the styles I used

.pad {
  margin-top: 20px;
  display: flex;
  flex-direction: row;
}

.single-block {
  flex: auto;
  min-height: 40px;
  min-width: 40px;
  display: inline-block;
  border: none;
  margin-right: 5px;
  cursor: pointer;
  border-radius: 5px;
  text-align: center;
  border-color: green;
  border-style: solid;
}

.btn {
  margin-top: 20px;
  color: black;
  background: #ffffff;
  text-transform: uppercase;
  padding: 20px;
  border: 5px solid black;
  border-radius: 6px;
  display: inline-block;
}

.btn:hover {
  color: #ffffff;
  background: green;
  transition: all 0.4s ease 0s;
}

Top comments (8)

JomoPipi

Nice article :D
Instead of hard-coding each note's frequency, it might be less work to start with some frequency and just keep multiplying it by the 12th root of two to get the next one.

const TR2 = 2 ** (1.0 / 12.0)
const frequencies = [...Array(12 * nOctaves + 1)].reduce(notes => 
    (notes.push(notes[notes.length-1] * TR2), notes),[110])
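For what it's worth, that equal-temperament trick reproduces the article's hard-coded table (a quick sketch): starting from A2 at 110 Hz, C4 sits 15 semitones up, C#4 16, and D4 17.

```typescript
// each semitone multiplies the frequency by the 12th root of two
const TR2 = 2 ** (1 / 12);

function semitonesAbove(base: number, n: number): number {
  return base * TR2 ** n;
}

const c4 = semitonesAbove(110, 15);      // ≈ 261.63
const cSharp4 = semitonesAbove(110, 16); // ≈ 277.18
const d4 = semitonesAbove(110, 17);      // ≈ 293.66
```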
panfluterman

Please, do you have a demo online?

Stefanos Kouroupis

No, but I could push the current branch into GitHub pages at some point. I'll let you know

EDGE Neural Networks

Can you please post the whole project to git or something, because it's quite difficult to follow.

Stefanos Kouroupis

Mind you, it's not exactly the same, but nearly 90% similar.

As a bonus, I updated it to Angular 8 (from 6).

EDGE Neural Networks

I've never seen this line before:

new (window['AudioContext'] || window['webkitAudioContext'])();

Can you please expand more on how to access HTML5 APIs via Angular? We've been trying to use the filesystem API via NodeJS and it's quite painful, especially through Electron. Thanks.

Stefanos Kouroupis

That line is nothing more than basic backwards compatibility with the older implementation of the AudioContext. The Web Audio standard was first implemented in WebKit with a vendor prefix (hence webkitAudioContext); newer implementations follow the specification to the letter.

Basically, whichever of those two evaluates to a truthy value gets used.
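The same short-circuit pattern can be seen with plain values (a trivial, hypothetical helper, not from the thread): `a || b` yields the first truthy operand, so the un-prefixed constructor is preferred whenever the browser provides it.

```typescript
// returns the first truthy argument, mirroring
// window['AudioContext'] || window['webkitAudioContext']
function pickImplementation<T>(standard: T | undefined, prefixed: T | undefined): T | undefined {
  return standard || prefixed;
}
```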

I've never done any Electron work, but as far as I know it's a combination of NodeJS and the Chromium engine.

How you can access the window object through Electron is beyond me, though. A quick search came back with something along these lines (using webContents.executeJavaScript, which returns a promise):

let myWindow = new BrowserWindow(params);
myWindow.webContents.executeJavaScript('...')
    .then(result => console.log(result));