DEV Community


How to create a progressive audio player with React hooks

Nico Martin ・ 3 min read

A React hooks audio player

I'm a huge fan of the web as an open platform for distributing software. That's why I'm always looking for new ideas to experiment with upcoming browser APIs. Some time ago I stumbled upon a Twitter thread in which Aleksej and Jonny were talking about a web app that would let you listen to the audio stream of a YouTube video in the background.

Long story short, I built it:

https://ytaud.io

GitHub: nico-martin/yt-audio, a Progressive Web App that allows you to listen to YouTube videos in the background

The main idea was to create a useful implementation of the Share Target API. But that was just the beginning. The most interesting part was definitely the audio player. My first prototype used a plain HTML audio element, but soon there were quite a few requests for a more extensive audio player.

useAudio

I wrote the whole app with React (using Preact under the hood), and since I'm a big fan of React hooks, I thought it would be a good idea to extract the player into a custom useAudio hook.
I quickly found great inspiration on GitHub, where Vadim Dalecky has published a huge library of React hooks. I really like his implementation, but some features were missing and I thought I could simplify a few things.

One of the most important things is the separation between state (the current state of the player) and controls (the functions used to interact with the player).

So in the end I had a useAudio-hook that looks like this:

// useAudio.jsx
import React, { useEffect, useRef, useState } from 'react';

const parseTimeRange = ranges =>
  ranges.length < 1
    ? {
        start: 0,
        end: 0,
      }
    : {
        start: ranges.start(0),
        end: ranges.end(0),
      };

export default ({
  src,
  autoPlay = false,
  startPlaybackRate = 1
}) => {
  const [state, setOrgState] = useState({
    buffered: {
      start: 0,
      end: 0,
    },
    time: 0,
    duration: 0,
    paused: true,
    waiting: false,
    playbackRate: 1,
    endedCallback: null,
  });
  // use the functional form of the state setter so partial updates
  // never overwrite each other with stale state from a previous render
  const setState = partState => setOrgState(prev => ({ ...prev, ...partState }));
  const ref = useRef(null);

  const element = React.createElement(
    'audio',
    {
      src,
      controls: false,
      ref,
      onPlay: () => setState({ paused: false }),
      onPause: () => setState({ paused: true }),
      onWaiting: () => setState({ waiting: true }),
      onPlaying: () => setState({ waiting: false }),
      onEnded: state.endedCallback,
      onDurationChange: () => {
        const el = ref.current;
        if (!el) {
          return;
        }
        const { duration, buffered } = el;
        setState({
          duration,
          buffered: parseTimeRange(buffered),
        });
      },
      onTimeUpdate: () => {
        const el = ref.current;
        if (!el) {
          return;
        }
        setState({ time: el.currentTime });
      },
      onProgress: () => {
        const el = ref.current;
        if (!el) {
          return;
        }
        setState({ buffered: parseTimeRange(el.buffered) });
      },
    }
  );

  // keep the play-lock in a ref so it survives re-renders;
  // it prevents a pause() while a play() promise is still pending
  const lockPlay = useRef(false);

  const controls = {
    play: () => {
      const el = ref.current;
      if (!el || lockPlay.current) {
        return undefined;
      }

      const promise = el.play();
      if (promise instanceof Promise) {
        // lock until the play() promise settles to avoid
        // "play() request was interrupted" errors
        lockPlay.current = true;
        const resetLock = () => {
          lockPlay.current = false;
        };
        promise.then(resetLock, resetLock);
      }
      return promise;
    },
    pause: () => {
      const el = ref.current;
      if (el && !lockPlay.current) {
        return el.pause();
      }
    },
    seek: time => {
      const el = ref.current;
      if (!el || state.duration === undefined) {
        return;
      }
      time = Math.min(state.duration, Math.max(0, time));
      el.currentTime = time || 0;
    },
    setPlaybackRate: rate => {
      const el = ref.current;
      if (!el || state.duration === undefined) {
        return;
      }

      setState({
        playbackRate: rate,
      });
      el.playbackRate = rate;
    },
    setEndedCallback: callback => {
      setState({ endedCallback: callback });
    },
  };

  useEffect(() => {
    const el = ref.current;
    if (!el) {
      return;
    }
    setState({
      paused: el.paused,
    });

    controls.setPlaybackRate(startPlaybackRate);

    if (autoPlay && el.paused) {
      controls.play();
    }
  }, [src]);

  return { element, state, controls };
};

YTAudio itself is written in TypeScript. If you are using TypeScript, you might prefer the typed version of the hook I'm using there.

In the end we still need to create an HTML audio element and "mount" it to the DOM. But the state/controls abstraction makes it really easy to interact with it:

// player.jsx
import React from 'react';
import useAudio from './useAudio';
const Player = () => {
  const { element, state, controls } = useAudio({
    src:
      'https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_2MG.mp3',
  });

  return (
    <div>
      {element}
      <button onClick={() => controls.seek(state.time - 10)}>-10 sec</button>
      <button
        onClick={() => {
          state.paused ? controls.play() : controls.pause();
        }}
      >
        {state.paused ? 'play' : 'pause'}
      </button>
      <button onClick={() => controls.seek(state.time + 10)}>+10 sec</button>
      <br />
      {Math.round(state.time)} / {Math.round(state.duration)}
      <br />
      Playback Speed (100 = 1)
      <br />
      <input
        onChange={e => controls.setPlaybackRate(e.target.value / 100)}
        type="number"
        value={state.playbackRate * 100}
      />
    </div>
  );
};
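The Player above renders raw seconds. If you want something more readable, a small helper can format them as minutes and seconds. This is just a sketch; formatTime is a hypothetical helper of mine, not part of the hook:

```javascript
// hypothetical helper: format a seconds value as m:ss for display
const formatTime = seconds => {
  const s = Math.max(0, Math.round(seconds || 0)); // guard against NaN/negative
  const m = Math.floor(s / 60);
  return `${m}:${String(s % 60).padStart(2, '0')}`;
};

// in the Player you could then render:
// {formatTime(state.time)} / {formatTime(state.duration)}
```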

And where does the "progressive" come from?

Well, to be honest, I first wanted to write a single article about the whole project. But then I decided to move the "progressive" parts into their own posts. So keep an eye on my "YTAudio" series here on dev.to.

The full example of my custom audio player is available on GitHub: https://github.com/nico-martin/yt-audio/tree/master/src/app/Player

Discussion (10)

Aderman Jr.

Hi, great article!
Can I run this project with mp3 files?

Nico Martin (Author)

Hi Aderman

Sure. Behind it there is a native HTML audio element, so it works with every format that is supported by your browser.
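Since format support varies by browser, you could feature-detect it with the audio element's canPlayType method. A minimal sketch, where audioMime and its extension list are my own assumptions:

```javascript
// hypothetical mapping from a file extension to the MIME type
// string that canPlayType() expects
const audioMime = ext =>
  ({
    mp3: 'audio/mpeg',
    ogg: 'audio/ogg',
    wav: 'audio/wav',
    m4a: 'audio/mp4',
  }[ext.toLowerCase()] || '');

// in the browser, an empty string means "definitely not supported":
// const supported =
//   document.createElement('audio').canPlayType(audioMime('mp3')) !== '';
```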

Aderman Jr.

Good!

And how can I pass a JSON list of mp3 files and implement previous and next actions? And automatically change to the next song when one finishes?

Nico Martin (Author)

You could either load the list via an HTTP request or import it at build time (I guess you're using webpack?).
You would then need to write your own logic for what should happen when a song finishes, but you could use the setEndedCallback control to change the audio file.
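The "what happens when a song finishes" logic can be kept as a tiny pure function. A sketch under the assumption that tracks is an array of source URLs; nextIndex and the wiring below are hypothetical, not part of useAudio:

```javascript
// hypothetical helper: index of the next track, or null when the
// playlist is finished (set loop = true to start over)
const nextIndex = (current, length, loop = false) => {
  if (current + 1 < length) return current + 1;
  return loop ? 0 : null;
};

// wired to the hook, roughly:
// const [index, setIndex] = useState(0);
// const { element, controls } = useAudio({ src: tracks[index], autoPlay: true });
// useEffect(() => {
//   controls.setEndedCallback(() => {
//     const next = nextIndex(index, tracks.length);
//     if (next !== null) setIndex(next);
//   });
// }, [index]);
```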

Aderman Jr.

Ok. Thank you, man!

Noel Earvin Piamonte

Hi Nico, I would just like to ask how do you get the audio source of the YouTube videos? Thanks.

Nico Martin (Author)

Hi Noel,
I've created a little Node backend that uses npmjs.com/package/ytdl-core to extract the audio source.
My "backend" is open source as well: github.com/nico-martin/yt-audio-so...

Noel Earvin Piamonte

All right. Thank you. I'll definitely check these.
Media Session API is so cool as well. Great posts!

Nico Martin (Author)

Thanks! Great to hear!

Jipeng Li

Will the hook create duplicate Audio elements while the player component rerenders?