Sanjay 🥷

Visualizing Audio as a Waveform in React

Audio visualization is a great way to add a new dimension to music, podcasts, or any other audio content. A common way to visualize audio is to display its waveform, which shows the amplitude of the sound wave at each point in time. In this tutorial, we'll learn how to generate a waveform for an audio file in React without using any libraries.


Prerequisites

Before we get started, make sure you have the following installed on your machine:

  • Node.js and npm (or yarn)

Setting up the Project

Let's start by creating a new React project. Open your terminal and run the following commands:

npx create-react-app audio-visualizer
cd audio-visualizer
npm install
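Once the scaffold is ready, you can start the dev server at any point to see your changes:

npm start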

Analyzing Audio Data

To create a waveform, we need to analyze the audio data. We can do this using the Web Audio API, which provides an AnalyserNode object that can be used to analyze the frequency and time-domain data of an audio source.
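Before wiring this into React, here is the gist of that API as a standalone sketch. It assumes an existing audio element on the page, referenced here as audioElement:

// standalone sketch of the Web Audio API wiring (no React yet);
// `audioElement` is assumed to be an existing <audio> element on the page
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048; // must be a power of two; frequencyBinCount is half of it

const source = audioCtx.createMediaElementSource(audioElement);
source.connect(analyser);               // feed samples to the analyser
analyser.connect(audioCtx.destination); // keep the audio audible

// one snapshot of the current frequency data, one 0-255 value per bin
const dataArray = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(dataArray);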

In our App.js file, we create an audioAnalyzer function that creates an AnalyserNode and connects our audio source to it. We also create a dataArray to hold the frequency data that we'll use to draw the waveform.

// App.js

import { useRef, useState } from "react";
import "./styles.css";
import WaveForm from "./WaveForm";

export default function App() {
  const [audioUrl, setAudioUrl] = useState();
  const [analyzerData, setAnalyzerData] = useState(null);
  const audioElmRef = useRef(null);
  // createMediaElementSource may only be called once per media element,
  // so keep the source node in a ref and reuse it on later file picks
  const sourceRef = useRef(null);

  // audioAnalyzer function analyzes the audio and sets the analyzerData state
  const audioAnalyzer = () => {
    if (sourceRef.current) return; // audio graph is already wired up
    // create a new AudioContext
    const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
    // create an analyzer node with an FFT size of 2048
    const analyzer = audioCtx.createAnalyser();
    analyzer.fftSize = 2048;

    const bufferLength = analyzer.frequencyBinCount;
    const dataArray = new Uint8Array(bufferLength);
    const source = audioCtx.createMediaElementSource(audioElmRef.current);
    // route the source both to the analyzer and to the speakers
    source.connect(analyzer);
    source.connect(audioCtx.destination);
    sourceRef.current = source;

    // set the analyzerData state with the analyzer, bufferLength, and dataArray
    setAnalyzerData({ analyzer, bufferLength, dataArray });
  };

  // onFileChange function handles the file input and triggers the audio analysis
  const onFileChange = (e) => {
    const file = e.target.files?.[0];
    if (!file) return;
    setAudioUrl(URL.createObjectURL(file));
    audioAnalyzer();
  };

  return (
    <div className="App">
      <h1>Audio Visualizer</h1>
      {analyzerData && <WaveForm analyzerData={analyzerData} />}
      <div
        style={{
          height: 80,
          display: "flex",
          justifyContent: "space-around",
          alignItems: "center"
        }}
      >
        <input type="file" accept="audio/*" onChange={onFileChange} />
        <audio src={audioUrl ?? ""} controls ref={audioElmRef} />
      </div>
    </div>
  );
}
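One caveat: browsers can start an AudioContext in a suspended state under their autoplay policies (Safari is especially strict about this), in which case the canvas stays blank without any errors. If you run into that, keep the context in a ref and resume it from a user gesture. A minimal sketch, where audioCtxRef is a hypothetical ref holding the context created in audioAnalyzer:

// sketch: resume a suspended AudioContext from a user gesture,
// e.g. <audio onPlay={resumeIfSuspended} ...>;
// `audioCtxRef` is assumed to hold the context created in audioAnalyzer
const resumeIfSuspended = () => {
  const audioCtx = audioCtxRef.current;
  if (audioCtx && audioCtx.state === "suspended") {
    audioCtx.resume(); // returns a Promise; fire-and-forget is fine here
  }
};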

Drawing the Waveform

Next, we will create a WaveForm component that takes the analyzerData as a prop and renders a canvas element. We use the useSize hook to set the size of the canvas based on the window dimensions.

// useSize.js

import { useCallback, useEffect, useState } from 'react';

// custom hook to get the width and height of the browser window
const useSize = () => {
  // initialize width and height to 0
  const [width, setWidth] = useState(0);
  const [height, setHeight] = useState(0);

  // setSizes callback function to update width and height with current window dimensions
  const setSizes = useCallback(() => {
    setWidth(window.innerWidth);
    setHeight(window.innerHeight);
  }, [setWidth, setHeight]);

  // add event listener for window resize and call setSizes
  useEffect(() => {
    window.addEventListener('resize', setSizes);
    setSizes();
    return () => window.removeEventListener('resize', setSizes);
  }, [setSizes]);

  // return width and height
  return [width, height];
};

export default useSize;
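Nothing WaveForm-specific here; any component can subscribe to the window size the same way:

// example usage of useSize in any component
import useSize from "./useSize";

const Example = () => {
  const [width, height] = useSize();
  return <p>{width} x {height}</p>;
};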

In the WaveForm component, we create an animateBars function that uses the AnalyserNode to get the frequency data and draws the bars of the waveform. We then create a draw function that sets up the canvas context, calls animateBars on each animation frame, and returns a cleanup callback so the effect can cancel the loop on unmount.


// This function takes the analyser output and draws one frame of the
// bar visualization onto the canvas element.
function animateBars(analyser, canvas, canvasCtx, dataArray, bufferLength) {
  // Copy the current frequency data (one 0-255 value per bin) into dataArray.
  analyser.getByteFrequencyData(dataArray);

  // Use half the canvas height as the maximum bar height.
  const HEIGHT = canvas.height / 2;

  // Width of each bar, based on the canvas width and the number of bins.
  const barWidth = Math.ceil(canvas.width / bufferLength) * 2.5;

  // Variables for the bar height and x-position.
  let barHeight;
  let x = 0;

  // Loop through each frequency bin in `dataArray`.
  for (let i = 0; i < bufferLength; i++) {
    // Scale the bin value (0-255) to the available height.
    barHeight = (dataArray[i] / 255) * HEIGHT;

    // Jitter each color channel of the base color (242, 104, 65) by up to ±10.
    const maximum = 10;
    const minimum = -10;
    const r = 242 + Math.floor(Math.random() * (maximum - minimum + 1)) + minimum;
    const g = 104 + Math.floor(Math.random() * (maximum - minimum + 1)) + minimum;
    const b = 65 + Math.floor(Math.random() * (maximum - minimum + 1)) + minimum;

    // Set the canvas fill style to the jittered color.
    canvasCtx.fillStyle = `rgb(${r},${g},${b})`;

    // Draw the bar at the current x-position with the calculated height and width.
    canvasCtx.fillRect(x, HEIGHT - barHeight, barWidth, barHeight);

    // Advance x for the next bar, leaving a 1px gap.
    x += barWidth + 1;
  }
}


The animateBars function is responsible for drawing the bars from the audio data passed in. It uses the Web Audio API's getByteFrequencyData method to grab the current frequency data, then loops through each element in the dataArray to draw a bar for each frequency bin. The height of each bar is scaled from the bin's value and the canvas height, and the width of each bar is derived from the canvas width and the buffer length. For a bit of shimmer, the function also jitters the RGB values of each bar around a base color before setting the canvas fill style and drawing the bar.
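Strictly speaking, these bars visualize frequency bins rather than a time-domain waveform. If you'd rather draw a classic oscilloscope-style trace, the same AnalyserNode can hand back raw time-domain samples via getByteTimeDomainData. A drop-in alternative to animateBars might look like this (same parameters; a sketch, not part of the original project):

// alternative to animateBars: draws a time-domain (oscilloscope-style) trace
function animateWave(analyser, canvas, canvasCtx, dataArray, bufferLength) {
  // fill dataArray with time-domain samples; 128 is the silence midpoint
  analyser.getByteTimeDomainData(dataArray);

  canvasCtx.lineWidth = 2;
  canvasCtx.strokeStyle = "rgb(242, 104, 65)";
  canvasCtx.beginPath();

  const sliceWidth = canvas.width / bufferLength;
  let x = 0;
  for (let i = 0; i < bufferLength; i++) {
    // map the byte value (0-255) to a vertical position on the canvas
    const v = dataArray[i] / 128.0;
    const y = (v * canvas.height) / 2;
    if (i === 0) canvasCtx.moveTo(x, y);
    else canvasCtx.lineTo(x, y);
    x += sliceWidth;
  }
  canvasCtx.stroke();
}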


// WaveForm.jsx

import { useRef, useEffect } from "react";
import useSize from "./useSize";

// Function to animate the bars (full implementation shown above)
function animateBars(analyser, canvas, canvasCtx, dataArray, bufferLength) {
  // ...
}

// Component to render the waveform
const WaveForm = ({ analyzerData }) => {
  // Ref for the canvas element
  const canvasRef = useRef(null);
  const { dataArray, analyzer, bufferLength } = analyzerData;
  // size the canvas to the window via the useSize hook
  const [width, height] = useSize();

  // Function to start the animation loop; returns a cleanup callback
  const draw = (dataArray, analyzer, bufferLength) => {
    const canvas = canvasRef.current;
    if (!canvas || !analyzer) return;
    const canvasCtx = canvas.getContext("2d");
    let rafId;

    const animate = () => {
      rafId = requestAnimationFrame(animate);
      // clear the previous frame before drawing the next one
      canvasCtx.clearRect(0, 0, canvas.width, canvas.height);
      animateBars(analyzer, canvas, canvasCtx, dataArray, bufferLength);
    };

    animate();
    // hand the cancel function back so the effect can clean up on unmount
    return () => cancelAnimationFrame(rafId);
  };

  // Effect to draw the waveform on mount and update
  useEffect(() => {
    return draw(dataArray, analyzer, bufferLength);
  }, [dataArray, analyzer, bufferLength]);

  // Return the canvas element
  return (
    <canvas
      style={{
        position: "absolute",
        top: "0",
        left: "0",
        zIndex: "-10"
      }}
      ref={canvasRef}
      width={width}
      height={height}
    />
  );
};

export default WaveForm;

Conclusion

We have explored the process of visualizing an audio waveform in a React application without the use of any external library. We began by analyzing the audio using the Web Audio API and extracting the frequency data using an AnalyserNode. We then used the Canvas API to draw the waveform on a canvas element. Finally, we implemented a function to animate the bars of the waveform, computing each bar's height from the frequency data and giving each bar a slightly randomized color. This project is a great example of the power and versatility of React and the Web Audio API.

To see the full code for this project, please visit the CodeSandbox here. Thank you for reading!

Top comments (3)

Spare

Awesome! I'm looking for a way to create static waveforms of uploaded wav files in react-native.

Mohannad Rababah

Same

Ava Hajratwala

Really awesome visualization! I'm hoping to use it for an internet radio website I'm building. However, I can't seem to get it working on Safari. There are no errors thrown -- the canvas just stays blank. Have you run into this before? Any idea how to fix?