Favour Onyeke


How to Create an Audio Visualizer Using Next.js

An audio visualizer is a graphical representation of audio frequencies and amplitudes on a display. It adds a visually appealing element to audio playback, animating and synchronizing visual elements with the audio being played. This can enhance the user experience by providing a visually immersive and interactive interface. In this article, we will develop a simple Next.js application that illustrates how to use the Web Audio API to capture audio data from a source and display visual representations on the screen.

Why is an Audio Visualizer Necessary?

An audio visualizer is an essential feature that can make a website more exciting and interactive. Imagine listening to music or audio and seeing the sound come to life with colorful visuals and animations. It takes your browsing experience to a whole new level!

Adding an audio visualizer to a site also helps you better understand and connect with the audio content. Think about it: when you can see a visual representation of the sound, it becomes much easier to grasp its meaning.

How Does an Audio Visualizer Work?

So, you’re probably aware that when you’re jamming to your favorite tunes online, your browser is doing some heavy lifting behind the scenes to make it all happen. That’s where the Web Audio API comes into play. This powerful tool allows developers to create and manipulate audio graphs, which represent the flow of audio data through various nodes. Each node has a specific task, such as reading audio input from a file or microphone, applying effects like reverb or distortion, or sending the audio to your speakers or headphones.

But the Web Audio API doesn't stop at audio processing. It also gives developers precise control over timing and synchronization, allowing them to fine-tune the timing and playback of audio for a seamless and immersive experience.
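To make that flow concrete, here is a minimal sketch of such an audio graph. The oscillator and gain values are purely illustrative; our app will use an <audio> element as the source instead.

// A minimal, illustrative audio graph: source -> effect -> speakers
// (browsers only let an AudioContext run after a user gesture, e.g. a click)
const ctx = new AudioContext();

const oscillator = ctx.createOscillator(); // a simple tone generator as the source
oscillator.frequency.value = 440;          // 440 Hz, i.e. the note A4

const gain = ctx.createGain();             // an "effect" node that controls volume
gain.gain.value = 0.5;

oscillator.connect(gain);        // source -> effect
gain.connect(ctx.destination);   // effect -> speakers/headphones
oscillator.start();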

Terms Associated with the Web Audio API

The Web Audio API offers a flexible system for managing audio on the web. It provides a variety of nodes and methods that enable developers to create, process, and analyze audio.

In this article, we'll explore three essential terms in the Web Audio API for our project. For more in-depth information, visit MDN Docs.

AudioContext: It's where you manage all the audio stuff, like creating sounds, adding effects, and controlling how they're played. Think of it as a dashboard where you connect different audio parts to make your sounds come to life on the web.

AnalyserNode: An AnalyserNode is a type of audio node that performs real-time analysis of audio data. It allows developers to extract frequency and time-domain data from audio signals.

SourceNode: It is a type of audio node in the Web Audio API that represents an audio source, such as an audio file, microphone input, or synthesized sound.

SourceNodes are created using methods provided by the AudioContext, such as createMediaElementSource().
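Putting these three pieces together, a bare-bones wiring might look like the sketch below. It assumes an <audio> element already exists on the page; in our app we will grab it through a React ref instead of querySelector.

// Sketch: connect an <audio> element to an analyser and then to the speakers
const audioContext = new AudioContext();                 // the AudioContext
const audioElement = document.querySelector("audio");    // an existing <audio> tag (assumed)

const sourceNode = audioContext.createMediaElementSource(audioElement); // the SourceNode
const analyserNode = audioContext.createAnalyser();                     // the AnalyserNode

sourceNode.connect(analyserNode);                // audio flows into the analyser...
analyserNode.connect(audioContext.destination);  // ...and on to the speakers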

Getting started

Let’s start by creating a new Next.js project by running the following command in your terminal.



npx create-next-app Audio-App



Don't forget to include Tailwind CSS as well, which will help you style your project with ease.
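Recent versions of create-next-app ask whether you want Tailwind CSS during project creation, and answering yes is the easiest route. If you skipped that prompt, a typical manual setup (assuming Tailwind CSS v3) looks roughly like this:

npm install -D tailwindcss postcss autoprefixer
npx tailwindcss init -p

Then point the content array in tailwind.config.js at your app directory and add the @tailwind base, @tailwind components, and @tailwind utilities directives to globals.css.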

In page.js (inside the app directory), rename the Home component to AudioApp. Your page.js file should look like this:



"use client";
import React, { useState, useRef } from "react";

function AudioApp() {
  return <main></main>;
}

export default AudioApp;



Now you have created a new Next.js project named Audio-App, set up Tailwind CSS, and renamed the default Home component in page.js to AudioApp.

Styling of the app

To display an audio visualizer, an audio file is required. One way to provide it is to add a form to the AudioApp component that allows users to upload their own music.

I found a helpful template from Flowbite that we can use: Flowbite File Input.



"use client";
import React from "react";

function AudioApp() {
  return (
    <main>
      <div className="h-screen font-sans text-gray-900 bg-gray-300 border-box">
        <div className="flex justify-center w-full mx-auto sm:max-w-lg">
          <div className="flex flex-col items-center justify-center w-full h-auto my-20 bg-white sm:w-3/4 sm:rounded-lg sm:shadow-xl">
            <div className="mt-10 mb-10 text-center">
              <h2 className="text-2xl font-semibold mb-2">Upload your music</h2>
              <p className="text-xs text-gray-500">File should be of mp3 format</p>
            </div>
            <form
              action="#"
              className="relative w-4/5 h-32 max-w-xs mb-10 bg-white bg-gray-100 rounded-lg shadow-inner"
            >
              <input type="file" id="file-upload" className="hidden" />
              <label
                htmlFor="file-upload"
                className="z-20 flex flex-col-reverse items-center justify-center w-full h-full cursor-pointer"
              >
                <p className="z-10 text-xs font-light text-center text-gray-500">
                  Upload your music here
                </p>
                <svg
                  className="z-10 w-8 h-8 text-indigo-400"
                  fill="currentColor"
                  viewBox="0 0 20 20"
                  xmlns="http://www.w3.org/2000/svg"
                >
                  <path d="M2 6a2 2 0 012-2h5l2 2h5a2 2 0 012 2v6a2 2 0 01-2 2H4a2 2 0 01-2-2V6z"></path>
                </svg>
              </label>
            </form>
          </div>
        </div>
      </div>
    </main>
  );
}

export default AudioApp;



This creates a styled form for uploading media. To display the audio player and the audio visualizer, we will add this styled music card from Tailwind Components below the form.



"use client";
import React, { useState, useRef } from "react";

function AudioApp() {

  return (
    <main>
      <div className="h-screen font-sans text-gray-900 bg-gray-300 border-box">
        <div className="flex justify-center w-full mx-auto sm:max-w-lg">
          <div className="flex flex-col items-center justify-center w-full h-auto my-20 bg-white sm:w-3/4 sm:rounded-lg sm:shadow-xl">
            <div className="mt-10 mb-10 text-center">
              <h2 className="text-2xl font-semibold mb-2">Upload your music</h2>
              <p className="text-xs text-gray-500">File should be of mp3 format</p>
            </div>
            <form
              action="#"
              className="relative w-4/5 h-32 max-w-xs mb-10 bg-white bg-gray-100 rounded-lg shadow-inner"
            >
              <input type="file" id="file-upload" className="hidden" />
              <label
                htmlFor="file-upload"
                className="z-20 flex flex-col-reverse items-center justify-center w-full h-full cursor-pointer"
              >
                <p className="z-10 text-xs font-light text-center text-gray-500">
                  Upload your music here
                </p>
                <svg
                  className="z-10 w-8 h-8 text-indigo-400"
                  fill="currentColor"
                  viewBox="0 0 20 20"
                  xmlns="http://www.w3.org/2000/svg"
                >
                  <path d="M2 6a2 2 0 012-2h5l2 2h5a2 2 0 012 2v6a2 2 0 01-2 2H4a2 2 0 01-2-2V6z"></path>
                </svg>
              </label>
            </form>
          </div>
        </div>
      </div>

      <main className="flex min-h-screen w-full items-center justify-center">
        <article className="group relative flex h-[12rem] w-[50rem] overflow-hidden rounded-2xl bg-[#3a4448]">
          {/* <!-- image (left side) --> */}
          <div className="absolute inset-y-0 left-0 w-48">
            <img
              src="https://images.unsplash.com/photo-1487180144351-b8472da7d491?w=600&auto=format&fit=crop&q=60&ixlib=rb-4.0.3&ixid=M3wxMjA3fDB8MHxzZWFyY2h8MTh8fG11c2ljfGVufDB8fDB8fHww"
              alt=""
              className="h-full w-full object-cover object-center opacity-95"
            />

            <div className="invisible absolute inset-0 flex h-full w-full items-center justify-center bg-[#0c0c0c]/70 opacity-0 transition-all group-hover:visible group-hover:opacity-100"></div>
          </div>

          <div className="absolute inset-y-0 left-44 w-[39rem] overflow-hidden rounded-2xl transition-all group-hover:w-[36rem]">
            <div className="h-full w-full bg-cover bg-center">
              <img
                src="https://images.unsplash.com/photo-1487180144351-b8472da7d491?w=600&auto=format&fit=crop&q=60&ixlib=rb-4.0.3&ixid=M3wxMjA3fDB8MHxzZWFyY2h8MTh8fG11c2ljfGVufDB8fDB8fHww"
                alt=""
              />
            </div>
            <div className="h-full w-full bg-[#455055]/80 transition-all group-hover:bg-[#31383b]/80">
            </div>
          </div>

          <section className="absolute inset-0 flex flex-col justify-between p-4 text-white">
            <header className="space-y-1">
              <div className="text-3xl font-medium">Audio Visualizer</div>
            </header>
            <div className="flex space-x-2">
              <div className="flex items-center space-x-1">
                <div>
                  <audio />
                  <button>
                    <svg
                      className="h-w-14 w-14 cursor-pointer text-white transition-all hover:text-yellow-400"
                      fill="currentColor"
                      viewBox="0 0 20 20"
                      xmlns="http://www.w3.org/2000/svg"
                    >
                      <path
                        fillRule="evenodd"
                        d="M10 18a8 8 0 100-16 8 8 0 000 16zM9.555 7.168A1 1 0 008 8v4a1 1 0 001.555.832l3-2a1 1 0 000-1.664l-3-2z"
                        clipRule="evenodd"
                      ></path>
                    </svg>
                  </button>
                </div>
                <canvas width={500} height={200} />

                <span className="h-5 w-2 rounded-full bg-red-500"></span>

                <span className="h-5 w-2 rounded-full bg-green-500"></span>
                <span className="h-5 w-2 rounded-full bg-yellow-500"></span>
              </div>
            </div>
          </section>
        </article>
      </main>
    </main>
  );
}

export default AudioApp;



This code renders a rectangular card with a play button (images sourced from Unsplash). Two elements in the code above are essential to the app's operation: the <audio> element and the <canvas> element.

The <audio> element is utilized to embed sound content within the HTML document. The <canvas> element is employed for drawing graphics, animations, or other visual content on a web page.

We need a condition to display the form input tag and the audio player. If the user has uploaded a file, we show the audio player. Otherwise, we show the file input form.

Initialization and State Setup for the Audio Player

We need to establish the initial conditions and manage the state required for the functionality of an audio player. This involves tasks like initializing the audio player and setting up state variables to track the audio file.



const [files, setFiles] = useState(null); // stores the uploaded audio file
const [uploaded, setUploaded] = useState(false); // tracks whether an audio file has been uploaded
const canvasRef = useRef(null); // references the canvas element
const audioRef = useRef(null); // references the audio element
const source = useRef(null); // holds the Web Audio API source node
const analyserRef = useRef(null); // holds the AnalyserNode



The useRef hooks create references to the elements and audio nodes we need to manipulate:

canvasRef: This useRef hook creates a reference to a canvas element. It allows us to access and manipulate the <canvas> element's properties and methods within the React.js component. In this case, it's used to draw the visualizations of the audio data on the canvas.

audioRef: Similarly, this useRef hook creates a reference to an audio element. It enables us to access and control the audio playback, such as playing, pausing, or seeking, within the React component. In this case, it's used to play the audio file uploaded by the user.

source: This useRef hook creates a reference to a Web Audio API source node. In Web Audio API usage, a source node represents the audio source, such as an <audio> element or other audio sources. This reference may be used to connect the audio source to an audio context and perform operations like analyzing audio data.

analyserRef: This useRef hook creates a reference to an AnalyserNode, which is part of the Web Audio API. AnalyserNode is used for analyzing the audio data in real-time, such as extracting frequency or time-domain data. By creating a reference to the AnalyserNode, we can access and manipulate its properties and methods within the React component.

To hide the audio player when no audio file is selected, we render the upload form only while uploaded is false.



{!uploaded && (
  <div className="h-screen font-sans text-gray-900 bg-gray-300 border-box">
    <div className="flex justify-center w-full mx-auto sm:max-w-lg">
      <div className="flex flex-col items-center justify-center w-full h-auto my-20 bg-white sm:w-3/4 sm:rounded-lg sm:shadow-xl">
        <div className="mt-10 mb-10 text-center">
          <h2 className="text-2xl font-semibold mb-2">Upload your music</h2>
          <p className="text-xs text-gray-500">
            File should be of mp3 format
          </p>
        </div>
        <form
          action="#"
          className="relative w-4/5 h-32 max-w-xs mb-10 bg-white bg-gray-100 rounded-lg shadow-inner"
        >
          <input
            type="file"
            id="file-upload"
            className="hidden"
            onChange={handleChange}
          />
          <label
            htmlFor="file-upload"
            className="z-20 flex flex-col-reverse items-center justify-center w-full h-full cursor-pointer"
          >
            <p className="z-10 text-xs font-light text-center text-gray-500">
              Upload your music here
            </p>
            <svg
              className="z-10 w-8 h-8 text-indigo-400"
              fill="currentColor"
              viewBox="0 0 20 20"
              xmlns="http://www.w3.org/2000/svg"
            >
              <path d="M2 6a2 2 0 012-2h5l2 2h5a2 2 0 012 2v6a2 2 0 01-2 2H4a2 2 0 01-2-2V6z"></path>
            </svg>
          </label>
        </form>
      </div>
    </div>
  </div>
)}



On the input tag, we set an onChange attribute that listens for changes to the file input and pass it a function called handleChange to handle the event.

Next, we create a handleChange function that handles the uploaded audio file.



const handleChange = (e) => {
  setFiles(e.target.files[0]);
  setUploaded(true);
};


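Because the form promises an mp3 file, you may also want a light guard so that non-audio files are ignored. Here is one possible variant of handleChange; the type check is an optional addition, not part of the original flow:

const handleChange = (e) => {
  const file = e.target.files[0];
  // Optional guard: only accept files the browser reports as audio
  if (!file || !file.type.startsWith("audio/")) return;
  setFiles(file); // store the selected file
  setUploaded(true); // hide the upload form and show the player
};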

Now that the audio file is being saved with setFiles, we can include a condition that enables the audio player to start playing when a file is present.



{/* check if the file is uploaded successfully */}
{files && (
    <main className="flex min-h-screen w-full items-center justify-center">
        <article className="group relative flex h-[12rem] w-[50rem] overflow-hidden rounded-2xl bg-[#3a4448]">
            <div className="absolute inset-y-0 left-0 w-48">
                <img
                    src="https://unsplash.it/id/1/640/425"
                    alt=""
                    className="h-full w-full object-cover object-center opacity-95"
                />

                <div className="invisible absolute inset-0 flex h-full w-full items-center justify-center bg-[#0c0c0c]/70 opacity-0 transition-all group-hover:visible group-hover:opacity-100"></div>
            </div>
            <div className="absolute inset-y-0 left-44 w-[39rem] overflow-hidden rounded-2xl transition-all group-hover:w-[36rem]">
                <div className="h-full w-full bg-cover bg-center">
                    <img src="https://unsplash.it/id/1/640/425" alt="" srcset="" />
                </div>
                <div className="h-full w-full bg-[#455055]/80 transition-all group-hover:bg-[#31383b]/80">
                    {" "}
                </div>
            </div>

            <section className="absolute inset-0 flex flex-col justify-between p-4 text-white">
                <header className="space-y-1">
                    <div className="text-3xl font-medium">Audio Visualizer</div>
                </header>
                <div className="flex space-x-2">
                    <div className="flex items-center space-x-1">

                        <div>
                            <audio ref={audioRef} src={window.URL.createObjectURL(files)} />
                            <button onClick={handleAudioPlay}>
                                <svg
                                    className="h-w-14 w-14 cursor-pointer text-white transition-all hover:text-yellow-400"
                                    fill="currentColor"
                                    viewBox="0 0 20 20"
                                    xmlns="http://www.w3.org/2000/svg"
                                >
                                    <path
                                        fillRule="evenodd"
                                        d="M10 18a8 8 0 100-16 8 8 0 000 16zM9.555 7.168A1 1 0 008 8v4a1 1 0 001.555.832l3-2a1 1 0 000-1.664l-3-2z"
                                        clipRule="evenodd"
                                    ></path>
                                </svg>
                            </button>
                        </div>
                        <canvas ref={canvasRef} width={500} height={200} />

                        <span className="h-5 w-2 rounded-full bg-red-500"></span>

                        <span className="h-5 w-2 rounded-full bg-green-500"></span>
                        <span className="h-5 w-2 rounded-full bg-yellow-500"></span>
                    </div>
                </div>
            </section>
        </article>
    </main>
)}



In the code above, we pass canvasRef to the canvas element and audioRef to the audio element. window.URL.createObjectURL generates a URL that the browser can use to reference the contents of the selected file (stored in files).
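One caveat worth knowing: every call to createObjectURL allocates a URL that stays alive until it is revoked, so generating it inline on each render can leak memory. An optional refinement (a sketch, assuming useEffect is added to the react import) is to create the URL once per file and revoke it on cleanup:

// Optional sketch: create the object URL once and revoke it when the file changes
const [audioUrl, setAudioUrl] = useState(null);

useEffect(() => {
  if (!files) return;
  const url = URL.createObjectURL(files);
  setAudioUrl(url);
  return () => URL.revokeObjectURL(url); // free the URL when files changes or the component unmounts
}, [files]);

// ...and then render <audio ref={audioRef} src={audioUrl} /> instead of calling createObjectURL inline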

We created a button and passed a handleAudioPlay function, which plays the audio.

Next, we create the handleAudioPlay function.



const handleAudioPlay = () => {
  const audioElement = audioRef.current;
  if (audioElement && audioElement.readyState >= 2) {
    // Check if the audio element is loaded and ready to play
    audioElement.play();
    if (!source.current) {
      // Build the audio graph only once: context -> source -> analyser -> speakers
      const audioContext = new AudioContext();
      source.current = audioContext.createMediaElementSource(audioElement);
      const analyser = audioContext.createAnalyser();
      analyserRef.current = analyser;
      source.current.connect(analyser);
      analyser.connect(audioContext.destination);
    }
  }
};


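One browser-specific note: autoplay policies can leave a freshly created AudioContext in a suspended state. Because handleAudioPlay runs in response to a click this is rarely an issue, but an explicit resume (an optional addition) is a cheap safeguard:

// Optional safeguard, placed right after the AudioContext is created
if (audioContext.state === "suspended") {
  audioContext.resume(); // resume() returns a promise; you can await it if needed
}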

Visualization of the Audio Data

We create a visualizeAudio function that is responsible for rendering the audio data on the <canvas> element, and then call it from handleAudioPlay.



function visualizeAudio() {
  const canvasContext = canvasRef.current.getContext("2d");

  const renderFrame = () => {
    const frequencyData = new Uint8Array(analyserRef.current.frequencyBinCount);
    analyserRef.current.getByteFrequencyData(frequencyData);

    const barWidth = 5; // bar width
    let startX = 0;

    canvasContext.clearRect(
      0,
      0,
      canvasRef.current.width,
      canvasRef.current.height
    );

    for (let i = 0; i < frequencyData.length; i++) {
      startX = i * 8;

      const gradient = canvasContext.createLinearGradient(
        0,
        0,
        canvasRef.current.width,
        canvasRef.current.height
      );
      gradient.addColorStop(0.2, "#ff0000"); // Red
      gradient.addColorStop(0.5, "#00ff00"); // Green
      gradient.addColorStop(1.0, "#0000ff"); // Blue

      canvasContext.fillStyle = gradient;
      canvasContext.fillRect(
        startX,
        canvasRef.current.height,
        barWidth,
        -frequencyData[i] // bar height
      );
    }

    requestAnimationFrame(renderFrame); // Call renderFrame recursively
  };

  renderFrame(); // Start the rendering loop
}



We first retrieve the canvas's 2D rendering context using getContext("2d"), which lets us draw shapes, text, and images onto the canvas and build dynamic visualizations from the audio data.

The core visualization logic lives in the renderFrame function. It continuously updates the visualization in real time by scheduling itself with requestAnimationFrame. Inside the loop, one bar is drawn on the canvas per frequency band, and each bar's height corresponds to that frequency's intensity, producing a visual representation of the audio's spectral content. Before each new frame is rendered, the canvas is cleared with clearRect() so that the visualization stays clean and does not overlap or smear over time.
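One knob worth knowing about: frequencyBinCount is always half of the analyser's fftSize, which defaults to 2048. The loop above therefore walks 1024 bins, even though a 500-pixel-wide canvas at 8 pixels per bar only shows roughly the first 60 of them. If you want the visible bars to cover more of the spectrum, you could lower the resolution when the analyser is created (an optional tweak, not part of the original code):

// Optional tweak inside handleAudioPlay, right after createAnalyser()
const analyser = audioContext.createAnalyser();
analyser.fftSize = 128; // frequencyBinCount becomes 64, so the 8px-spaced bars roughly fill the 500px canvas
analyser.smoothingTimeConstant = 0.8; // the default; higher values give smoother, slower-moving bars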

We then call visualizeAudio() inside handleAudioPlay, right after the audio graph is set up.



const handleAudioPlay = () => {
  const audioElement = audioRef.current;
  if (audioElement && audioElement.readyState >= 2) {
    // Check if the audio element is loaded and ready to play
    audioElement.play();
    if (!source.current) {
      // Build the audio graph only once: context -> source -> analyser -> speakers
      const audioContext = new AudioContext();
      source.current = audioContext.createMediaElementSource(audioElement);
      const analyser = audioContext.createAnalyser();
      analyserRef.current = analyser;
      source.current.connect(analyser);
      analyser.connect(audioContext.destination);
    }
    visualizeAudio(); // start drawing the visualization
  }
};


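As written, renderFrame keeps scheduling itself even after the track ends. A small optional refinement (not in the original code) is to bail out of the loop whenever the audio element is paused or finished:

// Optional: at the very top of renderFrame inside visualizeAudio
if (!audioRef.current || audioRef.current.paused || audioRef.current.ended) {
  return; // stop scheduling new frames once playback stops
}

Since handleAudioPlay calls visualizeAudio() on every play click, the loop simply restarts the next time the user presses play.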

Launch the development server with npm run dev and open http://localhost:3000/ to see the project. When an audio file is uploaded, the input form disappears and is replaced by the audio player, and the visualizer appears once the play button is clicked.

Audio Visualizer

Conclusion

In conclusion, audio visualizers serve as captivating features that elevate website interactivity and user engagement. By synchronizing visual elements with audio playback, they offer users a visually immersive experience while deepening their understanding of and connection with audio content. We successfully created a basic audio visualizer with the help of the Web Audio API. The possibilities of the Web Audio API are vast, allowing for complex and engaging audio experiences tailored to specific projects.
