Madalitso Nyemba
Record Audio In Blazor Hybrid App Using Plugin.Maui.Audio

In a project I was working on, there was a requirement to add a feature that allowed application users to record audio of themselves. Being new to the world of Blazor, I immediately started looking at what the internet had to offer for the requirement at hand. I stumbled upon a great package called Plugin.Maui.Audio by @jfversluis. There were also notable approaches that used JS interop to leverage the power of the browser, but I did not want to go that route. After some time I gave in and tried it, but the interop approach was not working for me.

Looking at this package, the example given was for a .NET MAUI app. A quick digression: one of the reasons I fell in love with Blazor Hybrid apps, aside from having one codebase for all platforms, is the ability to use the powers of Blazor as well as .NET, which gives you a lot of power and flexibility when implementing features. Now, back to the package example. I had to sit down and tailor the example to fit my Blazor application.

In this short article, I will share how I implemented this, as well as how to play the recorded audio using HTML's audio tag.

Take note that this article assumes you have already created your Blazor Hybrid app. We can start from a freshly created Blazor Hybrid app without any modifications.

First things first: permissions are very important, since this is a Hybrid app that will run on different devices and needs access to the microphone. You can find the detailed steps here.

Android
The AndroidManifest.xml file will need to be modified to include the following uses-permission inside the manifest tag.

<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />


iOS
The Info.plist file will need to be modified to include the following two entries inside the dict tag.

<key>NSMicrophoneUsageDescription</key>
<string>The [app name] wants to use your microphone to record audio.</string>

Replace [app name] with your application name.

MacCatalyst
This change is identical to the one in the iOS section, but for explicitness:

The Info.plist file will need to be modified to include the following 2 entries inside the dict tag.

<key>NSMicrophoneUsageDescription</key>
<string>The [app name] wants to use your microphone to record audio.</string>

Replace [app name] with your application name.

Windows
The Package.appxmanifest file will need to be modified to include the following entry inside the Capabilities tag.

<DeviceCapability Name="microphone"/>

After that, install the Plugin.Maui.Audio NuGet package if you have not already done so. Then, in the root of the project, create a folder called Services. Inside this folder, create a file called AudioService.cs with the following contents.


using System.IO;
using System.Threading.Tasks;
using Plugin.Maui.Audio;

namespace MyBlazorHybridApp.Services
{

    public interface IAudioService
    {
        Task StartRecordingAsync();
        Task<Stream> StopRecordingAsync();
        bool IsRecording { get; }
    }

    public class AudioService : IAudioService
    {
        private readonly IAudioRecorder _audioRecorder;

        // True while a recording is in progress; used to toggle the UI buttons.
        public bool IsRecording => _audioRecorder.IsRecording;

        public AudioService(IAudioManager audioManager)
        {
            // Create a single recorder from the injected audio manager.
            _audioRecorder = audioManager.CreateRecorder();
        }

        // Starts capturing audio from the device microphone.
        public async Task StartRecordingAsync()
        {
            await _audioRecorder.StartAsync();
        }

        // Stops the recording and returns the captured audio as a stream.
        public async Task<Stream> StopRecordingAsync()
        {
            var recordedAudio = await _audioRecorder.StopAsync();
            return recordedAudio.GetAudioStream();
        }
    }

}


In the above code, the class implements two methods, StartRecordingAsync and StopRecordingAsync, as well as a bool IsRecording property, which will be useful in our UI. Take note that StopRecordingAsync returns a Stream. According to the Plugin.Maui.Audio docs, the StopAsync method returns an IAudioSource instance containing the recording data; I call GetAudioStream on that instance to get the audio as a stream.

Move over to MauiProgram.cs, add using Plugin.Maui.Audio; at the top of the file, and register the services where the others are being added to the builder, before var app = builder.Build();. Add builder.Services.AddSingleton(AudioManager.Current); for the plugin's audio manager, and also register the custom AudioService so it can be injected anywhere within the application.
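Below is a minimal sketch of what MauiProgram.cs could look like after this change, assuming the default Blazor Hybrid template and a project named MyBlazorHybridApp (the same namespace used in AudioService.cs above). The AddSingleton<AudioService>() line is my own addition so that the @inject in the razor file below resolves; adjust it if you prefer registering the IAudioService interface instead.

using Plugin.Maui.Audio;
using MyBlazorHybridApp.Services;

namespace MyBlazorHybridApp;

public static class MauiProgram
{
    public static MauiApp CreateMauiApp()
    {
        var builder = MauiApp.CreateBuilder();
        builder
            .UseMauiApp<App>()
            .ConfigureFonts(fonts =>
            {
                fonts.AddFont("OpenSans-Regular.ttf", "OpenSansRegular");
            });

        builder.Services.AddMauiBlazorWebView();

        // Register the audio manager provided by Plugin.Maui.Audio...
        builder.Services.AddSingleton(AudioManager.Current);
        // ...and the custom service that wraps it (assumed registration).
        builder.Services.AddSingleton<AudioService>();

        var app = builder.Build();
        return app;
    }
}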

Your razor file will look like the one below.


@* other dependencies above *@
@using Microsoft.Maui.ApplicationModel
@using MyBlazorHybridApp.Services
@inject AudioService AudioService

<div class="flex-column mt3">
    <div class="f5 mb2">
        Record an audio
    </div>
    <div class="f5 mb2">
        @if (!AudioService.IsRecording)
        {
            <button @onclick="StartRecording">Start</button>
        }
        else
        {
            <button @onclick="StopRecording">Stop</button>
        }
    </div>
    <audio controls src="@AudioSource">
        Your browser does not support the audio element.
    </audio>
</div>

@code {
    private Stream recordedAudioStream;
    private string AudioSource { get; set; }

    public async Task StartRecording()
    {
        if (await Permissions.RequestAsync<Permissions.Microphone>() != PermissionStatus.Granted)
        {
            // Inform the user that microphone permission is required
        }
        else
        {
            await AudioService.StartRecordingAsync();
        }
    }

    public async Task StopRecording()
    {
        recordedAudioStream = await AudioService.StopRecordingAsync();

        // Copy the recorded stream into memory and convert it to a base64 string
        using var memoryStream = new MemoryStream();
        await recordedAudioStream.CopyToAsync(memoryStream);
        var base64String = Convert.ToBase64String(memoryStream.ToArray());

        // Depending on the platform, the recording may be in another format (e.g. WAV),
        // so adjust the MIME type here if playback does not work.
        AudioSource = $"data:audio/mpeg;base64,{base64String}";
        StateHasChanged();
    }
}


The above code is for the UI. It injects the service and shows a button that starts or stops recording accordingly. When stopping, I save the recorded audio stream returned by the service into a Stream called recordedAudioStream. I then convert the stream into a base64 string so that AudioSource can be populated with the right data to play. So what happens is: when you press Start, recording begins; when you press Stop, recording stops and the just-recorded audio is loaded into the audio player, where you can replay it. This approach means you do not have to build out your own playback UI; you can use what is already available, like the audio tag. I hope this has helped someone.
Happy Coding 🧑‍💻.
