
Charalotte Yog

How to Embed Flutter Video Call App with In-Call Stats Functionality

Flutter is a cross-platform development framework created by Google that lets a single codebase run on both iOS and Android. Communicating face-to-face over a video call leaves a stronger impression on the person at the other end than text messages do, so modern applications keep adding new features and approaches that help users connect with each other. Many mobile apps have now evolved to include built-in voice and video calling. Adding video call capability to your existing app increases its usability and its value.
An API provider helps you embed video into mobile and web apps using its Flutter SDK. With it, you can enable real-time audio and video calling as well as advanced functionality such as screen sharing and audio mixing.
In this blog, we will look at how to build a video call application with in-call statistics using the Flutter SDK. The SDK is installed as a plugin from pub.dev and connects people over voice and video. A lot more can be done with it, but let us first understand the basics.


Get Started by Creating your Project


Here is the step-by-step procedure:

  1. Create an account by signing up on the API provider’s dashboard. To set up this project we will work only in the main file. Then create the Flutter project:

flutter create xxx_project

  2. After the project is created, add the required packages to the pubspec.yaml file. For this, add xxx_rtc_engine, which is a wrapper (building block) for the API provider’s SDK made for Flutter.

  3. Add permission_handler. This plugin checks whether camera and microphone permissions have been granted and requests them if needed. It can also be used for any other permissions on the device.

Once these packages are added, the dependencies section looks like this:

dependencies:
  flutter:
    sdk: flutter
  xxx_rtc_engine:
  permission_handler:
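
Once the dependencies are listed, fetch the packages (assuming a standard Flutter toolchain, this is the usual command):

flutter pub get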

  6. Once the provider’s SDK has been added to the project, add the imports that will be required. Next, use the material library from Flutter, which Flutter provides for building the UI.

  7. Next, use the async library from Dart, since several calls to the provider’s SDK are asynchronous and we need to wait for their responses.

  8. We also need imports from the API provider and the permission handler. Three of them come from the API provider.

  9. The first is RTC Engine, which contains most of the real-time communication functions.

  10. The second is RTC Local View, which displays the current user’s camera.

  11. The third is RTC Remote View, which displays the video of the remote user who has joined the channel.

The top of the file looks like this:

import 'dart:async';
import 'package:flutter/material.dart';
import 'package:xxx_rtc_engine/rtc_engine.dart';
import 'package:xxx_rtc_engine/rtc_local_view.dart' as RtcLocalView;
import 'package:xxx_rtc_engine/rtc_remote_view.dart' as RtcRemoteView;
import 'package:permission_handler/permission_handler.dart';

  12. Get the appId and the token; these let you use the API provider’s services securely. Here we will use a temporary token for this project. First create a project, then go to edit, look for the appId, and generate a temporary token. In your app, create global variables named ‘appId’ and ‘token’ and set them to the values saved from the API provider (a short sketch follows the note below).

Remember that this project is meant for reference and development environments and is not intended for production. Token authentication is recommended for all RTE apps running in production environments.
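
A minimal sketch of those globals, with placeholder values that you would replace with the ones saved from your own dashboard:

// Placeholder values; substitute your real appId and temporary token
const String appId = '<YOUR_APP_ID>';
const String token = '<YOUR_TEMPORARY_TOKEN>';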

Start Constructing the App

Having worked through all the prerequisites, we can now start with the actual code. To keep everything simple, we will use one file. Start with a simple but dynamic widget layout that has a ‘Scaffold’ inside the ‘MaterialApp’.

void main() => runApp(HomePage());

class HomePage extends StatefulWidget {
  @override
  _HomePageState createState() => _HomePageState();
}

class _HomePageState extends State<HomePage> {
  @override
  void initState() {
    super.initState();
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: Text("xxx Demo"),
        ),
      ),
    );
  }
}

initState is where permissions are checked and everything necessary for calls to start is initialised, so it is an important part of the app. These are the things that occur on app initialisation (they can also happen at any other point when you need to use the video call SDK):
· On initialisation, initState requests permissions for the camera and the microphone, if they have not already been granted.
· The communication client (engine) is created.
· A listener is set up to respond to event changes from the client.
· Video is enabled for the current device.
· The channel is joined.

Enabling Device Permission Requests


Another important aspect of building a working Flutter app is handling permissions. Video calls cannot be made without permission to access the microphone and the camera.


await [Permission.microphone, Permission.camera].request();
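
If you want to react to a denied permission, the request() call on a list of permissions returns a map of statuses that you can inspect; a minimal sketch using the permission_handler API:

final statuses = await [Permission.microphone, Permission.camera].request();
if (statuses[Permission.camera] != PermissionStatus.granted) {
  // Without camera access the local preview cannot be shown
  print('Camera permission not granted');
}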

We will be using the API provider’s engine to implement all of this logic within the application. To use this engine, create an instance of it and initialise it using the App ID that was created earlier in the API provider’s console. The App ID is stored in the variable named ‘appId’.

RtcEngine engine = await RtcEngine.create(appId);

Next comes the main logic inside the app: the event handler that takes care of the events that occur on the engine. For this, setEventHandler is called, through which we specify what should happen when a particular event occurs. The API provider exposes a large array of such events, but since this is a simple app, we will use only four of them:
joinChannelSuccess: Triggered when the current user joins the channel.
userJoined: Triggered when a remote user joins the channel the current user is on.
userOffline: Triggered when a remote user leaves the channel the current user is on.
rtcStats: Triggered every two seconds, returning the statistics of the call.
For these four events, we will update local variables:

  1. A boolean (_localUserJoined) tells us whether the current user has successfully joined.
  2. An integer (_remoteUid) is set to the remote user’s ID when they join the channel.
  3. That ID is set back to null when the remote user leaves the channel.
  4. A local stats variable (_stats) is updated every two seconds so it always holds fresh information.

engine.setEventHandler(RtcEngineEventHandler(
  joinChannelSuccess: (String channel, int uid, int elapsed) {
    print('$uid successfully joined channel: $channel ');
    setState(() {
      _localUserJoined = true;
    });
  },
  userJoined: (int uid, int elapsed) {
    print('remote user $uid joined channel');
    setState(() {
      _remoteUid = uid;
    });
  },
  userOffline: (int uid, UserOfflineReason reason) {
    print('remote user $uid left channel');
    setState(() {
      _remoteUid = null;
    });
  },
  rtcStats: (stats) {
    // updates every two seconds
    if (_showStats) {
      _stats = stats;
      setState(() {});
    }
  },
));

Lastly, during initialisation, we enable video on the engine and have the current user join the channel, which we will call firstchannel. To join the channel we need to pass the token that was created earlier and stored in the variable called token.

await engine.enableVideo();
await engine.joinChannel(token, 'firstchannel', null, 0);

All of this cannot be done directly inside initState, because initState cannot be asynchronous. Therefore initState calls another function, which we will name initForXXX, and all of the above code goes inside it. The part that sits outside of our build method looks like this:
bool _localUserJoined = false;
bool _showStats = false;
int _remoteUid;
RtcStats _stats = RtcStats();

@override
void initState() {
  super.initState();
  initForxxx();
}

Future initForxxx() async {
  // retrieve permissions
  await [Permission.microphone, Permission.camera].request();

  // create the engine for communicating with xxx
  RtcEngine engine = await RtcEngine.create(appId);

  // set up event handling for the engine
  engine.setEventHandler(RtcEngineEventHandler(
    joinChannelSuccess: (String channel, int uid, int elapsed) {
      print('$uid successfully joined channel: $channel ');
      setState(() {
        _localUserJoined = true;
      });
    },
    userJoined: (int uid, int elapsed) {
      print('remote user $uid joined channel');
      setState(() {
        _remoteUid = uid;
      });
    },
    userOffline: (int uid, UserOfflineReason reason) {
      print('remote user $uid left channel');
      setState(() {
        _remoteUid = null;
      });
    },
    rtcStats: (stats) {
      // updates every two seconds
      if (_showStats) {
        _stats = stats;
        setState(() {});
      }
    },
  ));

  // enable video and join the channel
  await engine.enableVideo();
  await engine.joinChannel(token, 'firstchannel', null, 0);
}
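
Although not part of the listing above, it is good practice to leave the channel and release the engine when the widget is removed. A minimal sketch, assuming you keep the engine in a state field named _engine (instead of the local variable used above) and that the engine exposes leaveChannel() and destroy() methods, as typical RTC SDKs do:

// Assumes a field: late RtcEngine _engine; assigned in initForxxx()
@override
void dispose() {
  _engine.leaveChannel(); // stop sending and receiving media
  _engine.destroy();      // release native resources
  super.dispose();
}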

Applying the Code

Next is the task of displaying the live video and the stats to the user and checking how all of this works together. The UI will be kept minimalistic: we display a simple Stack inside the Scaffold widget. The remote user’s live camera is placed at the bottom layer of the Stack, and the top-left corner shows the current user’s camera. For the current user’s video, use RtcLocalView.SurfaceView(), which comes from the import we added during the setup stage. To display the remote user’s video, use RtcRemoteView.SurfaceView(uid: _remoteUid).
To present the statistics of the call, use the floatingActionButton parameter of the Scaffold. It shows either the actual statistics of the call or a “Show Stats” button. If the stats data is null, a loading indicator is shown; once the data is available, the view shows a column of stats along with a button to close it.
The build function looks like this:

// Create UI with local view and remote view
@override
Widget build(BuildContext context) {
  return MaterialApp(
    home: Scaffold(
      appBar: AppBar(
        title: const Text('Flutter example app'),
      ),
      body: Stack(
        children: [
          Center(
            child: _renderRemoteVideo(),
          ),
          Align(
            alignment: Alignment.topLeft,
            child: Container(
              width: 100,
              height: 100,
              child: Center(
                child: _renderLocalPreview(),
              ),
            ),
          ),
        ],
      ),
      floatingActionButton: _showStats
          ? _statsView()
          : ElevatedButton(
              onPressed: () {
                setState(() {
                  _showStats = true;
                });
              },
              child: Text("Show Stats"),
            ),
    ),
  );
}

Widget _statsView() {
  return Container(
    padding: EdgeInsets.all(20),
    color: Colors.white,
    child: _stats.cpuAppUsage == null
        ? CircularProgressIndicator()
        : Column(
            mainAxisSize: MainAxisSize.min,
            children: [
              Text("CPU Usage: " + _stats.cpuAppUsage.toString()),
              Text("Duration (seconds): " + _stats.totalDuration.toString()),
              Text("People on call: " + _stats.users.toString()),
              ElevatedButton(
                onPressed: () {
                  _showStats = false;
                  _stats.cpuAppUsage = null;
                  setState(() {});
                },
                child: Text("Close"),
              )
            ],
          ),
  );
}

// current user video
Widget _renderLocalPreview() {
  if (_localUserJoined) {
    return RtcLocalView.SurfaceView();
  } else {
    return Text(
      'Please join channel first',
      textAlign: TextAlign.center,
    );
  }
}

// remote user video
Widget _renderRemoteVideo() {
  if (_remoteUid != null) {
    return RtcRemoteView.SurfaceView(uid: _remoteUid);
  } else {
    return Text(
      'Please wait for the remote user to join',
      textAlign: TextAlign.center,
    );
  }
}

That completes the working video call app written in Flutter.

Takeaway




The API provider’s Flutter plugin makes adding video calls with in-call statistics to a Flutter app much easier. The steps above help you build a functional video call app with in-call stats, and the same approach can be used to add calling to your existing apps.
