
Doug Sillars for api.video

Posted on • Originally published at api.video on

Sharing a Video: Sending a Video Via Livestream

Video on demand (VOD) is a great way to let your customers watch videos whenever they want. But what if you want that recorded video to be played at a specific time?

  • Maybe you are a teacher, and you want your pre-recorded lecture to broadcast at 8 AM for your class to watch (but you don’t actually need to be there at 8 AM).
  • A “replay event” of a live stream that occurs 6 hours later for viewers in other parts of the world.
  • A “video share” party where a group of people can watch a video together. This is not possible with VOD, but is possible with a video livestream.

In this post, we'll walk through the steps required to convert a recorded video into a video livestream for scheduled playback.

Broadcasting a VOD

There are three things you’ll need to do a basic VOD broadcast:

  1. A livestream. When you livestream with api.video, you’ll need to create a livestream that will be used to broadcast your video. Once you’ve created it, you’ll need the parameters from the response - namely the streamKey, and the iframe/player urls for your viewers to watch the livestream.

  2. A recorded video. You’ll want to have a video that you wish to send over the livestream. This can be a video local to your computer, but in this tutorial, it will be a file that has been uploaded to api.video. We’ll use the mp4 url for the streaming.

  3. FFMPEG. You’ll also need FFMPEG installed on your computer. If you use a Mac, you can install it via Homebrew. FFMPEG is the tool that will help you take your VOD and convert it into a broadcast.

Now you have everything you need. In a terminal window on your computer, we’ll run an FFMPEG command:

ffmpeg -i <your video to be streamed> -preset ultrafast -tune zerolatency -f flv rtmp://broadcast.api.video/s/<livestream streamkey>

FFMPEG will take an input (-i) of your video and, using the ultrafast preset and zerolatency tuning, transcode it into an flv stream and send it to the api.video RTMP endpoint.

FFMPEG will start running, and print a lot of output into the terminal. Here you can see that the video is being encoded and sent on.

That’s literally all you have to do to make this work.

I have an 8 AM class, UGH!

Ok, what if you could automate this command to run right at 8 AM? It's easy to do!

Create a Bash script with your command:

#!/bin/bash

ffmpeg -i class_recording.mp4 -preset ultrafast -tune zerolatency -f flv rtmp://broadcast.api.video/s/<livestream streamkey>

Then we’ll use crontab (macOS and Linux) to automate the command. Let's say class is February 10, at 8 AM.

Running

crontab -e

opens your crontab file (probably in vim). Type ‘i’ to enter insert mode, and add this line:

00 08 10 02 * <path to your script>

This will run the script at 8 AM on February 10. To exit vim, press Escape, then type ‘:wq’ to save the file and quit. Now - as long as your computer is on at 8 AM on the 10th - the video will stream for you!
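Cron's five time fields trip a lot of people up. As a quick sanity check, here's a small Node sketch (a hypothetical helper, not part of the tutorial) that derives those fields from a JavaScript Date:

```javascript
// Hypothetical helper: derive the five crontab time fields
// ("minute hour day-of-month month day-of-week") from a JS Date.
function cronFields(date) {
  const pad = (n) => String(n).padStart(2, '0');
  return [
    pad(date.getMinutes()),
    pad(date.getHours()),
    pad(date.getDate()),
    pad(date.getMonth() + 1), // JS months are 0-based
    '*',                      // any day of the week
  ].join(' ');
}

// 8 AM on February 10:
console.log(cronFields(new Date(2021, 1, 10, 8, 0))); // → "00 08 10 02 *"
```

The ‘*’ in the last field means “any day of the week” - the date fields already pin the run to February 10.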

Looping a video in a livestream

Do you want to have a video on continuous loop? With api.video, you can do this with just our video player:

https://embed.api.video/vod/vi6xuvHolHxZQ5r6KETXAiR4#autoplay;muted;loop

Appending the autoplay, muted and loop parameters to the player url tells the api.video player to do all three of these actions - the video will play back continuously for your users.
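If you are building these urls in code, it's just string concatenation of the fragment options. A minimal sketch (the helper name is mine):

```javascript
// Hypothetical helper: compose an api.video player url with fragment options.
function playerUrl(videoId, options) {
  return 'https://embed.api.video/vod/' + videoId + '#' + options.join(';');
}

console.log(playerUrl('vi6xuvHolHxZQ5r6KETXAiR4', ['autoplay', 'muted', 'loop']));
// → https://embed.api.video/vod/vi6xuvHolHxZQ5r6KETXAiR4#autoplay;muted;loop
```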

But if you want to have the loop in a livestream, this is possible as well:

ffmpeg -stream_loop <number of loops> -i https://cdn.api.video/vod/vi1UQBDAMqAPCRxB3dmw1thc/mp4/1080/source.mp4 -preset ultrafast -tune zerolatency -f flv rtmp://broadcast.api.video/s/1d1e7a11-14a6-4984-b6d4-0c9864aec3dd

Simply insert -stream_loop at the beginning of your FFMPEG command, followed by the number of additional plays (-1 loops infinitely).
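If you are launching FFMPEG from Node - as the share.a.video server described below does with spawn() - the looping command translates to an argument list. A sketch, with placeholder url and stream key:

```javascript
// Sketch: the looping broadcast expressed as a spawn() argument list.
// The video url and stream key here are placeholders.
function loopOps(videoUrl, streamKey, loops) {
  return [
    '-stream_loop', String(loops), // -1 means loop forever
    '-i', videoUrl,
    '-preset', 'ultrafast', '-tune', 'zerolatency',
    '-f', 'flv', 'rtmp://broadcast.api.video/s/' + streamKey
  ];
}

console.log(loopOps('source.mp4', 'my-stream-key', -1).join(' '));
// → -stream_loop -1 -i source.mp4 -preset ultrafast -tune zerolatency -f flv rtmp://broadcast.api.video/s/my-stream-key
```

You could hand this array straight to require('child_process').spawn('ffmpeg', ops), just like the server code later in this post.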

Sharing a movie

Ok - since we can have a livestream - we can now “share” a movie with others. Note: the videos are not perfectly synced (a work in progress), but we can all watch the same video, at approximately the same position, together.

We’ve sort of covered this idea with a class lecture - you might imagine all the students will be online at 8AM to watch the class. But what if we wanted to do something fun?

We've built share.a.video, a demo app running on NodeJS, that replicates the FFMPEG transcoding shown above on a remote server - and then provides a playback view where anyone with the url can watch the video. In the example, we use 'Big Buck Bunny' and 'Sita Sings the Blues' (both Creative Commons licensed videos). The code is open sourced on GitHub.

On page load - we look to see if the livestream is already broadcasting:

On the Node server, we have some of the video data hardcoded. To find out if the video is playing, we call the Livestream endpoint, and match the livestreamIds - what we are really interested in is the broadcasting parameter:

```
//get data on both movies:
const client = new apiVideo.Client({ apiKey: apiVideoKey });
let allLiveStreams = client.lives.search();
var videos = [{
    "name": "Big Buck Bunny",
    "livestream": "li6ndv3lbvrZELWxMKGzGg9V",
    "broadcasting": false,
    "iframe": "",
    "thumbnail": "",
    "description": "Big Buck Bunny is a free and open source movie, created by Blender, and released under Creative Commons 3.0."
}, {
    "name": "Sita Sings the Blues",
    "livestream": "li7e2ePBRYKY6AOfPU8HSt91",
    "broadcasting": false,
    "iframe": "",
    "thumbnail": "",
    "description": "Sita Sings the Blues is an open source movie, created by Nina Paley, and released under CC-BY-SA."
}];

allLiveStreams.then(function(liveList){
    //match each livestream in the response to our hardcoded videos,
    //copying over the iframe, thumbnail and broadcasting status
    for(var i = 0; i < liveList.length; i++){
        for(var j = 0; j < videos.length; j++){
            if(videos[j].livestream === liveList[i].liveStreamId){
                videos[j].broadcasting = liveList[i].broadcasting;
                videos[j].iframe = "<iframe src=\"" + liveList[i].assets.player + "\" width=\"100%\" height=\"100%\" frameborder=\"0\" scrolling=\"no\" allowfullscreen=\"true\"></iframe>";
                videos[j].thumbnail = liveList[i].assets.thumbnail;
            }
        }
    }
    console.log(videos);
    return res.render( /* ...the page template, with the videos array... */ );
});
```

We then render the JSON array of data to the web application. The application is built with Pug, so we can use logic on the videos variable to decide what is displayed. If the movie is already broadcasting, the page says “The movie has already started. Click the image to enter the theatre.” (and “Make sure your phone is silenced.”). Otherwise, it invites you to “Click to begin the livestream. It can take a few seconds to buffer up, and then we'll get you into the theatre.”, alongside an img.image thumbnail. Either way, the displayed page lets viewers start (or join) playback.

When a viewer starts the livestream, we call an endpoint which runs FFMPEG commands like those above to do the VOD -> live transcoding on the server:

```
    //ok this will kick the video stream off
    console.log(req.body);
    var videoToStream = req.body.movie;
    //counter for array data BBB=0, SSTB=1 (more will go from here)
    var counter = 0;
    if(videoToStream === "sstb"){
        counter = 1;
    }
    console.log("video to stream:", videoToStream);
    var videoLink = videoUrls[counter];
    var rtmpDestination = "rtmp://broadcast.api.video/s/" + streamKeys[counter];

    var ops = [
        '-i', videoLink,
        '-preset', 'ultrafast', '-tune', 'zerolatency',
        '-f', 'flv', rtmpDestination
    ];

    console.log("ops", ops);
    //spawn comes from require('child_process')
    var ffmpeg_process = spawn('ffmpeg', ops);
    //ffmpeg started
    console.log("video stream started");

    ffmpeg_process.stderr.on('data', function(d){
        console.log('ffmpeg_stderr', '' + d);
    });

    res.sendStatus(200);
```

When the stream starts, we head back to the client side of the app for another cool trick.

A livestream requires about 15 seconds of video to be transcoded before it is live - so if we opened the livestream URL right away, viewers would get an error. So, we play a little trick when we get the response from the server:

```
 oReq.onload = function (oEvent) {
        console.log("video started: ",movieUpload );
        var livestreamid = document.getElementById("videoDiv").innerHTML;
        document.getElementById("videoDiv").innerHTML="";
        console.log("liveid", livestreamid);
        var videos = ['vi74TmfoJyPmVJVnIl4jzMLA', livestreamid];

        //now we create the player
        //since the livestream can take some time to start - we'll kick off with the 10 second countdown video
        var counter = 0;
        createVideo(counter);
        document.getElementsByClassName('image')[0].style.display= 'none';
        document.getElementById('thumb-live-indicator').className = "active";

        //code lifted from playlist demo
        function createVideo(counter) {
            console.log("video", counter, videos[counter]);

            var vodOptions = {
                id: videos[counter],
                autoplay: true
                // ... other optional options
            };
            var liveOptions = {
                id: videos[counter],
                autoplay: true,
                live: true
                // ... other optional options
            };
            videoOptions = vodOptions;
            if(counter > 0){
                //live video
                videoOptions = liveOptions;
                //add the sync button
                // liveSync();
            }
            console.log("player options", videoOptions);

            window.player = new PlayerSdk("#imageDiv", videoOptions);
            player.addEventListener('play', function() { 
                //console.log("playing");
                onPlay(counter);
            });
            player.addEventListener('ended', function() {
                console.log("ended");
                counter++;
                //if we hit the end of the array - start over again
                counter = counter % videos.length;
                onEnd(counter);
            });

        }


        function onPlay(counter) {
           // console.log("onPlay");
            console.log("counter" ,counter);

            console.log("video playing");
        }
        function onEnd(counter){
            //console.log("onEnd");

            //console.log("video over");
            player.destroy();
            //video is over - so start another one...
            createVideo(counter);
        }
        //end code lifted from playlist demo

};
```

We create a video playlist, where the first video is a 10 second countdown (like at the movies), and the second video is the livestream.

This gives the live video enough time to build up a buffer and be ready to play, and makes for a fun experience for viewers.
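Stripped of the player wiring, the playlist logic above boils down to a counter over an array of ids. A minimal sketch (the ids are placeholders):

```javascript
// Minimal sketch of the countdown-then-livestream playlist.
// 'countdown-video-id' and 'livestream-id' are placeholder ids.
const playlist = ['countdown-video-id', 'livestream-id'];

function optionsFor(counter) {
  const options = { id: playlist[counter], autoplay: true };
  // everything after the first (VOD) entry is treated as live
  if (counter > 0) {
    options.live = true;
  }
  return options;
}

console.log(optionsFor(0)); // countdown clip plays as a regular VOD
console.log(optionsFor(1)); // then the livestream, with live: true
```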

Conclusion

In this post, we've walked through several different ways to convert a recorded video into a livestream, all using FFMPEG in the background. We've covered a basic command line implementation, and then shown how to schedule that same command.

We've also built a sample application based on NodeJS at share.a.video that does the same thing, but on a remote server, with built-in web views to start and watch the videos.

If you still have questions (or want to share how you are piping recorded video into a livestream), join the conversation in the api.video developer community.
