<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Tim</title>
    <description>The latest articles on DEV Community by Tim (@tbobker).</description>
    <link>https://dev.to/tbobker</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1069132%2F6753f070-5181-4013-87b8-8a81656997e2.jpeg</url>
      <title>DEV Community: Tim</title>
      <link>https://dev.to/tbobker</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tbobker"/>
    <language>en</language>
    <item>
      <title>Using Ant Media Server's JavaScript SDK to Live Stream with a Virtual Background</title>
      <dc:creator>Tim</dc:creator>
      <pubDate>Wed, 16 Aug 2023 10:39:25 +0000</pubDate>
      <link>https://dev.to/antmediaserver/using-ant-media-servers-javascript-sdk-to-live-stream-with-a-virtual-background-4h69</link>
      <guid>https://dev.to/antmediaserver/using-ant-media-servers-javascript-sdk-to-live-stream-with-a-virtual-background-4h69</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Original blog post at &lt;a href="https://antmedia.io/ant-media-server-virtual-background/"&gt;https://antmedia.io/ant-media-server-virtual-background/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;How to use Virtual Backgrounds with Ant Media Server’s JavaScript SDK?&lt;/h2&gt;

&lt;p&gt;Using a virtual background with Ant Media Server’s JavaScript SDK is very easy! Just follow these simple steps to apply a virtual background on your publishing page.&lt;/p&gt;

&lt;p&gt;The two main &lt;code&gt;WebRTCAdaptor&lt;/code&gt; methods used to apply background replacement are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set the background image: &lt;code&gt;webRTCAdaptor.setBackgroundImage()&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Apply the background image: &lt;code&gt;webRTCAdaptor.enableEffect()&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
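&lt;p&gt;As a quick sketch, here is how the two methods fit together (&lt;code&gt;useBackground&lt;/code&gt; is a hypothetical helper, and &lt;code&gt;adaptor&lt;/code&gt; stands in for the &lt;code&gt;webRTCAdaptor&lt;/code&gt; instance created later on this page):&lt;/p&gt;

```javascript
// Minimal sketch of the two-step flow. `useBackground` is a hypothetical
// helper; `adaptor` stands in for the webRTCAdaptor instance from this page,
// and `effectName` is one of the VideoEffect constants.
function useBackground(adaptor, imgElement, effectName) {
  adaptor.setBackgroundImage(imgElement);  // step 1: choose the background image
  return adaptor.enableEffect(effectName); // step 2: apply it (returns a Promise)
}
```

&lt;p&gt;For the virtual background case the order matters: the image must be set before the effect is enabled.&lt;/p&gt;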

&lt;p&gt;Only a few basic changes are needed to add the feature.&lt;/p&gt;

&lt;p&gt;Here is a very basic publish page with two buttons (one for publishing, one for unpublishing) and, as you can see, a very boring background.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CkJqXL66--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m27cq0xhhuskuz5niy3y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CkJqXL66--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m27cq0xhhuskuz5niy3y.png" alt="Image description" width="800" height="623"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the code inside the &lt;code&gt;&amp;lt;body&amp;gt;&lt;/code&gt; tag:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;div class="container"&amp;gt;

    &amp;lt;video id="localVideo" autoplay muted controls playsinline width="800px"&amp;gt; &amp;lt;/video&amp;gt;

    &amp;lt;div class="d-block"&amp;gt;
      &amp;lt;button id="start_publish"&amp;gt;Publish&amp;lt;/button&amp;gt;
      &amp;lt;button id="stop_publish"&amp;gt;Unpublish&amp;lt;/button&amp;gt;
      &amp;lt;p id="status"&amp;gt;Status:Offline&amp;lt;/p&amp;gt;
    &amp;lt;/div&amp;gt;


   &amp;lt;/div&amp;gt;

   &amp;lt;script type="module"&amp;gt;

   import {WebRTCAdaptor, VideoEffect} from "./node_modules/@antmedia/webrtc_adaptor/dist/es/index.js";

    const webRTCAdaptor = new WebRTCAdaptor({
      websocket_url: "wss://ant-media-server:5443/LiveApp/websocket",
      localVideoElement: document.getElementById("localVideo"),
      mediaConstraints: {
        video: true,
        audio:true
      },
      callback: (info, obj) =&amp;gt; {
        console.log("callback info: " + info);
        if (info == "publish_started") {
            console.log("publish started");
            statusInfo.innerHTML = "Status:Broadcasting"
        }
        else if (info == "publish_finished") {
            console.log("publish finished")
            statusInfo.innerHTML = "Status:Offline"
        }
        else if( info == "available_devices"){
          console.log(obj)
        }
      },
      callbackError: function (error, message) {
        // some of the possible errors: NotFoundError, SecurityError, PermissionDeniedError
        console.log("error callback: " + JSON.stringify(error));
      }

    });

    //get a random streamId
    const streamId = "stream" + Math.floor(Math.random()*999999);
    const startPublishButton = document.getElementById("start_publish")
    const stopPublishButton = document.getElementById("stop_publish")
    const statusInfo = document.getElementById("status");

    startPublishButton.addEventListener("click", () =&amp;gt; {
      console.log("start publish is clicked " + streamId);
      webRTCAdaptor.publish(streamId);
    });

    stopPublishButton.addEventListener("click", () =&amp;gt; {
      console.log("stop publish is clicked");
      webRTCAdaptor.stop(streamId);
    });

&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, it's a very basic WebRTC publish page implementation using the WebRTC Adaptor. To use this publish page, remember to update &lt;code&gt;websocket_url&lt;/code&gt; to point at your own Ant Media Server instance.&lt;/p&gt;

&lt;p&gt;It pulls the WebRTC Adaptor package from npm and imports the &lt;code&gt;WebRTCAdaptor&lt;/code&gt; and &lt;code&gt;VideoEffect&lt;/code&gt; classes, which are what we need to publish WebRTC streams.&lt;/p&gt;

&lt;p&gt;Event handlers on the two buttons then publish and unpublish the live stream by calling the SDK methods &lt;code&gt;webRTCAdaptor.publish(streamId)&lt;/code&gt; and &lt;code&gt;webRTCAdaptor.stop(streamId)&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;Step 1: Make Virtual Background Images Available&lt;/h2&gt;

&lt;p&gt;The images that you want to offer as virtual backgrounds need to be placed in a directory accessible to your publish page.&lt;/p&gt;

&lt;p&gt;In this example, all the images are located in the &lt;code&gt;images&lt;/code&gt; directory. Let's create the HTML that will display all the images under the video player so we can select the desired background.&lt;/p&gt;

&lt;p&gt;Add the following HTML under the buttons.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;div class="col-sm-12 text-center" id="virtualBackgrounds"&amp;gt;

   &amp;lt;h3&amp;gt;Select a virtual background Image&amp;lt;/h3&amp;gt;
   &amp;lt;div class="d-flex" id="virtualbackgroundimages"&amp;gt;

     &amp;lt;img src="./images/noeffect-background.png"  id="noeffect" class="backgroundImages selected" /&amp;gt;
     &amp;lt;img src="./images/blur-background.png" id="blur" class="backgroundImages " /&amp;gt;
     &amp;lt;img src="./images/virtual-background.png" id="antMediaBackground" class="backgroundImages "/&amp;gt;
     &amp;lt;img src="./images/cloud-background.png" class="backgroundImages"/&amp;gt;

     &amp;lt;img src="./images/unsplash.jpg" class="backgroundImages"/&amp;gt;
     &amp;lt;img src="./images/huy.jpg" class="backgroundImages"/&amp;gt;
     &amp;lt;img src="./images/mokry.jpg" class="backgroundImages"/&amp;gt;
     &amp;lt;img src="./images/mitchell.jpg" class="backgroundImages"/&amp;gt;
   &amp;lt;/div&amp;gt;

   &amp;lt;div class="d-block"&amp;gt;
   &amp;lt;h3&amp;gt;Upload a Virtual Background Image&amp;lt;/h3&amp;gt;
   &amp;lt;input type="file" class="custom-file-input" id="customFile" style="height:100%;" accept=".jpg, .png, .jpeg"&amp;gt;
   &amp;lt;/div&amp;gt;
 &amp;lt;/div&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure to update this code with the names of the images in your directory. You’ll also notice a section for uploading a background image; we’ll get to that later.&lt;/p&gt;

&lt;p&gt;It now looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qYnIJeoF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xnnowqys3pn9f5bbd8hg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qYnIJeoF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xnnowqys3pn9f5bbd8hg.png" alt="Image description" width="800" height="818"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice the two images on the far left that do not look like backgrounds. These either remove the virtual background or blur the real one.&lt;/p&gt;

&lt;p&gt;Right now, clicking these images does nothing! So it's time to use the SDK to add the virtual background functionality.&lt;/p&gt;

&lt;h2&gt;Step 2: Add a Click Handler to the Images&lt;/h2&gt;

&lt;p&gt;To keep things quick and easy, we’ll use jQuery to register event handlers: when you click on the desired background image, a JavaScript method is called to apply the virtual background.&lt;/p&gt;

&lt;p&gt;Let's start by adding that click handler.&lt;/p&gt;

&lt;p&gt;All the images have the class name &lt;code&gt;backgroundImages&lt;/code&gt;, so let's use that class to register a click handler.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$(".backgroundImages").click(enableVirtualBackground);
function enableVirtualBackground(){
 // add the magic here
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, we register a click handler on each image that calls the &lt;code&gt;enableVirtualBackground&lt;/code&gt; method declared directly underneath.&lt;/p&gt;

&lt;p&gt;This is where all the magic will happen.&lt;/p&gt;

&lt;p&gt;Add this bit of JavaScript right under the import statement that imports the &lt;code&gt;WebRTCAdaptor&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Let's also not forget about uploading custom backgrounds.&lt;/p&gt;

&lt;p&gt;There is also an upload section, so we’ll need to hook it up with an event handler to capture the uploaded image.&lt;/p&gt;

&lt;p&gt;Here is the code sample that does that:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$('input[type="file"]').change(function(e) {
     const url = URL.createObjectURL(e.target.files[0]);
     const newBackgroundImg = $("&amp;lt;img src=\"" + url + "\" class=\"backgroundImages\"/&amp;gt;");
     $("#virtualbackgroundimages").append(newBackgroundImg);
     $(".backgroundImages").click(enableVirtualBackground);
     newBackgroundImg.click();
 });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This listens for a file being selected. Once that happens, it uses the &lt;code&gt;URL.createObjectURL&lt;/code&gt; function to create an object URL that represents the uploaded image.&lt;/p&gt;

&lt;p&gt;Then a new image is created and appended to the list of current background images in the HTML.&lt;/p&gt;

&lt;p&gt;Once it has been appended, the click handler is applied to all the images as before, and a click is simulated on the newly uploaded image to apply the virtual background.&lt;/p&gt;

&lt;h2&gt;Step 3: Enable Virtual Backgrounds&lt;/h2&gt;

&lt;p&gt;Now it's time to get our virtual backgrounds working!&lt;/p&gt;

&lt;p&gt;You might have noticed that the two images that either blur the background or remove the virtual background have an id of &lt;code&gt;blur&lt;/code&gt; or &lt;code&gt;noeffect&lt;/code&gt; respectively.&lt;/p&gt;

&lt;p&gt;We’ll be using &lt;code&gt;VideoEffect&lt;/code&gt; to configure the background image and &lt;code&gt;webRTCAdaptor&lt;/code&gt; to apply the virtual background.&lt;/p&gt;

&lt;p&gt;Here is the full working code sample for the &lt;code&gt;enableVirtualBackground&lt;/code&gt; method:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function enableVirtualBackground() {
     var effectName;
     if ($(this).attr("id") == "blur") {
         effectName = VideoEffect.BLUR_BACKGROUND;
     }
     else if ($(this).attr("id") == "noeffect") {
         effectName = VideoEffect.NO_EFFECT;
     }
     else {
         effectName = VideoEffect.VIRTUAL_BACKGROUND;
         webRTCAdaptor.setBackgroundImage(this);
     }
     webRTCAdaptor.enableEffect(effectName).then(() =&amp;gt; {
         console.log("Effect: "+ effectName+" is enabled");
     }).catch(err =&amp;gt; {
         console.error("Effect: "+ effectName+" is not enabled. Error is " + err);
         $.notify(err.name, {
             autoHideDelay:5000,
             className:'error',
             position:'top center'
         });
     });
     // clear the previous selection before highlighting the clicked image
     $(".backgroundImages").removeClass("selected");
     $(this).addClass("selected");
 }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first part detects whether we want to remove the virtual background or blur the real one, and assigns the corresponding &lt;code&gt;VideoEffect&lt;/code&gt; property (&lt;code&gt;VideoEffect.NO_EFFECT&lt;/code&gt; or &lt;code&gt;VideoEffect.BLUR_BACKGROUND&lt;/code&gt;) based on the ID of the clicked image.&lt;/p&gt;

&lt;p&gt;Otherwise, we want to apply the clicked image as a virtual background, so we assign &lt;code&gt;VideoEffect.VIRTUAL_BACKGROUND&lt;/code&gt; and then call &lt;code&gt;webRTCAdaptor.setBackgroundImage&lt;/code&gt; to set the image as the virtual background.&lt;/p&gt;

&lt;p&gt;Once that's done, all that's left is to call &lt;code&gt;webRTCAdaptor.enableEffect&lt;/code&gt; to enable the virtual background.&lt;/p&gt;

&lt;h2&gt;Step 4: Test the Virtual Background&lt;/h2&gt;

&lt;p&gt;Now it's time to test that this works.&lt;/p&gt;

&lt;p&gt;You should now be able to apply a virtual background from one of the images available or upload your own.&lt;/p&gt;

&lt;p&gt;I’ve decided to be on the beach for this example.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--40QR92eH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g2afka0wzbodsjyee6qe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--40QR92eH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g2afka0wzbodsjyee6qe.png" alt="Image description" width="800" height="824"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now try live streaming and changing your background on the fly. You should see it change almost immediately.&lt;/p&gt;

&lt;h2&gt;Full Code Sample:&lt;/h2&gt;

&lt;p&gt;Here is the full source code of the page. Remember to update &lt;code&gt;websocket_url&lt;/code&gt; to point at your server.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!DOCTYPE html&amp;gt;

&amp;lt;html lang="en"&amp;gt;
&amp;lt;head&amp;gt;
  &amp;lt;meta charset="UTF-8"&amp;gt;
  &amp;lt;meta http-equiv="X-UA-Compatible" content="IE=edge"&amp;gt;
  &amp;lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&amp;gt;
  &amp;lt;title&amp;gt;Document&amp;lt;/title&amp;gt;
  &amp;lt;script src="https://code.jquery.com/jquery-3.6.0.min.js"&amp;gt;&amp;lt;/script&amp;gt;

  &amp;lt;style&amp;gt;
    body{
      font-family: Tahoma;
    }
    #virtualBackgrounds img{
      height: 50px;
    }
    .container{
      margin:auto;
      width: 800px;
      text-align:center;
    }
  &amp;lt;/style&amp;gt;

&amp;lt;/head&amp;gt;
&amp;lt;body&amp;gt;

  &amp;lt;div class="container"&amp;gt;

    &amp;lt;video id="localVideo" autoplay muted controls playsinline width="800px"&amp;gt; &amp;lt;/video&amp;gt;

    &amp;lt;div class="d-block"&amp;gt;
      &amp;lt;button id="start_publish"&amp;gt;Publish&amp;lt;/button&amp;gt;
      &amp;lt;button id="stop_publish"&amp;gt;Unpublish&amp;lt;/button&amp;gt;
      &amp;lt;p id="status"&amp;gt;Status:Offline&amp;lt;/p&amp;gt;
    &amp;lt;/div&amp;gt;


   &amp;lt;div class="col-sm-12 text-center" id="virtualBackgrounds"&amp;gt;
      &amp;lt;h3&amp;gt;Select a virtual background Image&amp;lt;/h3&amp;gt;
      &amp;lt;div class="d-flex" id="virtualbackgroundimages"&amp;gt;

        &amp;lt;img src="./images/noeffect-background.png"  id="noeffect" class="backgroundImages selected" /&amp;gt;
        &amp;lt;img src="./images/blur-background.png" id="blur" class="backgroundImages " /&amp;gt;
        &amp;lt;img src="./images/virtual-background.png" id="antMediaBackground" class="backgroundImages "/&amp;gt;
        &amp;lt;img src="./images/cloud-background.png" class="backgroundImages"/&amp;gt;

        &amp;lt;img src="./images/unsplash.jpg" class="backgroundImages"/&amp;gt;
        &amp;lt;img src="./images/huy.jpg" class="backgroundImages"/&amp;gt;
        &amp;lt;img src="./images/mokry.jpg" class="backgroundImages"/&amp;gt;
        &amp;lt;img src="./images/mitchell.jpg" class="backgroundImages"/&amp;gt;
      &amp;lt;/div&amp;gt;

      &amp;lt;div class="d-block"&amp;gt;
      &amp;lt;h3&amp;gt;Upload a Virtual Background Image&amp;lt;/h3&amp;gt;
      &amp;lt;input type="file" class="custom-file-input" id="customFile" style="height:100%;" accept=".jpg, .png, .jpeg"&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;

   &amp;lt;/div&amp;gt;

   &amp;lt;script type="module"&amp;gt;

   import {WebRTCAdaptor, VideoEffect} from "./node_modules/@antmedia/webrtc_adaptor/dist/es/index.js";

   $(".backgroundImages").click(enableVirtualBackground);

   $('input[type="file"]').change(function(e) {
        const url = URL.createObjectURL(e.target.files[0]);
        const newBackgroundImg = $("&amp;lt;img src=\"" + url + "\" class=\"backgroundImages\"/&amp;gt;");
        $("#virtualbackgroundimages").append(newBackgroundImg);
        $(".backgroundImages").click(enableVirtualBackground);
        newBackgroundImg.click();
    });

   function enableVirtualBackground() {
        var effectName;
        if ($(this).attr("id") == "blur") {
            effectName = VideoEffect.BLUR_BACKGROUND;
        }
        else if ($(this).attr("id") == "noeffect") {
            effectName = VideoEffect.NO_EFFECT;
        }
        else {
            effectName = VideoEffect.VIRTUAL_BACKGROUND;
            webRTCAdaptor.setBackgroundImage(this);
        }
        webRTCAdaptor.enableEffect(effectName).then(() =&amp;gt; {
            console.log("Effect: "+ effectName+" is enabled");
        }).catch(err =&amp;gt; {
            console.error("Effect: "+ effectName+" is not enabled. Error is " + err);
            $.notify(err.name, {
                autoHideDelay:5000,
                className:'error',
                position:'top center'
            });
        });
        // clear the previous selection before highlighting the clicked image
        $(".backgroundImages").removeClass("selected");
        $(this).addClass("selected");
    }

    const webRTCAdaptor = new WebRTCAdaptor({
      websocket_url: "wss://ant-media-server:5443/LiveApp/websocket",
      localVideoElement: document.getElementById("localVideo"),
      mediaConstraints: {
        video: true,
        audio:true
      },
      callback: (info, obj) =&amp;gt; {
        console.log("callback info: " + info);
        if (info == "publish_started") {
            console.log("publish started");
            statusInfo.innerHTML = "Status:Broadcasting"
        }
        else if (info == "publish_finished") {
            console.log("publish finished")
            statusInfo.innerHTML = "Status:Offline"
        }
        else if( info == "available_devices"){
          console.log(obj)
        }
      },
      callbackError: function (error, message) {
        console.log("error callback: " + JSON.stringify(error));
      }

    });

    const streamId = "stream" + Math.floor(Math.random()*999999);
    const startPublishButton = document.getElementById("start_publish")
    const stopPublishButton = document.getElementById("stop_publish")
    const statusInfo = document.getElementById("status");

    startPublishButton.addEventListener("click", () =&amp;gt; {
      console.log("start publish is clicked " + streamId);
      webRTCAdaptor.publish(streamId);
    });

    stopPublishButton.addEventListener("click", () =&amp;gt; {
      console.log("stop publish is clicked");
      webRTCAdaptor.stop(streamId);
    });

&amp;lt;/script&amp;gt;
&amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There is also a sample page in the default applications located at the following URL &lt;a href="https://ant-media-server:5443/%7Bapplication%7D/publish_webrtc_virtual_background.html"&gt;https://ant-media-server:5443/{application}/publish_webrtc_virtual_background.html&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;That sample page is also available on &lt;a href="https://github.com/ant-media/StreamApp/blob/master/src/main/webapp/samples/publish_webrtc_virtual_background_frame.html"&gt;GitHub&lt;/a&gt;, and you can see a &lt;a href="https://antmedia.io/webrtc-samples/webrtc-virtual-background/"&gt;live demo&lt;/a&gt; on our &lt;a href="https://antmedia.io/webrtc-samples/"&gt;samples page&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;What Next?&lt;/h2&gt;

&lt;p&gt;Explore the &lt;a href="https://antmedia.io/docs/guides/developer-sdk-and-api/sdk-integration/javascript-sdk/"&gt;JavaScript SDK&lt;/a&gt; reference guide and other popular SDKs, the &lt;a href="https://antmedia.io/docs"&gt;documentation&lt;/a&gt; or join the &lt;a href="https://github.com/orgs/ant-media/discussions"&gt;community discussions&lt;/a&gt; to learn more about the Ant Media Server.&lt;/p&gt;

&lt;p&gt;Deployment options include other &lt;a href="https://antmedia.io/#selfhosted"&gt;1-click apps&lt;/a&gt;, cloud marketplaces, and Docker, Kubernetes, or script-based installs on cloud providers such as AWS, Microsoft Azure, DigitalOcean, Linode, Google, and Alibaba.&lt;/p&gt;

&lt;p&gt;A &lt;a href="https://antmedia.io/free-trial/"&gt;30 day free trial&lt;/a&gt; is available to try out the Enterprise Edition of the product and &lt;a href="https://antmedia.io/support-packages/"&gt;support packages&lt;/a&gt; are available for both editions if you need some extra help getting started.&lt;/p&gt;


</description>
    </item>
    <item>
      <title>WebRTC Cheatsheet</title>
      <dc:creator>Tim</dc:creator>
      <pubDate>Sun, 21 May 2023 22:37:41 +0000</pubDate>
      <link>https://dev.to/antmediaserver/webrtc-cheatsheet-1aln</link>
      <guid>https://dev.to/antmediaserver/webrtc-cheatsheet-1aln</guid>
<description>&lt;h2&gt;🌐 WebRTC Basics:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;WebRTC&lt;/strong&gt; stands for Web Real-Time Communication.&lt;/li&gt;
&lt;li&gt;It is an open framework for enabling real-time communication in web browsers and mobile applications.&lt;/li&gt;
&lt;li&gt;WebRTC allows peer-to-peer audio, video, and data sharing without the need for additional plugins or software.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;📹 Video Communication:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;getUserMedia()&lt;/code&gt; API captures video from the user's camera.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MediaStream&lt;/strong&gt; represents the video and audio streams.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCPeerConnection&lt;/strong&gt; establishes a direct peer-to-peer connection for video communication.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCDataChannel&lt;/strong&gt; enables real-time data sharing alongside video communication.&lt;/li&gt;
&lt;/ul&gt;
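&lt;p&gt;As a minimal sketch of how these pieces fit together, each track of the captured &lt;code&gt;MediaStream&lt;/code&gt; is added to the &lt;code&gt;RTCPeerConnection&lt;/code&gt; individually (&lt;code&gt;attachStream&lt;/code&gt; is a hypothetical helper name; in a browser, &lt;code&gt;stream&lt;/code&gt; would come from &lt;code&gt;getUserMedia()&lt;/code&gt;):&lt;/p&gt;

```javascript
// Hypothetical helper: wire a captured MediaStream into an RTCPeerConnection.
// In a browser, `stream` comes from navigator.mediaDevices.getUserMedia(...)
// and `pc` from `new RTCPeerConnection()`.
function attachStream(pc, stream) {
  // addTrack is called once per audio/video track and returns an RTCRtpSender
  return stream.getTracks().map(function (track) {
    return pc.addTrack(track, stream);
  });
}
```

&lt;p&gt;In a page you would call it as &lt;code&gt;navigator.mediaDevices.getUserMedia({video: true, audio: true}).then(function (stream) { attachStream(pc, stream); })&lt;/code&gt;.&lt;/p&gt;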

&lt;h2&gt;🔊 Audio Communication:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;getUserMedia()&lt;/code&gt; API captures audio from the user's microphone.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MediaStream&lt;/strong&gt; represents the audio stream.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCPeerConnection&lt;/strong&gt; establishes a direct peer-to-peer connection for audio communication.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCDataChannel&lt;/strong&gt; enables real-time data sharing alongside audio communication.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;💡 Useful JavaScript APIs:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;getUserMedia()&lt;/code&gt;: Grants access to the user's camera and microphone.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCPeerConnection&lt;/strong&gt;: Handles peer-to-peer communication.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RTCDataChannel&lt;/strong&gt;: Enables real-time data sharing between peers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MediaStream&lt;/strong&gt;: Represents audio and video streams.&lt;/li&gt;
&lt;/ul&gt;
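&lt;p&gt;As a small sketch, a data channel created with &lt;code&gt;pc.createDataChannel("chat")&lt;/code&gt; starts out in the connecting state, so a hypothetical &lt;code&gt;sendWhenOpen&lt;/code&gt; helper can guard the first send:&lt;/p&gt;

```javascript
// Hypothetical helper around RTCDataChannel: a channel created with
// pc.createDataChannel("chat") starts in the "connecting" state, so a
// send must wait until it is "open".
function sendWhenOpen(channel, message) {
  if (channel.readyState === "open") {
    channel.send(message);
  } else {
    // defer the send until the channel fires its open event
    channel.onopen = function () { channel.send(message); };
  }
}
```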

&lt;h2&gt;🔒 Security and Encryption:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;WebRTC&lt;/strong&gt; uses Secure Real-Time Transport Protocol (SRTP) for encryption.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Signaling&lt;/strong&gt; is required to exchange session information and establish a connection.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Signaling servers&lt;/strong&gt; help coordinate the communication process but are not part of the WebRTC standard.&lt;/li&gt;
&lt;/ul&gt;
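&lt;p&gt;Here is a hedged sketch of the offer side of signaling; &lt;code&gt;signaler&lt;/code&gt; is a hypothetical transport (for example a thin WebSocket wrapper), since WebRTC itself does not define how session descriptions travel between peers:&lt;/p&gt;

```javascript
// Sketch of the offer side of signaling. `pc` is an RTCPeerConnection and
// `signaler` is a hypothetical transport with a send() method; WebRTC does
// not standardize this part, so any reliable channel works.
async function startCall(pc, signaler) {
  const offer = await pc.createOffer();             // describe our media in SDP
  await pc.setLocalDescription(offer);              // apply it locally
  signaler.send({ type: "offer", sdp: offer.sdp }); // deliver it to the remote peer
  return offer;
}
```

&lt;p&gt;The remote peer answers with &lt;code&gt;setRemoteDescription&lt;/code&gt; and &lt;code&gt;createAnswer&lt;/code&gt;, sent back over the same signaling channel.&lt;/p&gt;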

&lt;h2&gt;🌐 WebRTC Frameworks and Libraries:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.simplewebrtc.com/"&gt;SimpleWebRTC&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://peerjs.com/"&gt;PeerJS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://socket.io/"&gt;Socket.io&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.twilio.com/docs/video"&gt;Twilio Programmable Video&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/meetecho/janus-gateway"&gt;Janus Gateway&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;🚀 Deploying WebRTC:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Cloud hosting platforms like Firebase, AWS, or Heroku can be used to deploy WebRTC applications.&lt;/li&gt;
&lt;li&gt;Consider the server-side requirements for signaling, TURN, and STUN servers.&lt;/li&gt;
&lt;/ul&gt;
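&lt;p&gt;A typical &lt;code&gt;RTCPeerConnection&lt;/code&gt; configuration lists STUN and TURN servers; the TURN entry below is a placeholder you would replace with your own server and credentials:&lt;/p&gt;

```javascript
// Example RTCPeerConnection configuration with STUN and TURN entries.
// The STUN URL is a widely used public Google server; the TURN server,
// username, and credential are placeholders, not real values.
const rtcConfig = {
  iceServers: [
    { urls: "stun:stun.l.google.com:19302" },
    {
      urls: "turn:turn.example.com:3478", // placeholder TURN server
      username: "demo-user",              // placeholder credentials
      credential: "demo-pass"
    }
  ]
};
// In a browser: const pc = new RTCPeerConnection(rtcConfig);
```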

&lt;h2&gt;📚 Resources:&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;WebRTC API documentation: &lt;a href="https://webrtc.org/"&gt;https://webrtc.org/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;WebRTC samples and demos: &lt;a href="https://antmedia.io/webrtc-samples/"&gt;https://antmedia.io/webrtc-samples/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;WebRTC Live Streaming Software: &lt;a href="https://github.com/ant-media"&gt;https://github.com/ant-media&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>realtimecommunication</category>
      <category>videochat</category>
      <category>webrtc</category>
      <category>peertopeer</category>
    </item>
    <item>
      <title>3 Great Canvas Manipulations in WebRTC Live Streaming</title>
      <dc:creator>Tim</dc:creator>
      <pubDate>Fri, 19 May 2023 10:55:57 +0000</pubDate>
      <link>https://dev.to/antmediaserver/3-great-canvas-manipulations-in-webrtc-live-streaming-2gcf</link>
      <guid>https://dev.to/antmediaserver/3-great-canvas-manipulations-in-webrtc-live-streaming-2gcf</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This blog post is originally published at &lt;a href="https://antmedia.io/how-to-merge-live-stream-and-canvas-in-webrtc-easily/"&gt;antmedia.io&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;How to manipulate live stream content in WebRTC is one of the most frequently asked questions at Ant Media.&lt;/p&gt;

&lt;p&gt;This can be achieved by using the HTML canvas element as a live stream source in WebRTC. &lt;/p&gt;

&lt;p&gt;Here are some common questions live streamers tend to ask:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How can I add my logo to the live stream?&lt;/li&gt;
&lt;li&gt;How can I add a watermark to my live stream?&lt;/li&gt;
&lt;li&gt;My live stream doesn’t have video; can I display a static image instead?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this blog post, we will show you 3 use cases of using Canvas as a live stream source:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Adding a logo to live stream&lt;/li&gt;
&lt;li&gt;Canvas with audio&lt;/li&gt;
&lt;li&gt;Putting a background image on a live stream&lt;/li&gt;
&lt;/ol&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This tutorial uses Ant Media Server, but the technique can be applied to other WebRTC stream sources.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Adding a Logo to Live Stream&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vOz-POam--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sau3dutrwjp5gsiy9ye9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vOz-POam--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sau3dutrwjp5gsiy9ye9.png" alt="adding a logo to your live stream" width="800" height="534"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will use the sample page that comes with Ant Media Server to publish a live stream. &lt;/p&gt;

&lt;p&gt;You can access it at &lt;code&gt;https://ANT-MEDIA-SERVER:PORT/APPLICATION/&lt;/code&gt;; on the server, it is located at &lt;code&gt;/usr/local/antmedia/webapps/LiveApp/index.html&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The idea is to use the canvas as a stream source. We will draw two things on the canvas. The first one is the content of the video component and the second one is the logo image. &lt;/p&gt;

&lt;p&gt;The logo image in our sample is a PNG file.&lt;/p&gt;

&lt;p&gt;Let's start by adding a canvas component above the video component:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;div class="col-sm-12 form-group"&amp;gt;
     &amp;lt;canvas id="canvas" width="200" height="150"&amp;gt;&amp;lt;/canvas&amp;gt;
     &amp;lt;p&amp;gt;
        &amp;lt;video id="localVideo"  autoplay muted controls playsinline&amp;gt;&amp;lt;/video&amp;gt;
    &amp;lt;/p&amp;gt;
 &amp;lt;/div&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then use this canvas component for merging the logo image and video frame. This is what our method will do:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create an Image instance from the PNG file&lt;/li&gt;
&lt;li&gt;Create a draw method where you merge the video frame and logo image. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We'll call this method every 25ms with the help of JavaScript's &lt;code&gt;setInterval&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Then, capture the canvas content at 25fps using the &lt;code&gt;captureStream&lt;/code&gt; method:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var canvas = document.getElementById('canvas');

var vid = document.getElementById('localVideo');

  var image=new Image();

  image.src="antmedia.png";

    function draw() {

    if (canvas.getContext) {

      var ctx = canvas.getContext('2d');

      ctx.drawImage(vid, 0, 0, 200, 150);

      ctx.drawImage(image,50, 10, 100, 30)
    }

}

//update canvas for every 25ms

setInterval(function() { draw(); }, 25);

//capture stream from canvas

var localStream = canvas.captureStream(25);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, initialize Ant Media Server's JavaScript SDK WebRTC adaptor with the &lt;code&gt;localStream&lt;/code&gt; created from the canvas.&lt;/p&gt;

&lt;p&gt;We need to pass the &lt;code&gt;localStream&lt;/code&gt; to the WebRTCAdaptor:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;navigator.mediaDevices.getUserMedia({video: true, audio:true}).then(function (stream) {

        var video = document.querySelector('video#localVideo');

        video.srcObject = stream;

        video.onloadedmetadata = function(e) {

        video.play();

      };

      //initialize the webRTCAdaptor with the localStream created.

      //initWebRTCAdaptor method is implemented below

      initWebRTCAdaptor(localStream);

  });

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When publishing starts, the canvas content is streamed and you can view the live stream with the logo overlaid.&lt;/p&gt;
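&lt;p&gt;The &lt;code&gt;initWebRTCAdaptor&lt;/code&gt; method itself is not shown in this excerpt. Here is a minimal sketch of what it might look like; the option names (&lt;code&gt;websocket_url&lt;/code&gt;, &lt;code&gt;localVideoId&lt;/code&gt;, &lt;code&gt;localStream&lt;/code&gt;) and the placeholder URL are assumptions based on Ant Media's JavaScript SDK, so verify them against the SDK version you use:&lt;/p&gt;

```javascript
// Sketch of initWebRTCAdaptor(localStream). The option names and the
// placeholder URL are assumptions -- check your Ant Media JavaScript SDK
// version before relying on them.
function initWebRTCAdaptor(localStream) {
  var config = {
    websocket_url: "wss://your-ams-host:5443/WebRTCAppEE/websocket", // hypothetical host
    localVideoId: "localVideo",
    localStream: localStream, // publish the canvas stream, not the raw camera
    callback: function (info) {
      if (info === "publish_started") {
        console.log("canvas stream is live");
      }
    }
  };
  // In the browser this would be: return new WebRTCAdaptor(config);
  // Returning the plain config object keeps the sketch testable outside a browser.
  return config;
}
```

&lt;p&gt;In a real page you would then typically call &lt;code&gt;webRTCAdaptor.publish(streamId)&lt;/code&gt; once the adaptor reports it is ready.&lt;/p&gt;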

&lt;h2&gt;
  
  
  Canvas with Audio
&lt;/h2&gt;

&lt;p&gt;In this sample, we will show how to publish only the canvas together with audio. There will be no camera video content. &lt;/p&gt;

&lt;p&gt;This might be useful when you only want to publish audio, e.g. for a radio station website.&lt;/p&gt;

&lt;p&gt;In order to achieve this, we will skip drawing the video content and just draw a rectangle on the canvas instead of a logo image file. &lt;/p&gt;

&lt;p&gt;We will use a drawing to show that you can do more than just place a logo; you can also write text on the canvas.&lt;/p&gt;

&lt;p&gt;For this example, we need to change the content of the draw method. Instead of drawing an image, we draw a rectangle:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function draw() {
   if (canvas.getContext) {
   var ctx = canvas.getContext('2d');
   ctx.fillStyle = 'rgba(0, 0, 200, 0.5)';
   ctx.fillRect(30, 30, 100, 50);
   }
 }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
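&lt;p&gt;To illustrate the point about writing text, here is a minimal variant of the &lt;code&gt;draw&lt;/code&gt; method that renders a caption instead of a rectangle, using the standard Canvas 2D &lt;code&gt;font&lt;/code&gt; and &lt;code&gt;fillText&lt;/code&gt; calls. The canvas is passed in explicitly, and the caption text and coordinates are arbitrary placeholders:&lt;/p&gt;

```javascript
// Variant of draw() that writes a caption on the canvas instead of a
// rectangle. ctx.font and ctx.fillText are standard Canvas 2D API calls;
// the caption string and coordinates are arbitrary placeholders.
function drawText(canvas) {
  if (canvas.getContext) {
    var ctx = canvas.getContext('2d');
    ctx.fillStyle = 'rgba(0, 0, 200, 0.5)';
    ctx.font = '20px sans-serif';
    ctx.fillText('On air', 30, 60);
  }
}
```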



&lt;p&gt;Then we need to disable video and add only audio to &lt;code&gt;localStream&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;//get audio with getUserMedia

navigator.mediaDevices.getUserMedia({video: false, audio:true}).then(function (stream) {

  //add audio track to the localstream which is captured from canvas

  localStream.addTrack(stream.getAudioTracks()[0]);

  //initialize the webRTCAdaptor with the localStream created.

  //initWebRTCAdaptor method is implemented below

  initWebRTCAdaptor(localStream);

});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When this stream is played back, there will be a rectangle and audio. No video.&lt;/p&gt;

&lt;h2&gt;
  
  
  Putting a Background Image on a Live Stream
&lt;/h2&gt;

&lt;p&gt;This use case is also known as the green-screen effect. You can live stream using a green screen and replace that green screen with something else. &lt;/p&gt;

&lt;p&gt;For example, you can replace that green screen with a background image of your choice. Let’s see how we can do this.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--B24lJ0Fq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k30nm0k56u1z6kue7f2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B24lJ0Fq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k30nm0k56u1z6kue7f2b.png" alt="Replace the green colour with any background you like" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Start by adding two more canvas components, three in total. The first canvas is for the video frame, the second one is for the background image that we will use in our live stream, and the third one is for merging these canvases together:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;canvas id="canvas" width="200" height="150"&amp;gt;&amp;lt;/canvas&amp;gt;

&amp;lt;canvas id="canvas2" width="200" height="150"&amp;gt;&amp;lt;/canvas&amp;gt;

&amp;lt;canvas id="canvas3" width="200" height="150"&amp;gt;&amp;lt;/canvas&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we find the green pixels and replace them with the background image’s pixel colours:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var canvas = document.getElementById('canvas');

var canvas2 = document.getElementById('canvas2');

var canvas3 = document.getElementById('canvas3');

var vid = document.getElementById('localVideo');

var image=new Image();

image.src="antmedia.png";

  function draw() {

  if (canvas.getContext) {

    var ctx = canvas.getContext('2d');

    var ctx2 = canvas2.getContext('2d');

    var ctx3 = canvas3.getContext('2d');

    ctx2.drawImage(image,0, 0, 200, 150)

    let frame2 =  ctx2.getImageData(0, 0, 200, 150);

    ctx.drawImage(vid, 0, 0, 200, 150);

    let frame = ctx.getImageData(0, 0, 200, 150);

    let l = frame.data.length / 4;

    for (let i = 0; i &amp;lt; l; i++) {

      let r = frame.data[i * 4 + 0];

      let g = frame.data[i * 4 + 1];

      let b = frame.data[i * 4 + 2];

      if (g &amp;gt; 100 &amp;amp;&amp;amp; r &amp;gt; 100 &amp;amp;&amp;amp; b &amp;lt; 43){

        frame.data[i * 4 + 0] = frame2.data[i * 4 + 0]

        frame.data[i * 4 + 1] = frame2.data[i * 4 + 1]

        frame.data[i * 4 + 2] = frame2.data[i * 4 + 2]

        }

      }

    ctx3.putImageData(frame, 0, 0);

  }

}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
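&lt;p&gt;The keying loop above can also be factored into a pure function over raw RGBA buffers, which makes the threshold easy to test outside the browser. This is a sketch that uses the same condition as the &lt;code&gt;draw&lt;/code&gt; method:&lt;/p&gt;

```javascript
// Replace chroma-key pixels in an RGBA frame with the corresponding
// background pixels. Mirrors the condition in draw(): a pixel is keyed
// out when g > 100, r > 100 and b < 43.
function chromaKey(frameData, backgroundData) {
  var out = Uint8ClampedArray.from(frameData);
  for (var i = 0; i < out.length; i += 4) {
    var r = out[i], g = out[i + 1], b = out[i + 2];
    if (g > 100 && r > 100 && b < 43) {
      out[i] = backgroundData[i];
      out[i + 1] = backgroundData[i + 1];
      out[i + 2] = backgroundData[i + 2];
      // the alpha channel (out[i + 3]) is left untouched, as in draw()
    }
  }
  return out;
}

// Example: the first pixel matches the key colour, the second does not.
var frame = Uint8ClampedArray.from([150, 150, 10, 255, 10, 10, 10, 255]);
var background = Uint8ClampedArray.from([1, 2, 3, 255, 4, 5, 6, 255]);
var merged = chromaKey(frame, background);
// → [1, 2, 3, 255, 10, 10, 10, 255]
```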



&lt;p&gt;Finally, we use canvas3 to capture the stream:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;//update canvas for every 40ms

setInterval(function() { draw(); }, 25);

//capture stream from canvas

var localStream = canvas3.captureStream(25);

//get audio with getUserMedia

navigator.mediaDevices.getUserMedia({video: true, audio:true}).then(function (stream) {

var video = document.querySelector('video#localVideo');

      video.srcObject = stream;

      video.onloadedmetadata = function(e) {

        video.play();

      };

//initialize the webRTCAdaptor with the localStream created.

//initWebRTCAdaptor method is implemented below

initWebRTCAdaptor(localStream);

});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That completes the tutorial. I hope you enjoyed the post. You might also want to check out &lt;a href="https://antmedia.io/how-to-embed-webrtc-live-streaming-into-your-website/"&gt;How to Embed WebRTC Live Streaming into Your Website in 2 Easy Ways?&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This blog post was originally published at &lt;a href="https://antmedia.io/how-to-merge-live-stream-and-canvas-in-webrtc-easily/"&gt;antmedia.io&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
    </item>
    <item>
      <title>How to Create a Kubernetes Cluster on DigitalOcean</title>
      <dc:creator>Tim</dc:creator>
      <pubDate>Thu, 27 Apr 2023 10:07:17 +0000</pubDate>
      <link>https://dev.to/antmediaserver/how-to-create-a-kubernetes-cluster-on-digitalocean-dno</link>
      <guid>https://dev.to/antmediaserver/how-to-create-a-kubernetes-cluster-on-digitalocean-dno</guid>
      <description>&lt;h2&gt;
  
  
  Kubernetes Cluster Creation on DigitalOcean
&lt;/h2&gt;

&lt;p&gt;To create a Kubernetes cluster:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;From the Create menu in the control panel, click Kubernetes.&lt;/li&gt;
&lt;li&gt;Select a Kubernetes version. The latest version is selected by default and is the best choice if you have no specific need for an earlier version.&lt;/li&gt;
&lt;li&gt;Choose a data center region.&lt;/li&gt;
&lt;li&gt;Customize the default node pool, choose the node pool names, and add additional node pools.&lt;/li&gt;
&lt;li&gt;Name the cluster, select the project you want the cluster to belong to, and optionally add a tag. Any tags you choose will be applied to the cluster and its worker nodes.&lt;/li&gt;
&lt;li&gt;Click Create Cluster. Provisioning the cluster takes several minutes.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jJ8A9zwh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/odk2d9pb0ro6lx6t2g4l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jJ8A9zwh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/odk2d9pb0ro6lx6t2g4l.png" alt="Creating a Kubernetes cluster on DigitalOcean" width="800" height="717"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Verify the Cluster Was Created
&lt;/h2&gt;

&lt;p&gt;Download the cluster configuration file by clicking Actions, then Download the Config from the cluster home page.&lt;/p&gt;

&lt;p&gt;Run the following kubectl command to check the nodes:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;kubectl --kubeconfig={CONFIG FILE PATH} get nodes&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You should get nodes as in the following image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0I2_m2Zw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f1v3ht0zwsfs83wrdfhb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0I2_m2Zw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f1v3ht0zwsfs83wrdfhb.png" alt="List of Nodes returned from kubectl command" width="800" height="84"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploying An Application on Kubernetes
&lt;/h2&gt;

&lt;p&gt;At this point you have a running Kubernetes cluster, and it is time to use it! &lt;/p&gt;

&lt;p&gt;Now you can deploy an application on the cluster. As a suggestion, you can use Ant Media Server, an ultra-low latency WebRTC live streaming engine. &lt;/p&gt;

&lt;p&gt;Simply follow the documentation &lt;a href="https://antmedia.io/docs/guides/clustering-and-scaling/kubernetes/deploy-ams-on-kubernetes/"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>digitalocean</category>
      <category>cloud</category>
      <category>docker</category>
    </item>
    <item>
      <title>Build a custom live streaming engine with Ant Media Server plugins</title>
      <dc:creator>Tim</dc:creator>
      <pubDate>Mon, 24 Apr 2023 13:47:40 +0000</pubDate>
      <link>https://dev.to/antmediaserver/build-a-custom-live-streaming-engine-with-ant-media-server-plugins-3056</link>
      <guid>https://dev.to/antmediaserver/build-a-custom-live-streaming-engine-with-ant-media-server-plugins-3056</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OKXlX-Xt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lzqkymd1ebb3vrub5f63.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OKXlX-Xt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lzqkymd1ebb3vrub5f63.png" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;br&gt;
When content management solutions introduced a plugin ecosystem, it revolutionised how we can customise and extend websites. &lt;/p&gt;

&lt;p&gt;The ability to easily install a plugin that provides some specific functionality for a specific use case felt like a dream then, and it still does. &lt;/p&gt;

&lt;p&gt;WordPress has taken it to another level, with enough plugins to totally transform a WordPress blog into a fully-fledged social media network, a video hosting platform like YouTube, or a highly scalable e-commerce store. &lt;/p&gt;

&lt;p&gt;With the rise of live streaming, the large players in the space, such as Wowza, Agora and Castr, offer services that allow influencers, live auction houses and live shopping services, to name just a few use cases, to integrate live streaming into their applications. &lt;/p&gt;

&lt;p&gt;But you are still restricted to what the service has to offer. For example, what about integrating ad insertion, AI, or a unique effect you want to apply to a live stream? It's not possible unless the service offers an API or another way of interacting with the live stream. At least, it wasn't until now! &lt;/p&gt;

&lt;h2&gt;
  
  
  Ant Media Server Streaming Engine
&lt;/h2&gt;

&lt;p&gt;If you haven't heard of &lt;a href="https://antmedia.io"&gt;Ant Media Server&lt;/a&gt;, let me briefly explain it. &lt;/p&gt;

&lt;p&gt;Ant Media Server is a live streaming engine built in Java that can be hosted on your own servers to provide a fully featured live streaming service. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Features include:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Multiple ingest formats like SRT, RTMP, WebRTC&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;WebRTC ~0.5 second latency&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Restreaming to multiple endpoints e.g. social networks&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Live stream recording locally or directly to Amazon S3&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Multiple SDKs: Android, iOS, Flutter, React, React Native&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scaling and clustering to 500K+ viewers&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The software supports all of the most popular use cases you can think of and has an extremely reasonable licensing package. There is also a 1-click installation in the most popular cloud marketplaces such as Amazon, DigitalOcean and Vultr. &lt;/p&gt;

&lt;h2&gt;
  
  
  Ant Media Server Plugins, Applications and Marketplace
&lt;/h2&gt;

&lt;p&gt;Ant Media has now introduced a full-blown plugin architecture that enables customisation of the system. &lt;/p&gt;

&lt;p&gt;They have launched a marketplace and there are already a few unique plugins that can be downloaded for free to extend the default offering. &lt;/p&gt;

&lt;p&gt;A few plugins/applications from the marketplace include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Portmeet&lt;/strong&gt; - a Zoom-like application that enables a conferencing solution&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Filter Plugin&lt;/strong&gt; - add filters to the ongoing stream&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;OBS Castify&lt;/strong&gt; - WebRTC for OBS (20ms P2P latency)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Stamp Plugin&lt;/strong&gt; - add overlays to the video stream&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;DuetMaster&lt;/strong&gt; - play a piano duet in real time&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are a few plugins and applications that are currently available in the marketplace ready to download and extend your Ant Media Server instance. &lt;/p&gt;

&lt;h2&gt;
  
  
  How to get started building plugins and applications
&lt;/h2&gt;

&lt;p&gt;While the documentation for plugins and applications is not yet the best, there are a few resources available to get started. &lt;/p&gt;

&lt;p&gt;There is a blog post, the &lt;a href="https://antmedia.io/plugin-development-guide/"&gt;plugin development guide&lt;/a&gt;, and also a YouTube video, &lt;a href="https://www.youtube.com/watch?v=BYkUCPrTrL4"&gt;Introducing the Plugin Mechanism&lt;/a&gt;, but not much else yet. &lt;/p&gt;

&lt;h3&gt;
  
  
  What's the difference between plugins and applications?
&lt;/h3&gt;

&lt;p&gt;A plugin interacts with the live stream and performs some pre-processing or captures some kind of data. &lt;/p&gt;

&lt;p&gt;An application is a way to build a custom user interface to interact with the live streaming engine. &lt;/p&gt;

&lt;p&gt;An example is the &lt;a href="https://antmedia.io/duetmaster-app-2/"&gt;DuetMaster&lt;/a&gt; application in the marketplace. &lt;/p&gt;

&lt;p&gt;It's a custom interface that uses the live streaming engine to process the data sent to and received from the Ant Media Server instance, enabling you to play a virtual piano with someone. &lt;/p&gt;

&lt;p&gt;I think the &lt;a href="https://antmedia.io/marketplace/"&gt;marketplace&lt;/a&gt; and the flexibility of Ant Media Server are going to become more and more essential for the ever-growing demand for live streaming. &lt;/p&gt;

</description>
      <category>webrtc</category>
      <category>livestreaming</category>
      <category>hls</category>
      <category>plugin</category>
    </item>
  </channel>
</rss>
