<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dylan Jhaveri</title>
    <description>The latest articles on DEV Community by Dylan Jhaveri (@dylanjha).</description>
    <link>https://dev.to/dylanjha</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F262320%2F765e5f9f-9675-4987-b205-fc69c3557bbb.jpg</url>
      <title>DEV Community: Dylan Jhaveri</title>
      <link>https://dev.to/dylanjha</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dylanjha"/>
    <language>en</language>
    <item>
      <title>Mux is the video API for the JAMstack</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Wed, 08 Apr 2020 16:28:01 +0000</pubDate>
      <link>https://dev.to/mux/mux-is-the-video-api-for-the-jamstack-3po1</link>
      <guid>https://dev.to/mux/mux-is-the-video-api-for-the-jamstack-3po1</guid>
      <description>&lt;h1&gt;
  
  
  What is the JAMstack?
&lt;/h1&gt;

&lt;p&gt;The JAMstack is a term popularized in the last year, largely by the React community and companies like &lt;a href="https://www.netlify.com/" rel="noopener noreferrer"&gt;Netlify&lt;/a&gt; and &lt;a href="https://zeit.co/" rel="noopener noreferrer"&gt;Zeit&lt;/a&gt;. Specifically, JAMstack stands for "JavaScript", "APIs" and "Markup". These terms don't exactly describe what the JAMstack is in a clear way, but the name itself has a nice ring to it, so it seems to have stuck.&lt;/p&gt;

&lt;p&gt;Here is a breakdown of all the pieces for a "JAMstack" application and what some of the popular options are. For a more exhaustive list you might check out &lt;a href="https://github.com/automata/awesome-jamstack" rel="noopener noreferrer"&gt;awesome-jamstack on Github&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Static content frameworks
&lt;/h2&gt;

&lt;p&gt;This covers the "JavaScript" and "Markup" part of the stack.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://nextjs.org/" rel="noopener noreferrer"&gt;Next.js&lt;/a&gt;: Open source, write everything with React and the framework gives you automatic code splitting and a server-side rendered web application.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.gatsbyjs.com/" rel="noopener noreferrer"&gt;Gatsby&lt;/a&gt;: Also open source and you write everything with React components. The Gatsby framework handles code splitting and lazy loading resources. Gatsby also has a concept of “sources” where you can write GraphQL queries to pull in data from 3rd party sources via a plugin.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.11ty.dev/" rel="noopener noreferrer"&gt;11ty&lt;/a&gt;: A static site generator that works with all kinds of templates: markdown, liquid templates, nunjucks, handlebars, mustache, ejs, haml, pug and Javascript template literals&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Deploy
&lt;/h2&gt;

&lt;p&gt;These are platforms that can host your statically built application. With common JAMstack frameworks you end up with static files that can be hosted by a static file server and delivered over a CDN.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://zeit.co/" rel="noopener noreferrer"&gt;Zeit&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.netlify.com/" rel="noopener noreferrer"&gt;Netlify&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://firebase.google.com/docs/hosting" rel="noopener noreferrer"&gt;Firebase hosting&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://surge.sh/" rel="noopener noreferrer"&gt;Surge.sh&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://render.com/" rel="noopener noreferrer"&gt;Render&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/s3/" rel="noopener noreferrer"&gt;AWS S3&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Cloud Functions (“Serverless”)
&lt;/h2&gt;

&lt;p&gt;All of these services, in one way or another, allow you to write code in JavaScript that handles an API request and returns a response. This, along with other 3rd-party APIs, is the "API" part of the stack. The serverless part is that you don’t have to worry about the details of how or where that code gets run. These platforms will handle the server configuration and the deployment of your API endpoints as “cloud functions” or “lambdas”. In your client-side application, you can make requests to these functions the same way you would make requests to API endpoints deployed on your own traditional server.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/lambda/" rel="noopener noreferrer"&gt;AWS Lambda&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://firebase.google.com/docs/functions/" rel="noopener noreferrer"&gt;Firebase Cloud Functions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://workers.cloudflare.com/" rel="noopener noreferrer"&gt;Cloudflare Workers&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://nextjs.org/docs/api-routes/introduction" rel="noopener noreferrer"&gt;Zeit API Routes&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.netlify.com/functions/overview/" rel="noopener noreferrer"&gt;Netlify Functions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
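&lt;p&gt;As a concrete example, a cloud function on most of these platforms is just an exported request handler. Here is a minimal sketch in the Next.js/Zeit API-route style (the endpoint path and response body are made up for illustration):&lt;/p&gt;

```javascript
// A minimal cloud function sketch in the Next.js / Zeit API-route style.
// Deployed, a file like this would become an endpoint such as /api/hello
// (a hypothetical path, used only for illustration).
function handler(req, res) {
  // The platform hands you Node-style request and response objects.
  res.statusCode = 200;
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ message: 'Hello from a cloud function' }));
}

module.exports = handler;
```

&lt;p&gt;Your client-side code would then call this endpoint with a normal &lt;code&gt;fetch('/api/hello')&lt;/code&gt;, exactly as it would call any other API.&lt;/p&gt;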

&lt;h2&gt;
  
  
  Headless CMS
&lt;/h2&gt;

&lt;p&gt;A “headless” CMS is a CMS that gives you and your team an interface to log in, edit content, add new content, upload assets, and then “publish” the data that makes it into your website or application.&lt;/p&gt;

&lt;p&gt;There are many headless CMSes. We are a little biased, so these are the ones that work with Mux and the ones we have used. Look around for what works for you. And if you have one that you want to use with Mux, let us know and we can build an integration.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.sanity.io/" rel="noopener noreferrer"&gt;Sanity&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.contentful.com/" rel="noopener noreferrer"&gt;Contentful&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.datocms.com/" rel="noopener noreferrer"&gt;Dato&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.cosmicjs.com/" rel="noopener noreferrer"&gt;Cosmic JS&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Authentication (advanced)
&lt;/h2&gt;

&lt;p&gt;If you’re building a static marketing site, you probably will not need to deal with authentication. However, for a more advanced application you will need to have users log in, reset passwords and handle all the other pieces of authentication.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://auth0.com/" rel="noopener noreferrer"&gt;Auth0&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://firebase.google.com/docs/auth" rel="noopener noreferrer"&gt;Firebase auth&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.netlify.com/visitor-access/identity/" rel="noopener noreferrer"&gt;Netlify Identity&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Database (advanced)
&lt;/h2&gt;

&lt;p&gt;If you are authenticating users and dealing with logged in sessions, you probably need a database. These are commonly used for JAMstack applications.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://firebase.google.com/" rel="noopener noreferrer"&gt;Firebase&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://fauna.com/" rel="noopener noreferrer"&gt;FaunaDB&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  How did we get here?
&lt;/h1&gt;

&lt;p&gt;Before these tools gained popularity the answer to “What stack should I use for my marketing site?” might have been “use Rails” and that is a clear answer. But now if someone says “use the JAMstack” well, that is a complicated answer. It’s a little misleading to call the “JAMstack” a specific stack, because as you can see from above, even if you decided to use the JAMstack, you still have a lot of choices to make.&lt;/p&gt;

&lt;p&gt;Before the JAMstack was popularized, we had a long history of static site generators. You may remember &lt;a href="https://jekyllrb.com/" rel="noopener noreferrer"&gt;Jekyll&lt;/a&gt; or &lt;a href="https://middlemanapp.com/" rel="noopener noreferrer"&gt;Middleman&lt;/a&gt; from the Ruby community. These tools allowed you to write Markdown, Liquid or Ruby’s ERB templates and generate a static site that you could host somewhere like S3. These tools are &lt;em&gt;great&lt;/em&gt; and they are still widely used.&lt;/p&gt;

&lt;p&gt;These static site generators were great for developers that wanted to make something like a blog or a simple marketing website. Someone non-technical might reach for a tool like Wordpress or Squarespace, whereas a hacker would turn to a static site generator.&lt;/p&gt;

&lt;p&gt;For more advanced applications that went beyond statically rendered HTML, we had to switch gears away from static site generators and into a web framework like Rails.&lt;/p&gt;

&lt;p&gt;Then advanced frontend frameworks for building interactive single-page applications became popular: Angular, Ember and React. Suddenly, frontend developers had all these tools and got comfortable writing React code for their applications. But for static marketing sites we couldn’t write React or Angular code, because we still needed static HTML for SEO purposes and fast initial load times. Developers were stuck in a world where they wrote what they were comfortable with for the application frontend, but then had to switch back to ad hoc, cobbled-together jQuery functions for the marketing site.&lt;/p&gt;

&lt;p&gt;The biggest feature that made the JAMstack popular is that you get the best of both worlds: server-side rendered HTML &lt;em&gt;plus&lt;/em&gt; interactive React components that you can do whatever you want with. This is the big innovation and the first “oh wow” moment I had using both Next.js and Gatsby. You write normal React like you’re used to, run the build process and then all of a sudden you end up with static HTML returned by the server and all your interactive React code works as you would expect.&lt;/p&gt;

&lt;h1&gt;
  
  
  Video for the JAMstack
&lt;/h1&gt;

&lt;p&gt;Mux is the video API for the JAMstack. The philosophy behind Mux and how we approach video fits in neatly with the JAMstack philosophy. Mux will act as your video infrastructure by handling the storage, hosting and delivery of your video without getting in the way or being opinionated about the presentation.&lt;/p&gt;

&lt;p&gt;In fact, Mux does not even give you a video player. You have to bring your own player to the party. The entire “frontend” of the video experience is up to you; Mux is focused on handling the backend, the “serverless” part of your video stack. Think of Mux as the headless video platform. You control every bit of the user experience while Mux does the heavy lifting behind the scenes.&lt;/p&gt;

&lt;h1&gt;
  
  
  JAMstack at Mux
&lt;/h1&gt;

&lt;p&gt;In addition to providing APIs that you can use for your JAMstack website, we also use the JAMstack ourselves at Mux to power our marketing site (mux.com) and the Mux blog.&lt;/p&gt;

&lt;p&gt;A couple of months ago we finished the process of moving the Mux Blog to the JAMstack. Before this project, the Mux blog was hosted and deployed separately from mux.com. The blog was powered by an old version of Ghost, using the default Casper theme. Our marketing site is a Gatsby site that uses gatsby-source-filesystem to create some pages from markdown and gatsby-source-airtable to pull in some data from Airtable.&lt;/p&gt;

&lt;p&gt;The main issue we wanted to address with our existing blog was that, since we were using a Ghost theme, the design of the blog was completely different from the design of the rest of our marketing website. It was also an entirely different application with a different structure, hosting and deploy process.&lt;/p&gt;

&lt;p&gt;As a result, visitors that landed on a blog post didn’t have an easy way to get back to the main marketing site and since the look and feel didn’t exactly line up, the experience was too disconnected. We decided that we wanted to move everything to a headless CMS so that we could make the blog part of our existing Gatsby marketing site for consistency.&lt;/p&gt;

&lt;h1&gt;
  
  
  Migrating to a headless CMS
&lt;/h1&gt;

&lt;p&gt;There are pre-built Mux integrations for &lt;a href="https://www.sanity.io/" rel="noopener noreferrer"&gt;Sanity&lt;/a&gt;, &lt;a href="https://www.contentful.com/" rel="noopener noreferrer"&gt;Contentful&lt;/a&gt;, and &lt;a href="https://www.cosmicjs.com/" rel="noopener noreferrer"&gt;Cosmic&lt;/a&gt;. All of these options allow you to bring your own Mux account. Alternatively, &lt;a href="https://www.datocms.com/" rel="noopener noreferrer"&gt;Dato&lt;/a&gt; is a headless CMS that offers native video built into the product that is &lt;a href="https://www.datocms.com/blog/why-we-chose-mux-for-datocms" rel="noopener noreferrer"&gt;powered by Mux&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;We ended up choosing Sanity as our headless CMS. We loved that Sanity felt like an open-ended developer product that could grow with our needs past just the blog today. Calling Sanity a headless CMS sells it short: it’s really more akin to a structured, real-time database. The CMS part is all open source, and you control how you want things to look and work. The way to think about it is that Sanity provides a real-time database along with some low-level primitives to define your data model; from there, you build your own CMS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fh2vy1p9mo4kzbvwyjn4f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fh2vy1p9mo4kzbvwyjn4f.png" alt="Mux Sanity CMS editor"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As part of this project, we wanted to set ourselves up with a headless CMS that could be used beyond just the blog: one that could also create a variety of pages on mux.com and allow us to move existing content like the &lt;a href="https://mux.com/video-glossary/" rel="noopener noreferrer"&gt;video glossary&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For a more technical in-depth read about how we did this, check out this Sanity Guide we wrote &lt;a href="https://www.sanity.io/guides/how-to-migrate-your-html-blog-content-from-ghost" rel="noopener noreferrer"&gt;How to migrate your HTML blog-content from Ghost&lt;/a&gt; and the blog post &lt;a href="https://www.sanity.io/blog/moving-the-mux-blog-to-the-jamstack" rel="noopener noreferrer"&gt;Moving the Mux blog to the JAMstack&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>gatsby</category>
      <category>serverless</category>
    </item>
    <item>
      <title>How to Host Your Own Online Conference</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Tue, 03 Mar 2020 00:01:30 +0000</pubDate>
      <link>https://dev.to/mux/how-to-host-your-own-online-conference-dk0</link>
      <guid>https://dev.to/mux/how-to-host-your-own-online-conference-dk0</guid>
      <description>&lt;p&gt;Online conferences seem to have gained in popularity the past year. We’re seeing more and more folks reach out with questions around best practices and how to pull off their own remote conference. Sometimes, people will announce a conference, get thousands of sign ups and then one or two weeks before it’s set to go live they will be figuring out how to do it; how much different from a normal video conference call could it be? For our purposes let’s say you want to build a custom experience and host an online conference on your own website.&lt;/p&gt;

&lt;p&gt;Use this guide as your playbook. This is what we will cover:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You have multiple presenters that will be broadcasting live from different locations.&lt;/li&gt;
&lt;li&gt;Your team has no video expertise at all, but you do have a technical team that is capable of building a functioning web application.&lt;/li&gt;
&lt;li&gt;The experience you want to provide to your audience is something custom that you control. Your brand is important for you and you want to control the look, feel and user experience of the conference.&lt;/li&gt;
&lt;li&gt;To broaden the reach of your conference, you simultaneously want to broadcast the video feed out to social channels like YouTube Live, Facebook Live and Periscope.&lt;/li&gt;
&lt;li&gt;If possible, you would really like to have a branded overlay with your logo on the video.&lt;/li&gt;
&lt;li&gt;In addition to streaming live, you want to record the broadcast so that people who did not attend live are able to view the recordings on-demand as soon as each session is over.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ctpGo_fv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/zubcb6ssj18r9p5vg7tf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ctpGo_fv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/zubcb6ssj18r9p5vg7tf.png" alt="online conference diagram"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The basic structure of your setup is going to be a live conversation that is broadcast to a larger group of live viewers. The live conversation could be something like one person presenting with a screen share, or one person interviewing someone else, or a panel discussion among a group of experts.&lt;/p&gt;

&lt;p&gt;A really simple way to do this live conversation is to use Zoom. Most people are familiar with Zoom. It is one of the most stable, reliable and high-quality pieces of meeting software around, and it runs natively on your desktop. What’s really cool about Zoom is that if you enable live streaming for meetings, you can set up an RTMP output from your Zoom call to any arbitrary RTMP endpoint (this is where Mux comes in).&lt;/p&gt;

&lt;p&gt;Adding Mux in the middle is how you can broadcast your Zoom call to an audience of thousands on your own website. The live audience does not have to download Zoom, they do not interact with Zoom at all. All they do is see a video player that you make on your website.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--swhm0C-2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/30xj82zriibyz5h1a6p5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--swhm0C-2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/30xj82zriibyz5h1a6p5.png" alt="Live conference with Mux"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Let’s break down the steps and API calls:
&lt;/h2&gt;

&lt;p&gt;1) Make sure in Zoom settings you have allowed meetings to be live streamed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zOX-c9jh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/58jcspgx91w4s4632fu2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zOX-c9jh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/58jcspgx91w4s4632fu2.png" alt="Zoom allow live"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2) &lt;a href="https://docs.mux.com/reference#create-a-live-stream"&gt;Create a Mux live stream&lt;/a&gt; - this is one API call. For every live stream you create, you will get back a unique stream key. Also make sure you save a &lt;code&gt;playback_id&lt;/code&gt; for this live stream - you will need it to play the live stream.&lt;/p&gt;
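&lt;p&gt;That create call is a plain authenticated POST. Here is a sketch of building the request (field names follow the Mux API docs; the token values are placeholders you would fill in from your Mux dashboard):&lt;/p&gt;

```javascript
// Sketch: build the "create a live stream" request for the Mux API.
// tokenId / tokenSecret are placeholders from your Mux dashboard.
function createLiveStreamRequest(tokenId, tokenSecret) {
  const auth = Buffer.from(tokenId + ':' + tokenSecret).toString('base64');
  return {
    url: 'https://api.mux.com/video/v1/live-streams',
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: 'Basic ' + auth,
      },
      body: JSON.stringify({
        playback_policy: ['public'],
        new_asset_settings: { playback_policy: ['public'] },
      }),
    },
  };
}

// Untested usage sketch: per the Mux docs, the response's data object
// includes stream_key (give this to Zoom) and playback_ids (save these).
// const { url, options } = createLiveStreamRequest(MUX_TOKEN_ID, MUX_TOKEN_SECRET);
// const { data } = await (await fetch(url, options)).json();
```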

&lt;p&gt;3) Set up a Zoom call like you normally would. From the call, click the 3 dots at the bottom where it says "More" and click "Live on Custom Live Streaming Service".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lD5gvX3A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2hdsjod507zltt0pwmry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lD5gvX3A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/2hdsjod507zltt0pwmry.png" alt="Zoom enable live stream"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4) From here, enter the RTMP server details for Mux.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VTCVSJOC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/09fqxzbl1v5b4qyer0ir.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VTCVSJOC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/09fqxzbl1v5b4qyer0ir.png" alt="Zoom enter RTMP details"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;5) When you’re ready to stream click “Go Live!” from Zoom. Now your Zoom call will be live and Mux will start receiving the video and audio. To confirm that this part is working, navigate to the live stream in your Mux dashboard and you should see video and audio coming in. Later you can set up &lt;a href="https://docs.mux.com/docs/live-streaming#section-broadcasting-webhooks"&gt;Webhooks&lt;/a&gt; so that you can be notified when every live stream is connected, active, completed, etc.&lt;/p&gt;
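&lt;p&gt;A webhook receiver can itself be a small cloud function that branches on the event’s &lt;code&gt;type&lt;/code&gt; field. A sketch, assuming the live stream event names from the Mux docs (the returned action strings are made-up placeholders for whatever your app should do):&lt;/p&gt;

```javascript
// Sketch: react to Mux live stream webhook events.
// The event type names follow the Mux docs; the returned action
// strings are placeholders for your own application logic.
function handleMuxWebhook(event) {
  switch (event.type) {
    case 'video.live_stream.active':
      // The broadcast started: e.g. flip your page into "live" mode.
      return 'show-live-player';
    case 'video.live_stream.idle':
      // The broadcast stopped: e.g. show a "stream over" message.
      return 'show-stream-ended';
    default:
      // Ignore other event types.
      return 'ignore';
  }
}
```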

&lt;p&gt;6) Next is the player side. You have the &lt;code&gt;playback_id&lt;/code&gt; from step 2, right? You will need to take that ID and form a URL like this: &lt;code&gt;https://stream.mux.com/{playback-id}.m3u8&lt;/code&gt;. This is an “m3u8” URL, which is a URL for streaming video over HLS, a standard streaming format for both live and on-demand video. You will need to use this HLS URL in a video player. Which player you choose is entirely up to you. Here are two free ones you can check out to get started: &lt;a href="https://videojs.com/"&gt;videojs&lt;/a&gt; and &lt;a href="https://plyr.io/"&gt;plyr&lt;/a&gt;. With whichever player you choose, follow the instructions for streaming an HLS video.&lt;/p&gt;
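&lt;p&gt;Forming that URL is just string concatenation, and wiring it up then follows your player’s HLS instructions. A sketch (the hls.js calls follow its documented API; the playback ID is a placeholder):&lt;/p&gt;

```javascript
// Build the HLS playback URL for a Mux playback ID.
function hlsUrl(playbackId) {
  return 'https://stream.mux.com/' + playbackId + '.m3u8';
}

// Browser-side usage sketch with hls.js (per its documented API);
// 'abc123' is a placeholder playback ID:
//   const video = document.querySelector('video');
//   if (Hls.isSupported()) {
//     const hls = new Hls();
//     hls.loadSource(hlsUrl('abc123'));
//     hls.attachMedia(video);
//   } else {
//     // Safari can play HLS natively.
//     video.src = hlsUrl('abc123');
//   }
```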

&lt;p&gt;Note that HLS streaming will come with some latency: expect 15-20 seconds. There is a &lt;code&gt;reduced_latency&lt;/code&gt; flag that you can use, which will bring that number down closer to 8 or 10 seconds (with some tradeoffs). Check the &lt;a href="https://docs.mux.com/reference#create-a-live-stream"&gt;docs here&lt;/a&gt; for more information.&lt;/p&gt;

&lt;h2&gt;
  
  
  Live chat
&lt;/h2&gt;

&lt;p&gt;After you get the live stream of your conference working on your webpage, you will almost certainly want to add a live chat component. Live chat is outside the scope of what Mux offers, but we’ve seen this done in a few different ways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use your own database and push real time changes to your clients with something like &lt;a href="https://pusher.com/"&gt;Pusher&lt;/a&gt;, &lt;a href="https://www.pubnub.com/"&gt;PubNub&lt;/a&gt; or &lt;a href="https://socket.io/"&gt;Socket.IO&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Use a realtime database like Firebase and build your own chat experience.&lt;/li&gt;
&lt;li&gt;Try something like &lt;a href="https://getstream.io/"&gt;Stream&lt;/a&gt;, which offers real-time APIs specifically for creating chat experiences, fully featured with image uploads, emoji reactions, typing notifications and all the bells and whistles built in.&lt;/li&gt;
&lt;li&gt;Skip the step of adding chat on your webpage and create a Slack community where everyone can chat. The benefit of having a Slack community is that it’s free and you don’t have to go through the steps of building chat onto your website. Most people are already familiar with Slack, so they likely have it downloaded already. You can also create channels for specific topics and allow attendees to DM each other. This has the added benefit of allowing for the kind of attendee networking that happens at in-person conferences.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Everything is recorded
&lt;/h2&gt;

&lt;p&gt;Every live stream that you do will be recorded by Mux and the asset will be available for on-demand playback. After each live stream is over, you can use the Mux asset to give attendees who were not present live the ability to view the recording.&lt;/p&gt;

&lt;h2&gt;
  
  
  Bonus: simulcast to the socials
&lt;/h2&gt;

&lt;p&gt;When you &lt;a href="https://docs.mux.com/reference#create-a-live-stream"&gt;create a Mux live stream&lt;/a&gt; you can optionally add &lt;code&gt;simulcast_targets&lt;/code&gt; which are arbitrary RTMP endpoints that Mux will push out your stream to. It’s fairly straightforward and we have some guides on how to do this. Read more in &lt;a href="https://mux.com/blog/seeing-double-let-your-users-simulcast-a-k-a-restream-to-any-social-platform/"&gt;the announcement blog post&lt;/a&gt; and &lt;a href="https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/"&gt;the guide&lt;/a&gt;. All you really have to do is track down the RTMP server URL and stream keys for each of the social networks you want to broadcast to and add them to the Mux live stream with an API call.&lt;/p&gt;
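&lt;p&gt;As a sketch, each simulcast target is just an object with the target’s RTMP &lt;code&gt;url&lt;/code&gt; and &lt;code&gt;stream_key&lt;/code&gt;. The helper below is illustrative, and the RTMP URLs and keys are placeholders you would replace with the real values from each platform’s live dashboard:&lt;/p&gt;

```javascript
// Sketch: build the simulcast_targets payload for a Mux live stream.
// This helper is illustrative; the RTMP URLs and stream keys are
// placeholders for the real values from each social platform.
function simulcastTargets(targets) {
  return targets.map((t) => ({
    url: t.rtmpUrl,
    stream_key: t.streamKey,
    passthrough: t.label, // a label you can use to identify the target
  }));
}

// Example payload, passed as simulcast_targets when creating the stream:
const targets = simulcastTargets([
  { rtmpUrl: 'rtmp://a.rtmp.youtube.com/live2', streamKey: 'YOUTUBE_KEY', label: 'youtube' },
  { rtmpUrl: 'rtmps://live-api-s.facebook.com:443/rtmp/', streamKey: 'FB_KEY', label: 'facebook' },
]);
```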

&lt;h2&gt;
  
  
  Bonus: add a watermark
&lt;/h2&gt;

&lt;p&gt;In the &lt;code&gt;new_asset_settings&lt;/code&gt; parameter when you create the live stream, you have the option to specify a watermark. You give Mux the URL to an image you want to use as a watermark and some details about where to place it and how to align it. When this is configured, Mux will add it to the stream that comes from Zoom and the watermark will appear on the HLS stream that you show on your website.&lt;/p&gt;

&lt;h2&gt;
  
  
  You don’t have to use Zoom
&lt;/h2&gt;

&lt;p&gt;Zoom is the simple example I used here because many people are familiar with it and know how it works. But the reality is you can use &lt;em&gt;any software&lt;/em&gt; that can send RTMP out to Mux. To name a few other options: &lt;a href="https://obsproject.com/"&gt;OBS&lt;/a&gt;, &lt;a href="https://www.telestream.net/wirecast/"&gt;Wirecast&lt;/a&gt; and &lt;a href="https://www.ecamm.com/mac/ecammlive/"&gt;Ecamm Live&lt;/a&gt;. All of these products are built to compose a single video stream and send it out over RTMP, so all of them will work with Mux.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Setup
&lt;/h2&gt;

&lt;p&gt;Here’s what your final setup might look like. If you are going to host an online conference with Mux, please reach out! We would love to talk to you and help you make sure it’s successful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NQGzmIKt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/fx3fbjk4jc7326eipzm1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NQGzmIKt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/fx3fbjk4jc7326eipzm1.png" alt="live conference Mux simulcast"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Extra details if you’re curious
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Mux supports live streams for up to 12 hours. If you have one single stream of an all day conference then this should be enough.&lt;/li&gt;
&lt;li&gt;There are two options for playing the live stream on your website, you can either use a &lt;code&gt;playback_id&lt;/code&gt; associated directly with the live stream, OR you can use the &lt;code&gt;playback_id&lt;/code&gt; associated with the &lt;code&gt;active_asset&lt;/code&gt; that is associated with the live stream. The former will not allow seeking backwards in the stream, the latter will allow your attendees (if they want) to seek all the way back to the beginning. This is a subtle detail but it might be something you want to consider.&lt;/li&gt;
&lt;/ul&gt;
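&lt;p&gt;The two playback options above differ only in which &lt;code&gt;playback_id&lt;/code&gt; ends up in your HLS URL. A sketch, assuming objects shaped like the Mux API’s live stream and asset responses (the helper names are made up):&lt;/p&gt;

```javascript
// Sketch: pick a playback ID for the live experience. The argument
// objects mirror the shape of Mux API responses; the helper names
// are illustrative, not part of any SDK.
function livePlaybackId(liveStream) {
  // The live stream's own ID: viewers stay at the live edge.
  return liveStream.playback_ids[0].id;
}

function seekablePlaybackId(activeAsset) {
  // The active asset's ID: viewers can seek back to the beginning.
  return activeAsset.playback_ids[0].id;
}
```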

&lt;p&gt;If you’re doing an online conference, please get in touch!&lt;/p&gt;

</description>
      <category>video</category>
    </item>
    <item>
      <title>&lt;video autoplay&gt; Considered Harmful</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Thu, 23 Jan 2020 17:19:28 +0000</pubDate>
      <link>https://dev.to/mux/video-autoplay-considered-harmful-52d6</link>
      <guid>https://dev.to/mux/video-autoplay-considered-harmful-52d6</guid>
      <description>&lt;p&gt;If you’re trying to autoplay videos on the web, you might be tempted to reach for the &lt;a href="https://www.w3schools.com/tags/att_video_autoplay.asp"&gt;HTML5 autoplay attribute&lt;/a&gt;. This sounds exactly like what you’re looking for, right? Well, not exactly. Let’s talk about why that’s probably not what you’re looking for and what the better option is.&lt;/p&gt;

&lt;h2&gt;
  
  
  Browsers will block your autoplay attempts
&lt;/h2&gt;

&lt;p&gt;Over the last few years, all major browser vendors have taken steps to aggressively block autoplaying videos on webpages. Safari announced some policy changes in &lt;a href="https://webkit.org/blog/7734/auto-play-policy-changes-for-macos/"&gt;June 2017&lt;/a&gt; and &lt;a href="https://developers.google.com/web/updates/2017/09/autoplay-policy-changes"&gt;Chrome followed suit&lt;/a&gt; shortly after &lt;a href="https://support.mozilla.org/en-US/kb/block-autoplay"&gt;and Firefox after that&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;In summary: all these browsers will aggressively block videos from autoplaying on webpages. Each browser has slightly different rules around how it makes this decision. It’s a huge black box and browsers will not tell you what their exact rules are. The default behavior is to block most autoplay attempts.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are however some conditions that make it more likely for autoplay to work:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Your video is muted with the &lt;code&gt;muted&lt;/code&gt; attribute.&lt;/li&gt;
&lt;li&gt;The user has interacted with the page with a click or a tap.&lt;/li&gt;
&lt;li&gt;(Chrome - desktop) The user’s &lt;a href="https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#mei"&gt;Media Engagement Index&lt;/a&gt; threshold has been crossed. Chrome keeps track of how often a user consumes media on a site and if a user has played a lot of media on this site then Chrome will probably allow autoplay.&lt;/li&gt;
&lt;li&gt;(Chrome - mobile) The user has added the site to their home screen.&lt;/li&gt;
&lt;li&gt;(Safari) Device is not in power-saving mode.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These conditions only make autoplay more likely, but remember that aside from these conditions, the user can override the browser’s default setting on a per-domain basis. This basically means that you can never rely on autoplay actually working.&lt;/p&gt;

&lt;h2&gt;
  
  
  Autoplay will probably work for you, but it will break for your users
&lt;/h2&gt;

&lt;p&gt;Even if you try to follow the rules above, autoplay is still a finicky beast. One thing to keep in mind (for Chrome at least) is Chrome’s Media Engagement Index: when you are testing autoplay on your own site it will probably work for you (because you visit your site often and play content, so your MEI score is high), but when new users come to your site it is likely to fail (because their MEI score is low). As a developer, this is incredibly frustrating and another reason to always avoid the autoplay attribute.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AsEQMD9K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/3453zr9gg72dikggiydl.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AsEQMD9K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/3453zr9gg72dikggiydl.jpg" alt="works on my machine"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What should I do instead?
&lt;/h2&gt;

&lt;p&gt;I’m not suggesting that you avoid autoplaying videos, but I am suggesting that you always avoid the autoplay attribute. There is a better way.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use &lt;code&gt;video.play()&lt;/code&gt; in JavaScript, which returns a promise. If the promise resolves, autoplay worked; if the promise rejects, autoplay was blocked.&lt;/li&gt;
&lt;li&gt;If the promise returned from &lt;code&gt;video.play()&lt;/code&gt; rejects, show a play button in the UI so that the user can click to play (the default video &lt;code&gt;controls&lt;/code&gt; attribute will work just fine). If you are using your own custom controls and your JavaScript calls &lt;code&gt;video.play()&lt;/code&gt; again as the result of an event that bubbled up from a user click, it will work.&lt;/li&gt;
&lt;li&gt;Consider starting with the video muted; this gives you a much lower chance of your &lt;code&gt;video.play()&lt;/code&gt; call rejecting. You will want to show some kind of “muted” icon in the UI that the user can click to unmute (again, the default video &lt;code&gt;controls&lt;/code&gt; attribute works great for that). You may notice that Twitter and a lot of other sites start videos in the muted state.&lt;/li&gt;
&lt;li&gt;Have I mentioned showing controls for your player? Always make sure controls for your player are accessible. We have seen sites try to get fancy and be too minimalist by hiding controls. Inevitably, they run into situations where autoplay fails and the user has no way of clicking to make the video play. Make sure you do not fall into this trap.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Here’s an example with vanilla javascript
&lt;/h2&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// get a reference to a &amp;lt;video&amp;gt; element&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;videoEl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;querySelector&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;video&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// attempt to call play() and catch if it fails&lt;/span&gt;
&lt;span class="nx"&gt;videoEl&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;play&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nx"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Autoplay success!&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;warning&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Autoplay error&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
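&lt;p&gt;The snippet above only logs the outcome. Extending it into the fallback described in the list above (show controls, retry muted) might look something like the sketch below. The &lt;code&gt;attemptAutoplay&lt;/code&gt; helper and its return values are hypothetical, not part of the original example.&lt;/p&gt;

```javascript
// Hypothetical helper: try to autoplay a video-like element.
// If play() rejects (autoplay blocked), surface the native controls so the
// user always has a play button, then retry muted, which browsers are far
// more likely to allow.
function attemptAutoplay(video) {
  return video.play()
    .then(() => 'autoplayed')
    .catch(() => {
      video.controls = true; // never hide the controls when autoplay can fail
      video.muted = true;    // a muted retry has a much better chance
      return video.play()
        .then(() => 'autoplayed-muted')
        .catch(() => 'blocked'); // give up and wait for a user click
    });
}
```

&lt;p&gt;In the browser you would call &lt;code&gt;attemptAutoplay(document.querySelector('video'))&lt;/code&gt;; the returned strings just make the outcome easy to inspect.&lt;/p&gt;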



&lt;h2&gt;
  
  
  Here’s an example with React
&lt;/h2&gt;

&lt;p&gt;If you look at mux.com you’ll see that we autoplay a video on the top of the home page.  I copied over how we did that and set up a demo here: &lt;a href="https://o9s4w.csb.app/"&gt;https://o9s4w.csb.app/&lt;/a&gt;. The code is copied below and you can fork it and play around with the &lt;a href="https://codesandbox.io/s/autoplay-example-react-o9s4w"&gt;Code Sandbox&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Notice that we’re doing a few things here:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Try to call &lt;code&gt;video.play()&lt;/code&gt; when the component loads.&lt;/li&gt;
&lt;li&gt;Show the user a play/pause state in the UI by using the default &lt;code&gt;controls&lt;/code&gt; attribute.&lt;/li&gt;
&lt;li&gt;Start with the video in the &lt;code&gt;muted&lt;/code&gt; state. Our video does not have audio, but if it did you would still want to start off in the muted state with the muted attribute, and show a mute/unmute icon in the UI.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;useRef&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./styles.css&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nx"&gt;App&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;videoEl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;useRef&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;attemptPlay&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;videoEl&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
      &lt;span class="nx"&gt;videoEl&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
      &lt;span class="nx"&gt;videoEl&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;play&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Error attempting to play&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="nx"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;attemptPlay&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[]);&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;div&lt;/span&gt; &lt;span class="nx"&gt;className&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;App&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;h1&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="nx"&gt;Autoplay&lt;/span&gt; &lt;span class="nx"&gt;example&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/h1&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;      &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;div&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;video&lt;/span&gt;
          &lt;span class="nx"&gt;style&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="na"&gt;maxWidth&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;100%&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;width&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;800px&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;margin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;0 auto&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
          &lt;span class="nx"&gt;playsInline&lt;/span&gt;
          &lt;span class="nx"&gt;loop&lt;/span&gt;
          &lt;span class="nx"&gt;muted&lt;/span&gt;
          &lt;span class="nx"&gt;controls&lt;/span&gt;
          &lt;span class="nx"&gt;alt&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;All the devices&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="nx"&gt;src&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://stream.mux.com/6fiGM5ChLz8T66ZZiuzk1KZuIKX8zJz00/medium.mp4&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="nx"&gt;ref&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;videoEl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="sr"&gt;/&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;      &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/div&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;    &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/div&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



</description>
    </item>
    <item>
      <title>No BART terminals were hacked in the making of this ad</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Tue, 21 Jan 2020 22:55:36 +0000</pubDate>
      <link>https://dev.to/mux/no-bart-terminals-were-hacked-in-the-making-of-this-ad-1efo</link>
      <guid>https://dev.to/mux/no-bart-terminals-were-hacked-in-the-making-of-this-ad-1efo</guid>
      <description>&lt;p&gt;Originally posted by my colleague Bonnie Pecevich on &lt;a href="https://mux.com/blog/no-bart-terminals-were-hacked-in-the-making-of-this-ad"&gt;mux.com/blog&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the process of creating a BART ad for the first time, we picked up some learnings that we thought we’d share, in the hope they help someone else on their out-of-home ad-buying journey. (We’ll also remember to follow our own advice next time.)&lt;/p&gt;

&lt;p&gt;Our learnings:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Ask upfront for explicit restrictions on creative.&lt;/li&gt;
&lt;li&gt;Build in extra time for more than one round of feedback and time to iterate on design.&lt;/li&gt;
&lt;li&gt;Be realistic about what’s feasible, especially with an aggressive timeline.&lt;/li&gt;
&lt;li&gt;Submit a draft of the concept and see if they’ll approve it before you spend extra time finalizing the details (and telling everyone about it at the company all hands meeting.)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;All of these learnings actually stemmed from one preeminent learning:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BART doesn’t allow any &lt;code&gt;code&lt;/code&gt; on their ads. 😲&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why put code in an ad?
&lt;/h2&gt;

&lt;p&gt;First, an introduction–Mux is a startup that does video, and one of our aspirational goals is for every developer to know that. With our headquarters located in San Francisco, we’re aware that our city has a great supply of developers so we thought we’d try advertising in some well-traveled, public spaces.&lt;/p&gt;

&lt;p&gt;Doing ads in a BART station (the underground transit system in the Bay Area) is generally assumed to be expensive, maybe even beyond the reach of a startup, which is what we thought, too. But we learned an ad could fit in our budget if we were flexible on timing: we were able to sign up for a single digital display at the Montgomery BART station with a 12/30/19 start date. Even though that only gave us about 2 weeks to create an ad (ignore the wailing coming from our one and only in-house designer), we were excited!&lt;/p&gt;

&lt;p&gt;Since the ad is just 15 seconds long with a not-so-captive audience, we wanted to create something that quickly caught the attention of developers. We thought we could achieve this by showing a terminal with a blinking cursor and then typed-out code showing use of our API. Sure, it crossed our minds that a blank screen with a blinking cursor might look like the screen is broken (which adds to the eye-catching-ness), so we added browsers to frame the terminal and added our logo to the top left corner. Our hope was that someone would take away that Mux is for developers and, if we were lucky, that we do something with video.&lt;/p&gt;

&lt;h2&gt;
  
  
  Insert wrench here
&lt;/h2&gt;

&lt;p&gt;The process was to submit a final file at least a week in advance of the live date to allow time for BART’s approval. There weren’t any specific guidelines beforehand on what’s allowed and what’s not, but we assumed some common-sense restrictions would apply, like no explicit or harmful language, imagery, etc. We figured getting BART’s approval would be relatively simple, like checking a box.&lt;/p&gt;

&lt;p&gt;Wrong. Our ad was rejected! We received feedback that the beginning of the ad that showed the terminal could give the impression that “the screen is malfunctioning or has been hacked into.”&lt;/p&gt;

&lt;p&gt;Busted. Turns out they also thought having a terminal on the screen would be eye-catching but not in a good way. We did feel a bit deflated, though, as we were all ready for our BART debut.&lt;/p&gt;

&lt;p&gt;We went through the five stages of grief and settled on “Bargaining.” We tried to come up with a creative solution where we could still use the same ad. Hey, what if we could add a persistent banner to the ad that said something like “Don’t worry, no BART terminals were hacked in the making of this ad.”?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5tOrn7XL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/uvhwgosi5e7dq6zwc1px.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5tOrn7XL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/uvhwgosi5e7dq6zwc1px.jpg" alt="No hack banner"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or what if we stylized the terminal so it looked more illustrated and cartoon-y?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SyM6DyUd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/qbing997fzrdw4323v9e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SyM6DyUd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/qbing997fzrdw4323v9e.png" alt="Cartoon code ad"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alas, BART held firm and said, in no uncertain terms, &lt;strong&gt;“Nothing involving coding.”&lt;/strong&gt; Since we couldn’t come up with a brand new design in 48 hours, our plans for a BART ad had to be put on hold.&lt;/p&gt;

&lt;h2&gt;
  
  
  Silver lining
&lt;/h2&gt;

&lt;p&gt;All is not lost! We used the final video for &lt;a href="https://mux.com/"&gt;our homepage&lt;/a&gt; and are genuinely excited at how it came out.&lt;/p&gt;

&lt;p&gt;Although the BART approval process is still a bit of a black box, we're excited to continue to work with the same ad agency and pursue our out-of-home ad dreams. We’re looking forward to iterating on our design and hopefully making a public appearance at Caltrain in the very near future. And if you see our ad, you’ll know the journey it took to get that little video up on those screens.&lt;/p&gt;

</description>
      <category>devrel</category>
    </item>
    <item>
      <title>In defense of 'flicks' (or how I learned to stop worrying and love 705600000)</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Tue, 26 Nov 2019 18:26:56 +0000</pubDate>
      <link>https://dev.to/mux/in-defense-of-flicks-or-how-i-learned-to-stop-worrying-and-love-705600000-dk6</link>
      <guid>https://dev.to/mux/in-defense-of-flicks-or-how-i-learned-to-stop-worrying-and-love-705600000-dk6</guid>
      <description>&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt; originally posted by my colleague on &lt;a href="https://mux.com/blog/in-defense-of-flicks-or-how-i-learned-to-stop-worrying-and-love-705600000/"&gt;the mux blog&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;About 2 years ago, the Oculus VR division of THE FACEBOOK created a &lt;a href="https://github.com/OculusVR/Flicks"&gt;project they called 'flicks'&lt;/a&gt;. Essentially, a flick is just a really big number, specifically the number 705,600,000. The project was picked up by news outlets like &lt;a href="https://techcrunch.com/2018/01/22/facebook-invented-a-new-time-unit-called-the-flick-and-its-truly-amazing/"&gt;TechCrunch&lt;/a&gt;, &lt;a href="https://www.theverge.com/tldr/2018/1/22/16920740/facebook-unit-of-time-flicks-frame-rate-ticks-github-nanosecond-second"&gt;The Verge&lt;/a&gt;, and the &lt;a href="https://www.bbc.com/news/technology-42787529"&gt;BBC&lt;/a&gt;, and seemed to cause some confusion and even some ridicule. To be fair, a news article about a number is a bit odd. If you’re not an engineer working with digital media, the idea behind this number is difficult to grasp. And to those who do work in digital media, the number seems to not offer anything new. It purports to solve a problem that nobody in the industry actually has. So where did it come from, and why does it exist? Let’s back up…&lt;/p&gt;

&lt;p&gt;Time is a surprisingly difficult concept in digital media. For starters, we are dealing with time values that are very small and difficult to imagine. Recently I saw the movie Gemini Man at the AMC Metreon here in San Francisco. It was one of the few theaters capable of playing the 120 frames per second version, whereas most films are 24 frames per second. At 120fps, every frame is projected for just over 0.00833 seconds before the next one flashes on, a very short period of time. But compared to digital audio, this is an eternity. Audio recorded at 44100hz has a sample every 0.000022675 seconds. That is 367.5 times more audio samples than video frames. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PnrqbosK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/fvnik5s7n05h8oyoue5a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PnrqbosK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/fvnik5s7n05h8oyoue5a.png" alt="Example audio visual timeline"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Example audio visual timeline&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;These numbers with fractional components (a decimal point) are known as “floating point” numbers in computer science, and computers are surprisingly bad at dealing with them. Above, when I said every video frame was on the screen for 0.00833 seconds, that was not exactly true. When you divide 1 by 120, the result is 0.008333333333 with the 3 repeating forever. For a computer to store a number that repeats forever would require an infinite amount of memory, so the number is approximated. The difference between the approximation and the actual number results in tiny errors in the math. These small errors can add up over time, become big errors, and could result in problems such as audio and video drifting out of sync. Using a unit of time like milliseconds or nanoseconds would help, but would only delay the problem, not solve it.&lt;/p&gt;
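&lt;p&gt;To see the drift for yourself, here is a quick toy demonstration (mine, not from the original post): summing the approximated duration of a 120fps frame a million times lands near, but not exactly on, the answer a single division gives.&lt;/p&gt;

```javascript
// Repeatedly adding the (approximated) duration of one 120fps frame
// accumulates rounding error, while a single division does not.
const fps = 120;

let accumulated = 0;
for (let frame = 0; frame < 1000000; frame++) {
  accumulated += 1 / fps; // 1/120 is not exactly representable, so each add rounds
}

const exact = 1000000 / fps; // 8333.333... computed in one step

console.log(exact, accumulated, accumulated - exact);
```

&lt;p&gt;The difference is tiny, but in a long-running player these are exactly the errors that eventually push audio and video out of sync.&lt;/p&gt;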

&lt;p&gt;The ultimate solution is to not record time in seconds but instead as an integer number of fractional units. For example, &lt;code&gt;1000 x 1 ÷ 120&lt;/code&gt; is the timestamp of the 1000th frame of a 120fps video. Converting to seconds we still end up with a floating point number, but as long as we count frames as integers the error does not accumulate over time. If you’re following the math closely, you may have noticed that while solving this problem we have created another one. &lt;/p&gt;

&lt;p&gt;What frame should we render first? The video frame at &lt;code&gt;1000 x 1 ÷ 120&lt;/code&gt;, or the audio sample at &lt;code&gt;367500 x 1 ÷ 44100&lt;/code&gt;? We need to convert to a common time base to know for sure. We could convert to seconds and then compare, but that brings us back to the floating point problem. By using the “least common multiple”, or LCM, of the time bases (88,200 in this case), we can convert these fractions to a common time base and then compare them: &lt;code&gt;88,200 ÷ 44100 x 367500 = 735,000&lt;/code&gt;, and &lt;code&gt;88,200 ÷ 120 x 1000 = 735,000&lt;/code&gt;. These time stamps are exactly the same and should be rendered together to ensure sync. At no time did we need to use floating point math, which might have given us a slightly different answer.&lt;/p&gt;
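&lt;p&gt;That comparison can be done entirely in integer math. Here is a small sketch (the helper names are mine, not from the post):&lt;/p&gt;

```javascript
// Convert integer timestamps in different time bases to a common base
// (the least common multiple of the two bases) using only integer math.
function gcd(a, b) {
  return b === 0 ? a : gcd(b, a % b);
}

function lcm(a, b) {
  return (a / gcd(a, b)) * b;
}

const commonBase = lcm(120, 44100); // 88200

// Frame 1000 of a 120fps video and sample 367500 of 44.1kHz audio,
// both expressed in the common time base.
const videoTimestamp = (commonBase / 120) * 1000;
const audioTimestamp = (commonBase / 44100) * 367500;

console.log(commonBase, videoTimestamp, audioTimestamp); // 88200 735000 735000
```

&lt;p&gt;Since the two converted timestamps are equal, the frame and the sample should be rendered together, and no floating point rounding ever entered the picture.&lt;/p&gt;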

&lt;p&gt;In the world of digital media there are some time bases that come up very frequently. As stated, film commonly uses 24 fps, European television (and other &lt;a href="https://en.wikipedia.org/wiki/PAL"&gt;PAL&lt;/a&gt; countries) uses 25 fps, and, for obscure reasons, American television (and other &lt;a href="https://en.wikipedia.org/wiki/NTSC"&gt;NTSC&lt;/a&gt; countries) uses 29.97 fps. Wait! Floating point numbers again? Actually no, because it’s not really 29.97fps. It’s actually &lt;code&gt;30000 ÷ 1001&lt;/code&gt; fps. &lt;/p&gt;

&lt;p&gt;Here is where flicks come in. You see, &lt;code&gt;705600000 ÷ 44100 = 16000&lt;/code&gt; EXACTLY, &lt;code&gt;705600000 ÷ 120 = 5,880,000&lt;/code&gt; EXACTLY, and even &lt;code&gt;1001 x 705600000 ÷ 30000 = 23,543,520&lt;/code&gt; EXACTLY. This is why the flick is interesting: it has the special property of being the least common multiple of many of the commonly used time bases in digital media.&lt;/p&gt;
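&lt;p&gt;This property is easy to check. Here is a quick verification of the arithmetic above (my own, not code from the post):&lt;/p&gt;

```javascript
// A flick is 1/705,600,000 of a second. Common video frame rates and audio
// sample rates all divide it evenly, so one frame or one sample always
// lasts a whole number of flicks.
const FLICKS_PER_SECOND = 705600000;

const commonRates = [
  24, 25, 30, 48, 50, 60, 90, 100, 120,                          // video fps
  8000, 16000, 22050, 24000, 32000, 44100, 48000, 88200, 96000, 192000, // audio hz
];

const allExact = commonRates.every(rate => FLICKS_PER_SECOND % rate === 0);

// Even NTSC's 30000/1001 fps works: one frame is an exact number of flicks.
const ntscFrameInFlicks = (FLICKS_PER_SECOND * 1001) / 30000;

console.log(allExact, ntscFrameInFlicks); // true 23543520
```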

&lt;p&gt;We now know what the flick is, but why does it exist? We have established that if we record every time stamp as 3 integers (a multiplier, a numerator and a denominator), we don’t need a common base, since we can convert between bases as needed. So why convert at all? There are two primary reasons. The first is efficiency. If we know we will need to compare a lot of time stamps in different time bases, or compare the same timestamp multiple times, converting them to a common base can be faster for a computer. Once converted to a common base, comparing two numbers is probably the fastest operation a computer can do, whereas comparing two fractions requires an algorithm to find a common base, convert the values, and then compare. But the motivation cited on the flicks GitHub page is slightly different, and stems from a design decision in the C++ programming language.&lt;/p&gt;

&lt;p&gt;Computers are pretty good at dealing with time, but humans are really bad at it. In most parts of the United States, once a year for daylight saving we have a 25-hour day, only to have a 23-hour day a few months later. Every 4 years we get an extra day at the end of February, unless the year is divisible by 100, except when the year is also evenly divisible by 400, in which case it is a leap year after all (there was no leap day in the year 1900). We even have leap seconds, where we add an extra second to a year whenever we notice that the atomic clocks don't quite agree with astronomical observations. Meanwhile, a computer's clock needs to keep moving forward one second per second or bad things happen.&lt;/p&gt;

&lt;p&gt;To help with this human nonsense and standardize how to manage time, C++11 added chrono to its standard library. Because the language designers were smart, the &lt;a href="https://en.cppreference.com/w/cpp/chrono/duration"&gt;std::chrono::duration&lt;/a&gt; time type included support for the time-as-a-ratio technique we have established. Perfect! Well... not so fast. Because the language designers were unwilling to give up CPU cycles and slow down programs (C++’s defining feature is speed, after all), it was decided that the time base must be known in advance, while writing the program (at compile time). This allows for fast-running programs, because the fractions can be ignored when they are known to be equal, but it sacrifices automatic time base conversions when they are not. Herein lies the problem: when playing back a media file we can’t know the time base in advance, because we haven’t seen the file yet. What we need is a time base that can support any media we are likely to encounter. Enter flicks: an elegant solution to a problem only a handful of media engineers will ever encounter, which happened to be announced on a slow news day.&lt;/p&gt;

&lt;p&gt;Flicks do not seem to be in wide use. I used them at Mux in one specific place in our transcoding pipeline, for exactly their intended purpose: a C++11 program where standardizing on std::chrono made things a bit easier. But searching GitHub and Google, I could only find a handful of places where flicks are used in the wild, and I don’t really expect that to change.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Using Netlify Functions to Create Signing Tokens</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Thu, 07 Nov 2019 20:38:31 +0000</pubDate>
      <link>https://dev.to/mux/using-netlify-functions-to-create-signing-tokens-25i6</link>
      <guid>https://dev.to/mux/using-netlify-functions-to-create-signing-tokens-25i6</guid>
      <description>&lt;p&gt;Have you used cloud functions yet? They come in many flavors: Amazon Lambda, Cloudflare Workers, Zeit Serverless Functions, and the one we’re using here: Netlify Functions.&lt;/p&gt;

&lt;p&gt;Cloud functions are an essential component of the JAMstack. I had not heard the name JAMstack until recently. For the uninitiated (like me) it stands for Javascript, APIs and Markup. You may have seen JAMstack technologies like Gatsby, Next.js and tools of this nature that focus on performance, new developer tooling, and leveraging CDNs to serve pre-compiled HTML pages. I will be at &lt;a href="https://jamstackconf.com/sf/"&gt;JAMstack Conf 2019 in SF&lt;/a&gt;, if you will be there too, then come find me and say hi!&lt;/p&gt;

&lt;p&gt;All the code in this post is open source here on GitHub in our examples repo: &lt;a href="https://github.com/muxinc/examples/tree/master/signed-playback-netlify"&gt;muxinc/examples/signed-playback-netlify&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Main Benefits of Cloud Functions
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Run code close to your clients (clients can be browsers, mobile apps, internet-of-things devices, self-driving cars, drones, anything that is talking to your server). Like a CDN, cloud functions are deployed to edge data centers to minimize latency between your clients and the server that runs their code.&lt;/li&gt;
&lt;li&gt;Protect your origin servers from being flooded with traffic. Cloud functions are a good way to cache data and intercept requests and respond to your users before they reach your origin servers. This means less bandwidth and CPU that your origin servers have to process.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Cloud functions, like Netlify Functions, might be a good option for you if you are using Mux’s Signed URLs feature.&lt;/p&gt;

&lt;h2&gt;
  
  
  A Little Background About Signed URLs
&lt;/h2&gt;

&lt;p&gt;When you create a video asset via Mux’s &lt;code&gt;POST /video&lt;/code&gt; API you can also create a Playback ID (&lt;a href="https://docs.mux.com/docs/video"&gt;Mux API docs&lt;/a&gt;) and specify the &lt;code&gt;playback_policy&lt;/code&gt; as either &lt;code&gt;"public"&lt;/code&gt; or &lt;code&gt;"signed"&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;A “public” playback policy can be played back on any site, in any player and does not have an expiration. A “signed” playback policy requires that when the playback URL is requested from a player, it has to be accompanied by a “token” param that is generated and signed on your server.&lt;/p&gt;

&lt;p&gt;This is how it looks:&lt;/p&gt;

&lt;p&gt;public playback URL:&lt;br&gt;
&lt;code&gt;https://stream.mux.com/${playbackId}.m3u8&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;signed playback URL:&lt;br&gt;
&lt;code&gt;https://stream.mux.com/${playbackId}.m3u8?token=${token}&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;token&lt;/code&gt; param is what you need to create on your server in order for the playback URL to work.&lt;/p&gt;
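&lt;p&gt;Putting the two URL shapes side by side, the only difference is the appended token query param. A trivial sketch (the helper names are mine, not from the post):&lt;/p&gt;

```javascript
// Sketch: build Mux playback URLs for public vs. signed playback policies.
function publicPlaybackUrl(playbackId) {
  return `https://stream.mux.com/${playbackId}.m3u8`;
}

function signedPlaybackUrl(playbackId, token) {
  // Same URL, plus the server-generated token as a query param.
  return `${publicPlaybackUrl(playbackId)}?token=${token}`;
}

console.log(signedPlaybackUrl('abc123', 'some.signed.jwt'));
// https://stream.mux.com/abc123.m3u8?token=some.signed.jwt
```

&lt;p&gt;For signed playback the &lt;code&gt;token&lt;/code&gt; is the JWT this post goes on to generate server-side; here it is just a placeholder string.&lt;/p&gt;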
&lt;h2&gt;
  
  
  Create a Mux Asset
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Sign up for a &lt;a href="https://mux.com"&gt;mux.com&lt;/a&gt; account (free account comes with $20 credit)&lt;/li&gt;
&lt;li&gt;Go to &lt;a href="https://mux.com/settings/access-tokens"&gt;settings/access-tokens&lt;/a&gt; and click “Generate new token” to create a token you can use for API calls&lt;/li&gt;
&lt;li&gt;Copy your token id (we'll call this &lt;code&gt;MUX_TOKEN_ID&lt;/code&gt;) and secret (&lt;code&gt;MUX_TOKEN_SECRET&lt;/code&gt;). You will need these to make two api calls.&lt;/li&gt;
&lt;li&gt;Create a Mux video asset with a “signed” playback policy
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl https://api.mux.com/video/v1/assets \
  -X POST \
  -H "Content-Type: application/json" \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
  -d '{ "input": "https://storage.googleapis.com/muxdemofiles/mux-video-intro.mp4", "playback_policy": "signed" }' 
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Copy the &lt;code&gt;playback_id&lt;/code&gt; from the response, you will need this later.&lt;/li&gt;
&lt;/ul&gt;
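The asset comes back under a `data` key, with playback IDs in a `playback_ids` array. A small sketch of pulling the signed playback ID out of the response body (the object below is abbreviated and illustrative, not a real API response):

```javascript
// Abbreviated, illustrative shape of the POST /video/v1/assets response.
const response = {
  data: {
    id: 'asset-id-example',
    status: 'preparing',
    playback_ids: [{ policy: 'signed', id: 'playback-id-example' }],
  },
};

// Pick out the playback ID whose policy is "signed".
function signedPlaybackId(body) {
  const match = (body.data.playback_ids || []).find((p) => p.policy === 'signed');
  return match ? match.id : null;
}

const playbackId = signedPlaybackId(response);
```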
&lt;h2&gt;
  
  
  Create a URL Signing Key
&lt;/h2&gt;


&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl https://api.mux.com/video/v1/signing-keys \
  -X POST \
  -H "Content-Type: application/json" \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Copy the &lt;code&gt;id&lt;/code&gt; (we'll call this &lt;code&gt;MUX_SIGNING_KEY&lt;/code&gt;) and the &lt;code&gt;private_key&lt;/code&gt; (&lt;code&gt;MUX_PRIVATE_KEY&lt;/code&gt;) from the response; you will need these later. These are the keys you will use to create signed URLs for playback.&lt;/li&gt;
&lt;/ul&gt;
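One detail worth knowing: the `private_key` in this response is base64-encoded PEM, so depending on how you store it and which JWT library you use, you may need to decode it first (the key string below is a made-up placeholder, not a real key):

```javascript
// The signing-keys response returns private_key as base64-encoded PEM.
// Decoding it yields the usual "-----BEGIN ... PRIVATE KEY-----" text.
function decodePrivateKey(base64Key) {
  return Buffer.from(base64Key, 'base64').toString('utf8');
}

// Placeholder standing in for the real private_key field from the response.
const encoded = Buffer.from(
  '-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----'
).toString('base64');
const pem = decodePrivateKey(encoded);
```

The Mux Node SDK accepts the key in either form, so this is mainly relevant if you sign tokens with a generic JWT library.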
&lt;h2&gt;
  
  
  Setup a Netlify project
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Create a new directory for your project &lt;code&gt;mkdir netlify-mux-signing &amp;amp;&amp;amp; cd netlify-mux-signing&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Install the Netlify CLI and run &lt;code&gt;netlify init&lt;/code&gt; to create a new project. You can choose to connect Netlify to a GitHub repository.&lt;/li&gt;
&lt;li&gt;If you’re starting from scratch, run &lt;code&gt;yarn init&lt;/code&gt; to create an empty &lt;code&gt;package.json&lt;/code&gt; and &lt;code&gt;git init&lt;/code&gt; to make this a git repository.&lt;/li&gt;
&lt;li&gt;Now you have a barebones project connected to Netlify, but nothing is in it yet (note the hidden, gitignored directory called &lt;code&gt;.netlify&lt;/code&gt;, which Netlify uses to handle deploys and Netlify commands).&lt;/li&gt;
&lt;li&gt;Run &lt;code&gt;yarn add netlify-lambda&lt;/code&gt; to install the netlify-lambda package into your project (it’s recommended to install this locally instead of globally).&lt;/li&gt;
&lt;li&gt;Run  &lt;code&gt;yarn add @mux/mux-node&lt;/code&gt; to add the Mux node SDK to your project&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  Step 1: Create a module to generate a signing token
&lt;/h3&gt;

&lt;p&gt;Create a &lt;code&gt;src/&lt;/code&gt; folder in your project and add a small module called &lt;code&gt;mux_signatures.js&lt;/code&gt;. It exports one function, &lt;code&gt;signPlaybackId&lt;/code&gt;, which takes a playback ID and returns a token generated with &lt;code&gt;Mux.JWT.sign&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// ./src/mux_signatures&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Mux&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@mux/mux-node&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;signPlaybackId&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;playbackId&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;Mux&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;JWT&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;sign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;playbackId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;keyId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;MUX_SIGNING_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;keySecret&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;MUX_PRIVATE_KEY&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Our lambda function is going to use this module in Step 2.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Create a &lt;code&gt;sign_playback_id&lt;/code&gt; cloud function
&lt;/h3&gt;

&lt;p&gt;Create a Netlify Function entry point. This is the single function that will handle one request. The idiomatic pattern for cloud functions is one file and one JavaScript function per route. Create a directory called &lt;code&gt;functions/&lt;/code&gt; and add a file called &lt;code&gt;sign_playback_id.js&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// ./functions/sign_playback_ids.js&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;keySecret&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;MUX_PRIVATE_KEY&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;signPlaybackId&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./src/mux_signatures&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nx"&gt;exports&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;queryStringParameters&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;playbackId&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;queryStringParameters&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;playbackId&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="na"&gt;errors&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Missing playbackId in query string&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}]})&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;signPlaybackId&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;playbackId&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt;  &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;302&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Access-Control-Allow-Origin&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;*&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`https://stream.mux.com/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;playbackId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.m3u8?token=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;token&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;errors&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Server Error&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}]&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3: Add netlify.toml
&lt;/h3&gt;

&lt;p&gt;Add a &lt;code&gt;netlify.toml&lt;/code&gt; file to the root directory and tell Netlify where your functions will live. This tells Netlify that before deploying, the functions are built into the &lt;code&gt;./.netlify/functions&lt;/code&gt; directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    [build]
      functions = "./.netlify/functions"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Connect to git
&lt;/h3&gt;

&lt;p&gt;In order to use Netlify Functions you will need to commit your code and push it up to a git repository like GitHub. Do that next, and in Netlify’s dashboard connect your git repository to the Netlify project that you created. After connecting your git repository, Netlify will build and deploy the project on every push.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Set your environment variables
&lt;/h3&gt;

&lt;p&gt;In your Netlify project dashboard, navigate to "Settings" &amp;gt; "Deploys" &amp;gt; “Environment” to set your environment variables. Enter the &lt;code&gt;MUX_SIGNING_KEY&lt;/code&gt; and &lt;code&gt;MUX_PRIVATE_KEY&lt;/code&gt; from the Create a URL Signing Key step above.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 6: Test in development
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open one terminal and run &lt;code&gt;netlify dev&lt;/code&gt;; this starts a local Netlify dev server.&lt;/li&gt;
&lt;li&gt;Open another terminal window and run &lt;code&gt;netlify-lambda serve ./functions&lt;/code&gt;; this builds your &lt;code&gt;functions/&lt;/code&gt;, gets them ready to handle requests, and watches the filesystem for changes.&lt;/li&gt;
&lt;li&gt;In a third terminal window, curl your endpoint to test out the function (replace &lt;code&gt;&amp;lt;netlify-port&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;playback-id&amp;gt;&lt;/code&gt; with your values).
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -I 'http://localhost:&amp;lt;netlify-port&amp;gt;/.netlify/functions/sign_playback_id?playbackId=&amp;lt;playback-id&amp;gt;'
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;You should see a 302 (redirect) response with a &lt;code&gt;location&lt;/code&gt; header for the signed url.&lt;/p&gt;

&lt;p&gt;When you make any changes to your source files, &lt;code&gt;netlify-lambda serve&lt;/code&gt; will pick up on the changes and recompile the functions into &lt;code&gt;./.netlify/functions&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploy
&lt;/h2&gt;

&lt;p&gt;When you’re ready to deploy, you can deploy from the command line with &lt;code&gt;netlify-lambda build ./functions &amp;amp;&amp;amp; netlify deploy --prod&lt;/code&gt;. This will build the functions and then push up the changes to Netlify.&lt;/p&gt;

&lt;p&gt;Try making a request to your cloud function on Netlify:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -I 'https://&amp;lt;your-netlify-app&amp;gt;.netlify.com/.netlify/functions/sign_playback_id?playbackId=&amp;lt;playback-id&amp;gt;'
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Just like in dev, you should get back a 302 response with a &lt;code&gt;location&lt;/code&gt; header that points to the signed playback URL.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://stream.mux.com/${playbackId}.m3u8?token=${token}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This is what your response should look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;HTTP/2 302
access-control-allow-origin: *
cache-control: no-cache
location: https://stream.mux.com/jqi1UtiO3gccQ019UcYjGJTLO9Ee00TLMY.m3u8?token=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6IkFsVFZncktBVTYzVldIdVplcDEwMVhZUk5mbHozeDIxRiJ9.eyJleHAiOjE1NzE3NjE0NzMsImF1ZCI6InYiLCJzdWIiOiJqcWkxVXRpTzNnY2NRMDE5VWNZakdKVExPOUVlMDBUTE1ZIn0.i7oANZ6inmwmGVQjon4WEv_gKcqQ2v8GuQA8xuCBdT0Reegkm6WyTdU-VloZvAt7duaRR3-T8dt147vUQjM1n70CLi0996pwMejYWIbRHUMqrDBtsENHG8T9jtz-EJcBGONSzgs7fBQIVQx8xJvPuX4YqpylDK_lNX0-RDqfhz5THAfuyxzePJod709msD8kbHAqnIke5lHzbQNHuO2ecNFVCb2ZozW7XkIEctyLxrDAK1ITtQV8iHek3whwO9S05kM-5bQzomJEliN3mXBqCwMBmyIp8l88YKl59tVXDdU-l-cZvZjt1GYKv0J7shO-oBYcr00NmVKkP7bie_w50w
date: Tue, 15 Oct 2019 16:24:33 GMT
age: 0
server: Netlify
x-nf-request-id: 80484951-e7ff-46f3-b78e-1349b8514bec-1426623
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Now, in your player, you can use your Netlify function as the URL src. Here's an example for a web player (note that in order to get HLS in a &lt;code&gt;&amp;lt;video&amp;gt;&lt;/code&gt; tag to work outside of Safari you will need a library like &lt;a href="https://videojs.com"&gt;Video.js&lt;/a&gt; or &lt;a href="https://github.com/video-dev/hls.js/"&gt;HLS.js&lt;/a&gt;):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;video src="https://&amp;lt;your-netlify-project&amp;gt;.netlify.com/.netlify/functions/sign_playback_id?playbackId=&amp;lt;playback-id&amp;gt;"&amp;gt;&amp;lt;/video&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;And here's an example on iOS with AVPlayer in Swift:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let url = URL(string: "https://&amp;lt;your-netlify-project&amp;gt;.netlify.com/.netlify/functions/sign_playback_id?playbackId=&amp;lt;playback-id&amp;gt;")

player = AVPlayer(url: url!)
player!.play()
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The player will load the Netlify URL, follow the 302 redirect to the signed Mux URL, and load the HLS manifest from stream.mux.com.&lt;/p&gt;

&lt;h2&gt;
  
  
  Restrict who can access your function
&lt;/h2&gt;

&lt;p&gt;Now that your cloud function is working, you can add some security around it to make sure only authorized users can generate signed URLs.&lt;/p&gt;

&lt;p&gt;For web players you will want to change this line in the &lt;code&gt;sign_playback_id.js&lt;/code&gt; function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;'Access-Control-Allow-Origin': '*',
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;You can use the &lt;code&gt;Access-Control-Allow-Origin&lt;/code&gt; header to control the CORS rules for the resource, restricting it to the origins where your player is embedded.&lt;/p&gt;
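One hedged sketch of tightening this: instead of `*`, echo back the request's `Origin` only when it appears on an allowlist (the domains below are placeholders, not real values):

```javascript
// Only allow browsers on known origins to call the signing function.
// The allowlist entries here are placeholder domains.
const ALLOWED_ORIGINS = ['https://example.com', 'https://www.example.com'];

function corsHeaders(requestOrigin) {
  if (ALLOWED_ORIGINS.includes(requestOrigin)) {
    // Echo the origin back and tell caches the response varies by Origin.
    return { 'Access-Control-Allow-Origin': requestOrigin, Vary: 'Origin' };
  }
  return {}; // no CORS header: cross-origin browser requests get blocked
}

// Inside the handler you would merge this into the response headers, e.g.:
// const headers = { ...corsHeaders(event.headers.origin), location: signedUrl };
```

Note that CORS only restricts browsers; anyone can still hit the endpoint directly, so for stronger protection you would also check a session or auth token inside the function before signing.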

</description>
      <category>serverless</category>
    </item>
    <item>
      <title>Phoenix LiveView: Build Twitch Without Writing JavaScript</title>
      <dc:creator>Dylan Jhaveri</dc:creator>
      <pubDate>Thu, 31 Oct 2019 20:21:14 +0000</pubDate>
      <link>https://dev.to/mux/phoenix-liveview-build-twitch-without-writing-javascript-436j</link>
      <guid>https://dev.to/mux/phoenix-liveview-build-twitch-without-writing-javascript-436j</guid>
<description>&lt;p&gt;Phoenix LiveView is a new experiment that allows developers to build rich, real-time user experiences with server-rendered HTML. If you’re not familiar with Phoenix, it’s a fully-featured web framework for the Elixir programming language. At Mux we use Phoenix and Elixir to power our API. I decided to start playing around with LiveView to see what it’s capable of. The idea I had for an example app is Snitch: it’s like “Twitch,” but for snitches (put away your checkbooks, potential investors). Under the hood, of course, we’re using &lt;a href="https://mux.com/live/"&gt;Mux Live Streaming&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;From the user’s perspective, first you create a “channel”. When that channel is created, Snitch will give you RTMP streaming credentials (just like Twitch does). As the user, you enter those streaming credentials into your mobile app or broadcast software and start streaming.&lt;/p&gt;

&lt;p&gt;Right here we have the perfect test case for LiveView. In the UI we show the user the streaming credentials and are now waiting for them to start streaming. Mux is going to &lt;a href="https://docs.mux.com/docs/webhooks#section-live-stream-events"&gt;send webhooks&lt;/a&gt; to our server when relevant events happen. For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;video.live_stream.connected&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;video.live_stream.recording&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;video.live_stream.active&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;video.live_stream.disconnected&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
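As a rough sketch of dispatching on these event types (written in JavaScript for illustration; the real app handles this in an Elixir webhook controller, and the payload shape beyond the `type` field is simplified):

```javascript
// Map a Mux webhook event to the channel status we want to store.
// Real webhook payloads carry a "type" field plus a "data" object
// describing the live stream; only "type" is used here.
function channelStatusFor(event) {
  switch (event.type) {
    case 'video.live_stream.connected':
      return 'connected';
    case 'video.live_stream.recording':
      return 'recording';
    case 'video.live_stream.active':
      return 'active';
    case 'video.live_stream.disconnected':
      return 'disconnected';
    default:
      return null; // ignore event types we don't care about
  }
}

const status = channelStatusFor({ type: 'video.live_stream.active' });
```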

&lt;p&gt;In a typical web application, without LiveView, common solutions are to either use websockets to push new data to client applications or have those applications poll the server. About every second or so the browser would send a request to the server to get the updated data. &lt;strong&gt;But now with LiveView when a webhook hits the server we can re-render on the server-side and push those changes to the client.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using LiveView to Handle Webhooks
&lt;/h2&gt;

&lt;p&gt;The first step is to follow the instructions on the Installation page to add LiveView to your Phoenix application. This includes adding the dependency, exposing the WebSocket route and the &lt;code&gt;phoenix_live_view&lt;/code&gt; JavaScript package for the client-side.&lt;/p&gt;

&lt;p&gt;After following the installation instructions, let’s add a route for the Mux Webhooks:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight elixir"&gt;&lt;code&gt;    &lt;span class="n"&gt;scope&lt;/span&gt; &lt;span class="s2"&gt;"/"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
      &lt;span class="n"&gt;pipe_through&lt;/span&gt; &lt;span class="ss"&gt;:api&lt;/span&gt;
      &lt;span class="n"&gt;post&lt;/span&gt; &lt;span class="s2"&gt;"/webhooks/mux"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="no"&gt;WebhookController&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;:mux&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Then, in the Mux UI we can add this as our webhooks route. For local development I’m using ngrok to receive webhooks on my localhost server.&lt;/p&gt;

&lt;p&gt;The webhook controller is going to receive the payload and update the “channel” in our database by calling &lt;code&gt;Snitch.Channels.update_channel&lt;/code&gt;. Let’s look at the &lt;code&gt;update_channel/2&lt;/code&gt; function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight elixir"&gt;&lt;code&gt;&lt;span class="c1"&gt;# lib/snitch/channels.ex &lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;update_channel&lt;/span&gt;&lt;span class="p"&gt;(%&lt;/span&gt;&lt;span class="no"&gt;Channel&lt;/span&gt;&lt;span class="p"&gt;{}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;attrs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="n"&gt;channel&lt;/span&gt;
  &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="no"&gt;Channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;changeset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;attrs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="no"&gt;Repo&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;update&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;notify_subs&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;&lt;code&gt;notify_subs/1&lt;/code&gt; is the new function we are going to call when a channel gets updated. This is where the LiveView magic happens.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight elixir"&gt;&lt;code&gt;&lt;span class="c1"&gt;# lib/snitch/channels.ex &lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;notify_subs&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="ss"&gt;:ok&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="no"&gt;Phoenix&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;PubSub&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;broadcast&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Snitch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;PubSub&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"channel-updated:&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="ss"&gt;:ok&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;This function is going to broadcast a message so that subscribers can react to this change. More on that shortly.&lt;/p&gt;

&lt;p&gt;Now let’s update the controller and tell the controller to render with LiveView:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight elixir"&gt;&lt;code&gt;&lt;span class="c1"&gt;# lib/snitch_web/controllers/channel_controller.ex &lt;/span&gt;
&lt;span class="no"&gt;Phoenix&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;LiveView&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;Controller&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;live_render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;conn&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;LiveChannelView&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;session:&lt;/span&gt; &lt;span class="p"&gt;%{&lt;/span&gt;&lt;span class="ss"&gt;channel:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;And let’s create &lt;code&gt;SnitchWeb.LiveChannelView&lt;/code&gt;. When we call &lt;code&gt;notify_subs()&lt;/code&gt; up above, this &lt;code&gt;LiveChannelView&lt;/code&gt; is the code that needs to subscribe and push an update to the client.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight elixir"&gt;&lt;code&gt;&lt;span class="k"&gt;defmodule&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;LiveChannelView&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="kn"&gt;use&lt;/span&gt; &lt;span class="no"&gt;Phoenix&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;LiveView&lt;/span&gt;

  &lt;span class="c1"&gt;#&lt;/span&gt;
  &lt;span class="c1"&gt;# When the controller calls live_render/3 this mount/2 function will get called&lt;/span&gt;
  &lt;span class="c1"&gt;# after the mount/2 function finishes then the render/1 function will get called&lt;/span&gt;
  &lt;span class="c1"&gt;# with the assigns&lt;/span&gt;
  &lt;span class="c1"&gt;#&lt;/span&gt;
  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;session&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="n"&gt;channel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;session&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:channel&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;connected?&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;Endpoint&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;subscribe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"channel-updated:&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="ss"&gt;:ok&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;set_assigns&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;

  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(%{&lt;/span&gt;&lt;span class="ss"&gt;playback_url:&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;assigns&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; 
    &lt;span class="k"&gt;do&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;ChannelView&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"show.html"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;assigns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;assigns&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="no"&gt;SnitchWeb&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;ChannelView&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"show_active.html"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;assigns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;#&lt;/span&gt;
  &lt;span class="c1"&gt;# Since the mount/2 function called "subscribe" to with the identifier&lt;/span&gt;
  &lt;span class="c1"&gt;# "channel-updated:#{channel.id}" then anytime data is broadcast this&lt;/span&gt;
  &lt;span class="c1"&gt;# handle_info/2 function will run and we have the power to set new values&lt;/span&gt;
  &lt;span class="c1"&gt;# with set_assigns/2&lt;/span&gt;
  &lt;span class="c1"&gt;#&lt;/span&gt;
  &lt;span class="c1"&gt;# After we assign new values, the render/1 function will get called with the&lt;/span&gt;
  &lt;span class="c1"&gt;# new assigns&lt;/span&gt;
  &lt;span class="c1"&gt;#&lt;/span&gt;
  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;handle_info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="ss"&gt;:noreply&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;set_assigns&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;

  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="n"&gt;set_assigns&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="n"&gt;playback_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Snitch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="no"&gt;Channels&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;playback_url_for_channel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;socket&lt;/span&gt;
    &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;assign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;name:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;assign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;status:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;mux_resource&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"status"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;assign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;connected:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;mux_resource&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"connected"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;assign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;stream_key:&lt;/span&gt; &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stream_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="o"&gt;|&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;assign&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;playback_url:&lt;/span&gt; &lt;span class="n"&gt;playback_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;To summarize what’s happening above:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;live_render/3&lt;/code&gt; will invoke &lt;code&gt;mount/2&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;mount/2&lt;/code&gt; will subscribe using an identifier (&lt;code&gt;"channel-updated:#{channel.id}"&lt;/code&gt;) and call &lt;code&gt;set_assigns&lt;/code&gt; to set assigns for the view&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;render/1&lt;/code&gt; will get called with the assigns&lt;/li&gt;
&lt;li&gt;any time another part of the app broadcasts to &lt;code&gt;"channel-updated:#{channel.id}"&lt;/code&gt;, this view calls &lt;code&gt;handle_info/2&lt;/code&gt;, which gives us the opportunity to use &lt;code&gt;set_assigns&lt;/code&gt; again to update the assigns and re-render the template&lt;/li&gt;
&lt;li&gt;re-renders auto-magically get pushed to the client over a WebSocket and the client updates the DOM&lt;/li&gt;
&lt;/ul&gt;
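
&lt;p&gt;The broadcast that triggers &lt;code&gt;handle_info/2&lt;/code&gt; can come from anywhere in the app, for example from the webhook handler that updates a channel when Mux sends an event. Here's a minimal sketch of what that looks like (the &lt;code&gt;Snitch.PubSub&lt;/code&gt; server name is an assumption; check your app's supervision tree for the actual name):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Hypothetical sketch: broadcast the updated channel struct on the same
# topic the LiveView subscribed to in mount/2. Every subscribed LiveView
# process will receive the message in handle_info/2 and re-render.
Phoenix.PubSub.broadcast(
  Snitch.PubSub,
  "channel-updated:#{channel.id}",
  channel
)
&lt;/code&gt;&lt;/pre&gt;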

&lt;p&gt;The only difference with the &lt;code&gt;show&lt;/code&gt; and &lt;code&gt;show_active&lt;/code&gt; templates we use in LiveChannelView is that they use the &lt;code&gt;leex&lt;/code&gt; extension, which stands for live embedded Elixir, instead of the &lt;code&gt;eex&lt;/code&gt; extension.&lt;/p&gt;
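
&lt;p&gt;For reference, mounting a live view like this from a regular controller-rendered template typically looks something like the line below (the session key shown is illustrative, not necessarily what snitch uses):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;&amp;lt;%= live_render(@conn, SnitchWeb.LiveChannelView, session: %{channel_id: @channel.id}) %&amp;gt;
&lt;/code&gt;&lt;/pre&gt;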

&lt;p&gt;This is currently deployed as a webapp at &lt;a href="https://snitch.world"&gt;snitch.world&lt;/a&gt;. The full code is &lt;a href="https://github.com/dylanjha/snitch"&gt;on GitHub&lt;/a&gt;; you can clone it and run it yourself. You’ll also need to sign up for a free account with Mux to get an API key. Feel free to reach out if you have any questions!&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/3IQlN_Ax6mg"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>elixir</category>
    </item>
  </channel>
</rss>
