<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Red5</title>
    <description>The latest articles on DEV Community by Red5 (@red5).</description>
    <link>https://dev.to/red5</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F11406%2Fe4e2a767-0c82-4f76-8054-e34497ce52b3.jpg</url>
      <title>DEV Community: Red5</title>
      <link>https://dev.to/red5</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/red5"/>
    <language>en</language>
    <item>
      <title>Consensus on a MOQ Media Layer Player Framework Should Speed Market Adoption</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Fri, 01 May 2026 13:08:54 +0000</pubDate>
      <link>https://dev.to/red5/consensus-on-a-moq-media-layer-player-framework-should-speed-market-adoption-39m</link>
      <guid>https://dev.to/red5/consensus-on-a-moq-media-layer-player-framework-should-speed-market-adoption-39m</guid>
      <description>&lt;p&gt;&lt;strong&gt;Modular Red5 Template Is Open-Source Path to Multiple Streaming Formats&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;Introduction&lt;/h2&gt;

&lt;p&gt;Amid intensifying demand for the interoperable approach to next-generation streaming specified by the emerging MOQ standard, there’s a lot riding on the market’s ability to determine how payloads will be formatted in the MOQ Media Layer for distribution and playback.&lt;/p&gt;

&lt;p&gt;Happily, we can report that Red5 has garnered significant support for a way forward that relies on a MOQ player template we’re calling Playa. As a freely available open-source solution, Playa has the flexibility to accommodate configurations of players for device playback of virtually any MOQ-compatible streaming format operating over the MOQ Media Layer, which in International Organization for Standardization (ISO) parlance is the application layer that rides on the transport layer in internet communications.&lt;/p&gt;

&lt;p&gt;This has been a top priority in our work as a founding member of the OpenMOQ Software Consortium, an ad hoc body separated from but closely aligned with the Internet Engineering Task Force (&lt;a href="https://datatracker.ietf.org/group/moq/about/" rel="noopener noreferrer"&gt;IETF&lt;/a&gt;), which is overseeing MOQ standardization. Now, with &lt;a href="https://github.com/red5pro/moq-playa" rel="noopener noreferrer"&gt;Playa&lt;/a&gt; emerging as the OpenMOQ Consortium’s leading candidate for formatting MOQ stream playback, the stage is set for widespread experimentation that will expedite commercial rollouts once the MOQ Transport (MOQT) standard is finalized later this year.&lt;/p&gt;

&lt;p&gt;There’s been a lot of confusion about what MOQ is, owing to its origins as an acronym for Media over QUIC. That phrase has alternative meanings related to emerging proprietary platforms that rely on QUIC transport and to the use of QUIC as the default transport mode in version 3 of the conventional Hypertext Transfer Protocol (HTTP/3). IETF hopes to distinguish what it’s doing by using MOQ as a free-standing label for a new platform, which, as explained in our &lt;a href="https://www.red5.net/blog/what-is-moq-media-over-quic/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;What is MOQ blog&lt;/a&gt;, avoids the complex, error-prone request-response method used with the dominant HTTP streaming system while eliminating the setup and other complications intrinsic to real-time streaming via &lt;a href="https://www.red5.net/blog/what-is-webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;WebRTC&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Moreover, the new standard provides a way to automatically tune end-to-end latencies to whatever levels meet user requirements, including real-time interactive streaming use cases that need support for latencies registering at or below 300ms. And, already, MOQ has been made compatible with leading Web browsers, currently including Google Chrome, Microsoft Edge, Mozilla Firefox and Apple Safari, which largely eliminates the need for client plug-in software. &lt;/p&gt;

&lt;p&gt;Red5 and the &lt;a href="https://www.red5.net/blog/red5-joined-openmoq/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;other OpenMOQ Consortium founders&lt;/a&gt;, including Akamai, CDN77, Cisco, Synamedia and YouTube, are among a growing host of streaming providers and users worldwide who are wholeheartedly behind MOQ. We all recognize that an interoperable, tunable approach to structuring a new streaming network foundation is the fastest path to unleashing the full power of market forces seeking to exploit real-time video streaming at mass scales on an as-needed basis.&lt;/p&gt;

&lt;p&gt;The industry is moving rapidly toward proving the point with preparations for MOQ underway on many fronts. We’ve officially launched our MOQ capabilities and are starting with a &lt;a href="https://www.red5.net/blog/join-red5-moq-beta/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;limited beta&lt;/a&gt; that is not publicly available yet. Access is granted on an individual basis to make sure each deployment is aligned with a specific use case. This initial phase is intentionally selective so we can fine-tune the solution around the scenarios that matter most right now, including real-time latency and delivering to geographically distributed audiences at one-to-many scale. More about this in &lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7455609456068825088/" rel="noopener noreferrer"&gt;Chris Allen’s LinkedIn post&lt;/a&gt;. This will lead to full commercial rollout by mid-summer.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/contact/?text=Hi%2C%20I%E2%80%99d%20like%20to%20join%20your%20globally%20deployed%20MOQ%20network%20beta&amp;amp;utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;Join our MOQ Beta&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the same time, we’re proceeding full steam ahead with serving real-time streaming customers with our global implementations of &lt;a href="https://www.red5.net/webrtc-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;WebRTC&lt;/a&gt; transport utilizing &lt;a href="https://www.red5.net/blog/xdn-architecture-traditional-cdns-need-not-apply/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;XDN Architecture&lt;/a&gt; in our Red5 Cloud service and in customer-mounted multi-cloud iterations supported by &lt;a href="https://www.red5.net/live-streaming-sdks/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;Red5 SDKs&lt;/a&gt;. How all this works in bringing a broad array of use cases to life with our portfolio of &lt;a href="https://www.red5.net/truetime/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;TrueTime™ application toolsets&lt;/a&gt; is well documented on our website.&lt;/p&gt;

&lt;p&gt;In the discussion that follows, we explore where things stand with MOQ player development. Along with explaining Playa, we’ll look at other early initiatives and share thoughts about how the ones we’re familiar with might work with Playa. We also explain the steps Red5 has taken with our partners to enable early implementations of MOQ at global scales.&lt;/p&gt;

&lt;p&gt;The discussion concludes with consideration of what can be achieved with use of the standard based on current MOQ Transport specifications and whether some real-time use cases will be best left to continued reliance on WebRTC. As shall be seen, from where we sit today it looks like MOQ will play a major role in normalizing real-time streaming across the global consumer markets while leaving video calling along with many applications in the enterprise, &lt;a href="https://www.red5.net/solutions/drone-public-safety/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;public safety&lt;/a&gt;, &lt;a href="https://www.red5.net/solutions/air-gapped-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;military&lt;/a&gt; and other &lt;a href="https://www.red5.net/solutions/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;commercial domains&lt;/a&gt; to execution over WebRTC transport.&lt;/p&gt;

&lt;h2&gt;The State of Play in MOQ Evolution&lt;/h2&gt;

&lt;p&gt;Standards, especially those like MOQ built on open-source &lt;a href="https://www.red5.net/video-streaming-technology/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;technology&lt;/a&gt;, ensure the interoperability among whole systems and their components that’s essential to expediting adoption of new approaches to operating over the internet. But, as a community-driven process involving an unlimited flow of tech contributions and opinions, standards-building typically takes a long time.&lt;/p&gt;

&lt;p&gt;That hasn’t been the case with MOQ, which has reached an advanced stage of development in a remarkably short time, going from initiation to near completion in less than four years. MOQ Transport, the foundation for the new platform, is going through final revisions with expectations that the standard will be finalized by mid-summer. &lt;/p&gt;

&lt;p&gt;As described in the &lt;a href="https://www.red5.net/blog/what-is-moq-media-over-quic/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;What is MOQ&lt;/a&gt; blog we referenced in the introduction, MOQ Transport determines how sessions linking end users to servers are set up and terminated and how binary-coded descriptions of payload segment parameters, destination IDs, time stamps and routing directions are conveyed in transport packet headers at the front end of the payload segments. The platform relies on a system of relay nodes that allow any given stream to be fanned out from a single source to any number of end users with minimal use of processing resources along the way, which greatly expands the volume of streams that can be handled by relay servers.&lt;/p&gt;
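
&lt;p&gt;The relay fan-out described above can be sketched as a toy model. This is purely illustrative (the class and track names are hypothetical, not from the MOQT drafts): the relay forwards each object verbatim to every subscriber, which is why per-stream processing cost at the relay stays low as the audience grows.&lt;/p&gt;

```python
# Toy sketch of MOQ-style relay fan-out: a relay receives each object once
# and forwards it unchanged to every subscriber, with no per-subscriber
# decode/encode work. Names are hypothetical, for illustration only.

class Relay:
    def __init__(self):
        self.subscribers = {}   # track name -> list of delivery callbacks

    def subscribe(self, track, deliver):
        self.subscribers.setdefault(track, []).append(deliver)

    def on_object(self, track, obj):
        # Forward verbatim: the payload bytes are never re-processed.
        for deliver in self.subscribers.get(track, []):
            deliver(obj)

relay = Relay()
seen = []
relay.subscribe("match-cam1", seen.append)
relay.subscribe("match-cam1", seen.append)
relay.on_object("match-cam1", b"segment-0")
print(seen)  # [b'segment-0', b'segment-0']
```

&lt;p&gt;Because the relay only copies bytes, adding a subscriber adds delivery work but no media processing, which is what lets a single source fan out to large audiences.&lt;/p&gt;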

&lt;p&gt;Adding to the versatility, tunable latencies range from real-time at 200-400ms to what developers call “interactive live” at 2 seconds, which might be sufficient for some interactive applications, to “conservative live” maintaining persistent HD, 4K or, eventually, 8K quality at 5 seconds. Moreover, CDN operators can design their relay caches to support recording live content for short-term replay and catchup, and there’s also support for bringing long-term storage into play for sending live content to VOD archives and cloud DVR platforms. &lt;/p&gt;
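
&lt;p&gt;As a rough illustration of the tiered tuning described above, here is a hypothetical helper that maps a latency target to the tier names used in this post. The thresholds mirror the figures quoted here; they are not normative values from the MOQT drafts.&lt;/p&gt;

```python
# Hypothetical sketch: mapping a requested end-to-end latency target to the
# latency tiers named above. Tier names and thresholds come from this post's
# figures, not from the MOQT specification itself.

def select_latency_tier(target_ms):
    """Pick the loosest tier whose budget still meets the target."""
    if target_ms >= 5000:
        return "conservative-live"   # ~5 s, persistent HD/4K/8K quality
    if target_ms >= 2000:
        return "interactive-live"    # ~2 s, enough for some interactivity
    return "real-time"               # 200-400 ms, real-time interactive use

print(select_latency_tier(300))   # real-time
print(select_latency_tier(6000))  # conservative-live
```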

&lt;p&gt;At the media layer there’s no limit to the variety of streaming formats that can be devised for MOQ instances, because the MOQ Media Layer is decoupled from the transport layer, which allows CDN operators to offer MOQ Transport as a service while freeing their customers to configure streaming formats for any use cases they want to support. This opens opportunities not only for streaming formats targeting mass market applications but for niche formats as well, including video-free versions such as might be needed for chat services, autonomous vehicle operations, industrial IoT or smart city management.&lt;/p&gt;

&lt;p&gt;In the case of mass market video streaming applications, MOQ Media Layer-compatible streaming formats will define how the streamed A/V and ancillary elements conveying captioning, personalized and commonly shared features and ads, and other applications are compressed, encrypted and packaged for playback by media players running on receiving devices. IETF is developing a two-pronged standard for one such streaming format, formerly encapsulated as a single format called the &lt;a href="https://www.ietf.org/archive/id/draft-law-moq-warpstreamingformat-00.html?utm%5Fsource=chatgpt.com" rel="noopener noreferrer"&gt;WARP Streaming Format&lt;/a&gt; but now divided to accommodate two versions for use with and without the package framing known as Common Media Application Format (CMAF).&lt;/p&gt;

&lt;p&gt;The different approaches to CMAF are the only thing that distinguishes what’s now known as MOQ Streaming Format (MSF) from the version dubbed &lt;a href="https://datatracker.ietf.org/doc/html/draft-ietf-moq-cmsf-00?utm%5Fsource=chatgpt.com" rel="noopener noreferrer"&gt;CMSF&lt;/a&gt;, where the “C” stands for &lt;a href="https://www.red5.net/blog/what-is-cmaf/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;CMAF&lt;/a&gt;. Both versions are meant to provide an IETF-standardized version of a MOQ streaming format that can support the preponderance of use cases that will be flowing over MOQ Transport.&lt;/p&gt;

&lt;p&gt;As described by the IETF’s MOQ Working Group, MSF targets real-time and interactive levels of live payloads as well as VOD content by using the IETF’s &lt;a href="https://datatracker.ietf.org/doc/html/draft-ietf-moq-loc-01" rel="noopener noreferrer"&gt;Low Overhead Media Container&lt;/a&gt; (LOC) protocol as a lightweight approach to stream-layer packaging that aligns with media formats using WebCodecs, a standard defining interfaces to the encoders and decoders used in internet communications. CMSF adds CMAF as an optional alternative to relying on LOC.&lt;/p&gt;

&lt;p&gt;Both define how what’s known as a catalog communicates information describing publishers’ output, specifying:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;how content should be packaged, encrypted and signaled;&lt;/li&gt;
&lt;li&gt;the use of latencies and the level of prioritization accorded real-time transmissions;&lt;/li&gt;
&lt;li&gt;details pertaining to broadcast workflows and how they’re initiated and terminated, and&lt;/li&gt;
&lt;li&gt;the parameters directing devices’ execution of ABR profile switching.&lt;/li&gt;
&lt;/ul&gt;
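
&lt;p&gt;To make the catalog’s role concrete, here is a toy example of the kind of per-track metadata a catalog might carry, along with a helper that derives an ABR ladder from it. The field names are illustrative assumptions, not taken from the IETF catalog drafts.&lt;/p&gt;

```python
import json

# Illustrative only: a toy catalog showing the kind of per-track metadata
# the bullets above describe. Field names are hypothetical, not from the
# IETF catalog drafts.
catalog_json = """
{
  "version": 1,
  "tracks": [
    {"name": "video-hd", "codec": "avc1.64001f", "bitrate": 4500000,
     "packaging": "loc", "priority": "real-time"},
    {"name": "video-sd", "codec": "avc1.64001e", "bitrate": 1200000,
     "packaging": "loc", "priority": "real-time"},
    {"name": "audio",    "codec": "opus",        "bitrate": 128000,
     "packaging": "loc", "priority": "real-time"}
  ]
}
"""

def abr_ladder(catalog):
    """Return video track names sorted high-to-low bitrate for ABR switching."""
    video = [t for t in catalog["tracks"] if t["name"].startswith("video")]
    video.sort(key=lambda t: t["bitrate"], reverse=True)
    return [t["name"] for t in video]

catalog = json.loads(catalog_json)
print(abr_ladder(catalog))  # ['video-hd', 'video-sd']
```

&lt;p&gt;A real catalog would also carry the encryption, packaging and workflow signaling listed above.&lt;/p&gt;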

&lt;p&gt;As to what other MOQ streaming formats might be in the offing, much will be revealed as participants in the development process introduce media players tailored to their constituents’ needs. The good news is that, at this point, Red5 and others can confidently implement infrastructure for testing MOQ applications at the transport level knowing that users will have access to MOQ media players that can execute playback of their payloads in an open-source, interoperable environment that maximizes their market reach.&lt;/p&gt;

&lt;h2&gt;Playa and MOQ Player Development&lt;/h2&gt;

&lt;p&gt;This is where the efforts of the OpenMOQ Software Consortium are paying off with its goal of fostering collaboration on development and accelerated deployment of open-source solutions tied to MOQ Transport. Along with the founding members listed in the introduction, consortium membership has expanded to include Bitmovin, qualabs, Vindral, Wowza, &lt;a href="https://www.aau.at/en/" rel="noopener noreferrer"&gt;Austria’s University of Klagenfurt&lt;/a&gt;, and &lt;a href="https://www.ozyegin.edu.tr/" rel="noopener noreferrer"&gt;Özyeğin University&lt;/a&gt; in Istanbul.&lt;/p&gt;

&lt;p&gt;Majority consensus among these key players that our new Playa open-source software stack provides a flexible template for tailoring browser-supported players to specific streaming formats signals that there’s now a way forward for working with MOQ in the real world.&lt;/p&gt;

&lt;p&gt;Consortium support fuels our work with MOQ developers within and outside the consortium who appreciate what the functionalities embodied in Playa architecture mean to their own player development goals. &lt;/p&gt;

&lt;h3&gt;Six Known Options&lt;/h3&gt;

&lt;p&gt;At present we’re aware of six initiatives that have produced or are close to completing MOQ players. Some are well known to us and others we’re still waiting to learn more about. As &lt;a href="https://www.red5.net/blog/6-moq-players-you-need-to-know-about/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;described in this blog&lt;/a&gt;, the list includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/moq-dev/moq/tree/main/js" rel="noopener noreferrer"&gt;Moq-js&lt;/a&gt;, which is the player for &lt;a href="https://doc.moq.dev/concept/layer/moq-lite#compatibility" rel="noopener noreferrer"&gt;MOQ Lite&lt;/a&gt; – As its name implies, MOQ Lite is a subset of MOQ Transport developed by former Twitch and Discord engineer Luke Curley to support MOQ sessions while eliminating some requirements in the protocol stack without undermining the basic functionalities supporting live multidirectional streaming. Most notably, it eliminates the MOQT Fetch process used for VOD and DVR scrubbing, falling back instead on HTTP streaming architecture for those applications. We’ve been working with Curley to ensure compatibility of the Moq-js player with Playa. It appears that CDN operators who want to take advantage of MOQ Lite and the full implementation of MOQT will need to dedicate resources specific to each.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/facebookexperimental/moq-encoder-player" rel="noopener noreferrer"&gt;Moq-encoder-player&lt;/a&gt; by Meta – This is another project Red5 has been working with, in this case to facilitate Meta’s development of a browser-supported player that’s devoted to implementing a live video and audio encoder to be used in creating and consuming MOQ streams. As currently constituted, the player is meant to provide a minimal platform to help with testing MOQ interoperability. But the project means Meta is likely to be putting its considerable clout behind MOQ.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://bitmovin.com/player-web-x/" rel="noopener noreferrer"&gt;Player Web X&lt;/a&gt; by Bitmovin – This is the player framework Bitmovin has created for developers to use as a way to build players with superior speed and performance efficiency in multiple streaming domains, including legacy HLS and DASH as well as MOQ. Bitmovin has announced Player Web X will be available for use with the MOQ relay system Cloudflare has implemented on its global CDN, presumably in compatibility with MSF and possibly other MOQ streaming formats as they emerge. While the OpenMOQ Consortium’s members, including Bitmovin, are committed to the open-source agenda pertaining to creating a streaming format running over MOQT, the mandate leaves room for use of proprietary players or other extensions that members bring to the table. At this point the long-standing Player Web X framework is not open-sourced, but it’s possible that at least some aspects of the MOQ version could make use of the Playa template to streamline its interactions with MOQT.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/moqtail/moqtail" rel="noopener noreferrer"&gt;MOQtail&lt;/a&gt; by a team affiliated with consortium member Özyeğin University and led by Professor Ali C. Begen – Now on Draft 14, MOQtail is one of the longest-running MOQ player projects, with a foundation in Apache 2.0 licensing. While it has not gained much traction with community adoption, it has benefitted from sponsorship provided by Akamai, AWS and Cisco. One of the player’s distinguishing characteristics is that it supports both MSF and CMSF, the CMAF-based variant described earlier. The IETF MOQ Working Group has dropped CMSF from its MSF specifications, but, as described below, we provide support for CMAF in Playa. It remains to be seen whether MOQtail ends up adopting the Playa framework.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/Eyevinn/warp-player" rel="noopener noreferrer"&gt;WARP Player&lt;/a&gt; by &lt;a href="https://www.eyevinntechnology.se/" rel="noopener noreferrer"&gt;Eyevinn Technology&lt;/a&gt; – This is a fairly new project initiated by Eyevinn, an M&amp;amp;E-focused video streaming technology consultancy and platform builder based in Stockholm. The player is designed to work only with CMAF-compatible streaming formats, including CMSF. We’ve not had any interactions with the Eyevinn development team to assess its thinking about working within the Playa framework.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/shaka-project/shaka-player/commit/ef361ed03995b7591b4aa3210c4f9aed7e4fec67" rel="noopener noreferrer"&gt;Shaka Player&lt;/a&gt; – This is a general-purpose player initiative originally undertaken as an open-source project at Google centered on support for HLS and DASH in Web, Android and TV playback scenarios. With the introduction of proprietary elements deemed beneficial to the player, Google relinquished control to an independent community of engineers, who introduced support for MOQ in Q1 2026 in what is now known as the Shaka Project, which also includes the Shaka Packager and Streamer. The effort is aimed at ensuring that cutting-edge advancements like surround sound can be implemented with MOQ player support compatible with CMSF and licensed through Apache 2.0.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We also note there’s work being done within the consortium on Media Layer components targeted specifically to the contribution segment of MOQ Transport, involving publishers who feed their content to multiple affiliates for distribution to their audiences. This area of development, unrelated to the distribution leg and client players, involves consortium co-founder Synamedia, which is introducing a unified playout platform capable of delivering publishers’ content via MOQ Transport to affiliates in whatever mode they use to reach end users, whether via MOQ, HLS and DASH streams or traditional MPEG TV channels.&lt;/p&gt;

&lt;h3&gt;The Playa Connection&lt;/h3&gt;

&lt;p&gt;Now, with &lt;a href="https://github.com/red5pro/moq-playa" rel="noopener noreferrer"&gt;Playa&lt;/a&gt; serving as the OpenMOQ Consortium-endorsed framework for MOQ player development, all of these initiatives and any others that come along can benefit from a structure that modularly encompasses all the elements that might be needed to build a MOQ player. Developers can construct their pipelines using whatever components they need with a great deal of flexibility in the selection of external tools for execution of Playa-compatible functionalities.&lt;/p&gt;

&lt;p&gt;In other words, there’s something for everyone, including developers of MOQ streaming formats that no one in the community is yet aware of. Following is a brief summary of how Playa makes this possible, all in the context of conforming to MOQT specifications.&lt;/p&gt;

&lt;p&gt;Along with supporting modularly independent use of components, we adhered to design principles that include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Testability of processing and pipeline logic independent of platform-specific decoding and API rendering,&lt;/li&gt;
&lt;li&gt;Framework compatibility with independent UIs and state-based frameworks used in providing rendering targets,&lt;/li&gt;
&lt;li&gt;Extensibility through use of pluggable extension points for object transforms, recovery policies and handling application-specific events.&lt;/li&gt;
&lt;/ul&gt;
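
&lt;p&gt;As a minimal sketch of the extensibility principle, the following hypothetical pipeline accepts object transforms and a recovery policy as plain callables. None of these names come from the actual Playa codebase; they only illustrate the pluggable-extension idea.&lt;/p&gt;

```python
# Minimal sketch of pluggable extension points: object transforms and a
# recovery policy supplied as plain callables. Hypothetical names, for
# illustration only; this is not Playa's API.

class Pipeline:
    def __init__(self, transforms=None, on_error=None):
        self.transforms = transforms or []
        self.on_error = on_error or (lambda exc: None)  # default recovery: drop

    def process(self, obj):
        try:
            for transform in self.transforms:
                obj = transform(obj)
            return obj
        except Exception as exc:
            # Escalate to the pluggable recovery policy.
            return self.on_error(exc)

# Example plugin: tag each object as it passes through.
def stamp(obj):
    obj["stamped"] = True
    return obj

pipe = Pipeline(transforms=[stamp])
print(pipe.process({"seq": 1}))  # {'seq': 1, 'stamped': True}
```

&lt;p&gt;Keeping the processing logic in plain callables like this is also what makes it testable independent of platform-specific decoding and rendering.&lt;/p&gt;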

&lt;p&gt;The Playa architecture defines functional MOQ player blocks related to: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Managing MOQ Transport with support for both QUIC and WebTransport as well as stream multiplexing,&lt;/li&gt;
&lt;li&gt;Session management,&lt;/li&gt;
&lt;li&gt;Catalog parsing and track enumeration,&lt;/li&gt;
&lt;li&gt;Managing packaging,&lt;/li&gt;
&lt;li&gt;Maintaining stable performance over the media pipeline through jitter buffering, gap detection, A/V synchronization and control over decoder states,&lt;/li&gt;
&lt;li&gt;Rendering video and audio with frame timing,&lt;/li&gt;
&lt;li&gt;Quality control with ABR track selection and switching,&lt;/li&gt;
&lt;li&gt;Recovery involving prescribed modes of error detection, escalation and reconnection.&lt;/li&gt;
&lt;/ul&gt;
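
&lt;p&gt;The jitter-buffering and gap-detection block can be illustrated with a toy example: objects may arrive out of order, and the buffer releases them in sequence while holding back at gaps. This is a sketch of the concept, not Playa’s implementation.&lt;/p&gt;

```python
# Toy jitter buffer: reorders objects by sequence number, holds back at gaps,
# and drops duplicates. Illustrative only; a real player would also time out
# on gaps and resynchronize A/V.
import heapq

class JitterBuffer:
    def __init__(self):
        self.heap = []
        self.next_seq = 0

    def push(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))

    def pop_ready(self):
        """Release all in-order payloads; stop at the first missing sequence."""
        out = []
        while self.heap:
            seq, payload = self.heap[0]
            if seq > self.next_seq:
                break  # gap: wait (a real player would eventually skip ahead)
            heapq.heappop(self.heap)
            if seq == self.next_seq:
                out.append(payload)
                self.next_seq = seq + 1
            # else: duplicate or late object, silently dropped
        return out

buf = JitterBuffer()
buf.push(1, "b")
buf.push(0, "a")
print(buf.pop_ready())  # ['a', 'b']
```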

&lt;p&gt;Some other highlights include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Packaging – The player must support either LOC or CMAF but preferably both.&lt;/li&gt;
&lt;li&gt;Access authentication – This should follow principles defined for MOQ Relay Requirements with support for CAT-4-MOQ token usage and/or Privacy Pass Authentication recommended as well.&lt;/li&gt;
&lt;li&gt;Security – All connections must use TLS 1.3+ as required by QUIC, and there should be support for securing development modes, such as self-signed certificate hashes for WebTransport.&lt;/li&gt;
&lt;li&gt;MOQT version support – The player must support at least version 14 or 16 of the latest MOQT drafts, with both recommended to ensure maximum interoperability during specification evolution.&lt;/li&gt;
&lt;li&gt;Observability – The player should expose operational metrics (time to first frame, stall count, latency, quality switches) and support event tracing per the MOQT qlog specification. Real-time delivery quality diagnostics (jitter, latency) are recommended for relay assessment and operational monitoring.&lt;/li&gt;
&lt;/ul&gt;
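
&lt;p&gt;The observability bullet can be sketched as a small metrics collector covering a few of the named signals (time to first frame, stall count, quality switches). The metric names here are illustrative and do not follow the MOQT qlog schema.&lt;/p&gt;

```python
# Sketch of player observability: collect a few of the operational metrics
# named above. Metric names are illustrative, not the MOQT qlog schema.

class PlayerMetrics:
    def __init__(self, start_ts):
        self.start_ts = start_ts
        self.first_frame_ts = None
        self.stalls = 0
        self.quality_switches = 0

    def on_frame(self, ts):
        if self.first_frame_ts is None:
            self.first_frame_ts = ts  # capture time to first frame once

    def on_stall(self):
        self.stalls += 1

    def on_switch(self):
        self.quality_switches += 1

    def snapshot(self):
        ttff = None
        if self.first_frame_ts is not None:
            ttff = self.first_frame_ts - self.start_ts
        return {"ttff_ms": ttff, "stalls": self.stalls,
                "switches": self.quality_switches}

m = PlayerMetrics(start_ts=0)
m.on_frame(ts=180)
m.on_stall()
print(m.snapshot())  # {'ttff_ms': 180, 'stalls': 1, 'switches': 0}
```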

&lt;h2&gt;Red5’s Role in Enabling Use of MOQ over Partner CDNs&lt;/h2&gt;

&lt;p&gt;As of Q2 2026, the time has come for the beta testing ahead of MOQ Transport standard approval that will allow early adopters to quickly implement full-scale commercial operations. We can assume that any tweaks that might arise with finalization of the standard can be accommodated without disrupting what we and others are putting in place now.&lt;/p&gt;

&lt;p&gt;As mentioned earlier, we’ve begun supplying global reach for MOQ operations over CacheFly’s CDN with an eye toward expanding the XDN-anchored MOQ CDN ecosystem over time. We’re employing XDN multiprotocol ingestion and &lt;a href="https://www.red5.net/blog/what-is-transcoding/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;transcoding&lt;/a&gt; technology to enable MOQ-packaged payloads delivered from sources over Web, &lt;a href="https://www.red5.net/srt-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;SRT&lt;/a&gt;, &lt;a href="https://www.red5.net/zixi-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;Zixi&lt;/a&gt;, &lt;a href="https://www.red5.net/rtmp-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;RTMP&lt;/a&gt;, &lt;a href="https://www.red5.net/rtsp-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;RTSP&lt;/a&gt; or MPEG-TS streams to be ingested onto CDNs with multiple bitrate profiles matched to adaptive bitrate (ABR) ladders used in conventional streaming. &lt;/p&gt;

&lt;p&gt;In CacheFly’s case the CDN is integrated with Red5-supplied MOQ relay nodes that enable multicasting of the payloads in real time to all session-targeted regions served by its network. Red5 Cloud MOQ customers’ end users will be served over access networks for playback by devices equipped with our MOQ player software. &lt;/p&gt;

&lt;p&gt;MOQ Transport and Media Layer integrations with XDN Architecture on these and future partner CDNs make it possible for customers experimenting with MOQ to achieve the full range of multidirectional real-time streaming capabilities we’ve long supported with our use of WebRTC. &lt;a href="https://www.red5.net/blog/keys-to-optimizing-end-to-end-latency-with-webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;As explained at length in this blog&lt;/a&gt; and elsewhere on our website, execution of WebRTC transport on XDN Architecture enables multidirectional streaming with end-to-end latencies registering at 250ms or less. &lt;/p&gt;

&lt;h2&gt;Red5’s Support for Optimal Hybrid Use of MOQT with WebRTC&lt;/h2&gt;

&lt;p&gt;The need for such MOQ/WebRTC integrations reflects the fact that a fully standardized environment for working with MOQ remains a work in progress, including further clarification as to the range of use cases that MOQ will support. At this point, MOQ doesn’t support the echo and other noise cancellations essential to videoconferencing, and the mechanisms for bringing users’ video outputs into synchronized operation with the multicast MOQ streams have yet to be fully articulated in the MOQ Transport specifications. &lt;/p&gt;

&lt;p&gt;By applying Red5’s mixer and transcoding solutions across the MOQ and WebRTC stream flows while enabling fallback to &lt;a href="https://www.red5.net/hls-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;HLS&lt;/a&gt; when devices aren’t using browsers or plugins supporting the other protocols, XDN Architecture serves as the protocol-agnostic glue that maximizes streaming flexibility. Customers using CacheFly or any other CDNs we partner with can rely on MOQ instead of WebRTC for unidirectional real-time streaming at massive scales while seamlessly bringing WebRTC and our &lt;a href="https://www.red5.net/truetime/meetings/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;TrueTime Meetings™ &lt;/a&gt;platform into play in any given session to enable synchronized real-time video calling among subsets of users.&lt;/p&gt;

&lt;p&gt;This makes it easy, for example, to serve a mass audience with live-streamed sports payloads packaged in the MOQ Media Layer while adding seamlessly initiated watch party features supported by WebRTC. In fact, Red5 Cloud and Red5 Pro customers leveraging MOQ over these CDNs can implement any service strategy enabled by the wide range of real-time interactive streaming applications supported by our TrueTime Solutions™ and other innovations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The progress in MOQT standardization and support for payload configurations and playback over the MOQ Media Layer has set the stage for wide-scale testing in the run-up to commercial rollouts. With no limit on the variety of streaming formats and players that can be developed using the framework embodied in the modularized Playa template, every segment of the vast video streaming ecosystem, from the mass consumer markets to the smallest enterprise and institutional niches, can now prepare to benefit from a standardized approach to next-generation streaming that removes the latency impediments of the past.&lt;/p&gt;

&lt;p&gt;But there’s no denying it will take a good amount of time before support for MOQ-based streaming is as readily available as today’s HTTP-based streaming infrastructure. And it remains to be seen how far MOQ can go toward seamlessly combining real-time interactive video communications with unidirectional streaming on par with what can be done with Red5’s TrueTime Meetings™ or any other multidirectional video implementation over WebRTC on XDN infrastructure.&lt;/p&gt;

&lt;p&gt;Our goal in throwing full support behind MOQ is to ensure that no customer has to await the emergence of an optimal all-MOQ environment, if there ever is one, to achieve their goals with real-time streaming. Whatever the use case might be, the option to employ XDN Architecture, through the Red5 Cloud service or the Red5 SDKs, is immediately at hand at any scale, with the ability to begin testing MOQ now and transition to it to whatever extent makes sense.&lt;/p&gt;

&lt;p&gt;For now, with our multiprotocol streaming support extending to distribution over HTTP-based protocols like HLS, DASH, LL-HLS, and LL-DASH &lt;a href="https://www.red5.net/case-studies/red5-cloud-and-cachefly-hls-streaming-solution-with-dvr-and-global-cdn-reach/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Consensus%20on%20a%20MOQ%20Media%20Layer%20Player%20Framework%20Should%20Speed%20Market%20Adoption" rel="noopener noreferrer"&gt;via our partnership with CacheFly&lt;/a&gt;, XDN Architecture provides the protocol-agnostic platform that can eliminate the operational silos of the past. And once MOQ streaming infrastructure with support for tunable latencies becomes ubiquitously available, XDN Architecture will continue to provide the optimal operational environment for getting the most out of MOQ.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How to Connect Your Drone or IP Camera Feeds to Red5</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Wed, 29 Apr 2026 21:36:58 +0000</pubDate>
      <link>https://dev.to/red5/how-to-connect-your-drone-or-ip-camera-feeds-to-red5-1a7f</link>
      <guid>https://dev.to/red5/how-to-connect-your-drone-or-ip-camera-feeds-to-red5-1a7f</guid>
      <description>&lt;p&gt;Since it comes up so often during the course of my week, I want to share a few simple options with interested parties for publishing your streams to Red5. This covers all the &lt;a href="https://www.red5.net/blog/drone-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;drones&lt;/a&gt; and &lt;a href="https://www.red5.net/blog/ip-camera-live-streaming-rtsp-to-webrtc?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;IP cameras&lt;/a&gt; that I’ve had exposure to over the years that do not already have a means to egress via &lt;a href="https://www.red5.net/blog/what-is-rtmp-streaming-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;RTMP&lt;/a&gt;. While Flash Player is “dead,” &lt;a href="https://www.red5.net/blog/what-is-rtmp-streaming-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;RTMP as a protocol&lt;/a&gt; certainly is not.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using GStreamer for Stream Processing
&lt;/h2&gt;

&lt;p&gt;First up is to download the &lt;a href="https://gstreamer.freedesktop.org/" rel="noopener noreferrer"&gt;GStreamer&lt;/a&gt; application, a powerful multimedia framework that can handle the conversion and streaming process. To learn more about it, read our previous blog on using &lt;a href="https://www.red5.net/blog/gstreamer-for-low-latency-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;GStreamer for low-latency streaming&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Discovering Your Stream Format
&lt;/h2&gt;

&lt;p&gt;If you don’t know the format of the video and/or audio (if present) on your drone, the &lt;code&gt;discoverer&lt;/code&gt; application can be used to see what you’re working with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gst-discoverer-1.0 rtsp://10.0.0.10:554/stream
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;10.0.0.10:554&lt;/code&gt; in the RTSP stream URL with your drone or camera’s IP address and port.&lt;/p&gt;

&lt;h2&gt;
  
  
  Video-Only Streams
&lt;/h2&gt;

&lt;p&gt;When the source only provides video, this is the &lt;code&gt;pipeline&lt;/code&gt; you’d use (assuming &lt;a href="https://www.red5.net/h264/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;H.264&lt;/a&gt;):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gst-launch-1.0 rtspsrc &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtsp://10.0.0.10:554/stream"&lt;/span&gt; &lt;span class="nv"&gt;latency&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;0 &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;rtsp &lt;span class="o"&gt;!&lt;/span&gt; rtph264depay &lt;span class="o"&gt;!&lt;/span&gt; h264parse &lt;span class="o"&gt;!&lt;/span&gt; video/x-h264 &lt;span class="o"&gt;!&lt;/span&gt; queue &lt;span class="o"&gt;!&lt;/span&gt; flvmux &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;mux &lt;span class="o"&gt;!&lt;/span&gt; rtmpsink &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtmp://10.0.0.35:1935/live/drone1 live=1"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Audio-Only Streams
&lt;/h2&gt;

&lt;p&gt;If the source is just audio, use this pipeline assuming an AAC &lt;a href="https://www.red5.net/blog/what-is-a-codec/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;codec&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gst-launch-1.0 rtspsrc &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtsp://10.0.0.221:554/stream"&lt;/span&gt; &lt;span class="nv"&gt;latency&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;0 &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;rtsp &lt;span class="o"&gt;!&lt;/span&gt; rtpmp4gdepay &lt;span class="o"&gt;!&lt;/span&gt; aacparse &lt;span class="o"&gt;!&lt;/span&gt; audio/mpeg &lt;span class="o"&gt;!&lt;/span&gt; queue &lt;span class="o"&gt;!&lt;/span&gt; flvmux &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;mux &lt;span class="o"&gt;!&lt;/span&gt; rtmpsink &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtmp://10.0.0.35:1935/live/remotemic1 live=1"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Combined Audio and Video Streams
&lt;/h2&gt;

&lt;p&gt;Finally, for sources providing both audio and video:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gst-launch-1.0 &lt;span class="nt"&gt;-v&lt;/span&gt; rtspsrc &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtsp://10.0.0.10:554/stream"&lt;/span&gt; &lt;span class="nv"&gt;latency&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;0 &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;rtsp rtsp. &lt;span class="o"&gt;!&lt;/span&gt; rtph264depay &lt;span class="o"&gt;!&lt;/span&gt; h264parse &lt;span class="o"&gt;!&lt;/span&gt; video/x-h264 &lt;span class="o"&gt;!&lt;/span&gt; queue &lt;span class="o"&gt;!&lt;/span&gt; mux.video rtsp. &lt;span class="o"&gt;!&lt;/span&gt; rtpmp4gdepay &lt;span class="o"&gt;!&lt;/span&gt; aacparse &lt;span class="o"&gt;!&lt;/span&gt; audio/mpeg &lt;span class="o"&gt;!&lt;/span&gt; queue &lt;span class="o"&gt;!&lt;/span&gt; mux.audio flvmux &lt;span class="nv"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;mux &lt;span class="o"&gt;!&lt;/span&gt; rtmpsink &lt;span class="nv"&gt;location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"rtmp://10.0.0.35:1935/live/audvid1 live=1"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
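&lt;p&gt;If you need to manage these pipelines programmatically, for example restarting a camera relay when it drops, you can launch &lt;code&gt;gst-launch-1.0&lt;/code&gt; from a small wrapper script. The sketch below is illustrative only: it rebuilds the video-only pipeline from above as an argument list, and the &lt;code&gt;build_pipeline&lt;/code&gt; helper and its parameters are our own naming, not part of GStreamer or Red5.&lt;/p&gt;

```python
def build_pipeline(rtsp_url: str, rtmp_url: str) -> list[str]:
    """Build the gst-launch-1.0 argument list for a video-only H.264 relay.

    Mirrors the shell pipeline shown above: depayload RTP H.264 from the
    RTSP source, parse it, mux into FLV, and push to an RTMP sink.
    """
    return [
        "gst-launch-1.0",
        "rtspsrc", f"location={rtsp_url}", "latency=0", "name=rtsp", "!",
        "rtph264depay", "!", "h264parse", "!", "video/x-h264", "!", "queue", "!",
        "flvmux", "name=mux", "!",
        "rtmpsink", f"location={rtmp_url} live=1",
    ]

if __name__ == "__main__":
    cmd = build_pipeline("rtsp://10.0.0.10:554/stream",
                         "rtmp://10.0.0.35:1935/live/drone1")
    print(" ".join(cmd))
    # To actually run it (requires GStreamer with the rtsp/flv/rtmp plugins):
    # import subprocess; subprocess.run(cmd, check=True)
```

&lt;p&gt;From there it’s straightforward to add retry loops or per-device configuration around the same helper.&lt;/p&gt;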



&lt;h2&gt;
  
  
  Alternative: Red5 Restream API
&lt;/h2&gt;

&lt;p&gt;This example demonstrates the simplest means to get your source devices providing egress via &lt;a href="https://www.red5.net/blog/4-reasons-rtsp-streaming-is-still-relevant/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;RTSP&lt;/a&gt; into Red5 without &lt;a href="https://www.red5.net/blog/what-is-rtmp-streaming-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;RTMP&lt;/a&gt;, using the &lt;a href="https://www.red5.net/docs/red5-pro/development/api/restreamer/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;Red5 Restream API&lt;/a&gt; via curl:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;curl&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;-X&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;POST&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;http://&lt;/span&gt;&lt;span class="mf"&gt;10.0&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="mf"&gt;0.35&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;&lt;span class="mi"&gt;5080&lt;/span&gt;&lt;span class="err"&gt;/live/restream&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;-H&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;-d&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;'&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"guid"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"live/audvid1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"context"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"live"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"audvid1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"level"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"parameters"&lt;/span&gt;&lt;span class="p"&gt;:{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"ipcam"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"create"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"remoteContextPath"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"stream"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"remoteStreamName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"host"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"10.0.0.10"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"port"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="mi"&gt;554&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;'&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
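&lt;p&gt;If you’d rather trigger the restream from application code than from the command line, the same request can be issued with nothing but the Python standard library. This is a sketch of the equivalent call: the &lt;code&gt;build_restream_request&lt;/code&gt; helper is our own naming, and the addresses and payload simply mirror the curl example, so adapt them to your environment.&lt;/p&gt;

```python
import json
import urllib.request

def build_restream_request(server: str, payload: dict) -> urllib.request.Request:
    """Create the POST request for the Red5 Restream API endpoint."""
    return urllib.request.Request(
        url=f"http://{server}/live/restream",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Same payload as the curl example above.
payload = {
    "guid": "live/audvid1",
    "context": "live",
    "name": "audvid1",
    "level": 0,
    "parameters": {
        "type": "ipcam",
        "action": "create",
        "remoteContextPath": "stream",
        "remoteStreamName": "",
        "host": "10.0.0.10",
        "port": 554,
    },
}

req = build_restream_request("10.0.0.35:5080", payload)
# urllib.request.urlopen(req)  # send it; requires a reachable Red5 server
```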



&lt;h2&gt;
  
  
  Final Notes
&lt;/h2&gt;

&lt;p&gt;These examples use private, non-routable IP addresses; change the fields to fit your devices and environment. For more detailed information and advanced configurations, see &lt;a href="https://www.red5.net/docs/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=How%20to%20Connect%20Your%20Drone%20or%20IP%20Camera%20Feeds%20to%20Red5" rel="noopener noreferrer"&gt;our documentation&lt;/a&gt;. Whether you’re working with consumer drones, professional camera equipment, or IP security cameras, these methods should help you get your RTSP streams flowing into Red5 for further distribution and processing.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Join the Red5 MOQ Beta to Test Streaming with Your Application</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Mon, 20 Apr 2026 15:08:23 +0000</pubDate>
      <link>https://dev.to/red5/join-the-red5-moq-beta-to-test-streaming-with-your-application-4bl7</link>
      <guid>https://dev.to/red5/join-the-red5-moq-beta-to-test-streaming-with-your-application-4bl7</guid>
      <description>&lt;p&gt;The &lt;a href="https://www.red5.net/media-over-quic-moq/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Join%20the%20Red5%20MOQ%20Beta%20to%20Test%20Streaming%20with%20Your%20Application" rel="noopener noreferrer"&gt;Red5 MOQ beta&lt;/a&gt; gives you early access to one of the first production-ready Media over QUIC streaming solutions at global scale. In this post, we explain what the beta includes, how it works, and how you can get access. You will also learn what to expect during onboarding and how to start testing with your own application.&lt;/p&gt;

&lt;h2&gt;
  
  
  About Red5 MOQ beta
&lt;/h2&gt;

&lt;p&gt;Red5 Cloud MOQ beta is one of the first end-to-end production-ready MOQ streaming solutions that operates at global scale. In addition to supporting multiple ingest protocols such as WHIP, SRT, RTMP, and Zixi, delivering across &lt;a href="https://www.cachefly.com/" rel="noopener noreferrer"&gt;CacheFly&lt;/a&gt;’s global network, and providing our MOQ-based player, it makes it easy to deploy real-time streaming without managing infrastructure, scaling, monitoring, or performance optimization. This solution gives organizations and businesses a complete, flexible, fully managed architecture they can use today without being locked into a single protocol.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F50au0x9afyngk85vnvbl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F50au0x9afyngk85vnvbl.png" alt="Red5 MOQ beta with CacheFly CDN deployment diagram" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Red5 MOQ beta with CacheFly CDN deployment diagram.&lt;/p&gt;

&lt;p&gt;If you are attending the NAB Show, you can &lt;a href="https://www.red5.net/blog/meet-red5-at-nab-show-2026/#become-a-beta-tester-of-moq-streaming-powered-by-red5-and-cachefly?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Join%20the%20Red5%20MOQ%20Beta%20to%20Test%20Streaming%20with%20Your%20Application" rel="noopener noreferrer"&gt;visit the CacheFly booth&lt;/a&gt; &lt;strong&gt;W3129&lt;/strong&gt; to see how this works in practice and speak with our team in person.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Get Beta Access
&lt;/h2&gt;

&lt;p&gt;Since the Red5 MOQ beta is not publicly available yet, we grant access individually to make sure each setup is aligned with your use case.&lt;/p&gt;

&lt;p&gt;Here’s how to get started:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Reach out to us using &lt;a href="https://www.red5.net/contact/?text=Hi%2C%20I%E2%80%99d%20like%20to%20join%20your%20globally%20deployed%20MOQ%20network%20beta&amp;amp;utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Join%20the%20Red5%20MOQ%20Beta%20to%20Test%20Streaming%20with%20Your%20Application" rel="noopener noreferrer"&gt;this link&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Meet with our sales team to discuss your use case and confirm eligibility for the beta. Access is limited to selected teams with large-scale use cases (not solo practitioners). The beta runs for 30 days by default, with extensions available if your testing requires more time.&lt;/li&gt;
&lt;li&gt;Meet with our technical team to configure your beta access and get guidance on how to get started.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This process ensures you can test the Red5 MOQ beta in a way that reflects your real-world requirements and get the most value from early access.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>mediaoverquic</category>
      <category>news</category>
      <category>softwaredevelopment</category>
    </item>
    <item>
      <title>Live Streaming From Space: The Infrastructure Challenges Behind Live Video Beyond Earth</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Mon, 13 Apr 2026 18:28:31 +0000</pubDate>
      <link>https://dev.to/red5/live-streaming-from-space-the-infrastructure-challenges-behind-live-video-beyond-earth-3oj7</link>
      <guid>https://dev.to/red5/live-streaming-from-space-the-infrastructure-challenges-behind-live-video-beyond-earth-3oj7</guid>
      <description>&lt;p&gt;Space, the final frontier in live video streaming. Today I want to discuss what it takes to deliver reliable live streaming from space, from early orbital broadcasts to upcoming lunar missions and beyond. We will break down the technical, operational, and viewer experience challenges behind delivering a live feed from space at scale.&lt;/p&gt;

&lt;h2&gt;
  
  
  From the Moon to Millions: NASA’s Streaming Vision
&lt;/h2&gt;

&lt;p&gt;Back in December I had the privilege of attending a super cool presentation at the &lt;a href="https://events.sportsvideo.org/2025-svg-summit/" rel="noopener noreferrer"&gt;SVG Summit 2025&lt;/a&gt;, “Live Streaming from the Moon: From Sports to Space with NASA+,” with &lt;a href="https://www.linkedin.com/in/leeerickson/" rel="noopener noreferrer"&gt;Lee Erikson&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/rebecca-sirmons-3670591a/" rel="noopener noreferrer"&gt;Rebecca Sirmons&lt;/a&gt;. What they covered sounded a bit like science fiction, but they made it clear plans are in the works for a massive live streaming event from space. &lt;/p&gt;

&lt;p&gt;From the talk I learned that &lt;a href="https://www.nasa.gov/" rel="noopener noreferrer"&gt;NASA&lt;/a&gt; is opening up their live feed for anyone and everyone who wants to use it to create their own live viewing experiences. I think the possibilities of this are super exciting, meaning we can create some really unique experiences with their live content. Imagine synchronized viewing rooms where millions of people watch a lunar landing together with real-time telemetry overlays, mission data, and social interaction aligned to the exact video moment. &lt;/p&gt;

&lt;p&gt;You might be wondering why NASA was presenting at a Sports Video conference, since obviously space travel isn’t a sport. Rebecca and Lee made it clear that broadcasting their live events poses many of the same challenges that sports broadcasters face. That’s why they came to the conference to get feedback from us in the live sports business. &lt;/p&gt;

&lt;h2&gt;
  
  
  Artemis II Mission: What Viewers Expect vs Reality
&lt;/h2&gt;

&lt;p&gt;Artemis II marked NASA’s next major step toward returning humans to deep space, with a crewed mission that served as a dress rehearsal before future lunar landings. The mission included a full launch sequence, rollout of the rocket, a multi-day journey around the Moon, and a safe return to Earth, validating systems that will support long-duration human spaceflight beyond low Earth orbit.&lt;/p&gt;

&lt;p&gt;From a viewer experience perspective, the video delivery can be divided into three stages: the launch phase with the crew departing Earth and reaching orbit, the live video from space during the translunar flight, and the return to Earth with reentry and splashdown. The Artemis II launch was scheduled for April 1, 2026, at 6:24 PM ET and could be viewed as a live feed on NASA+, the NASA YouTube channel, and via the NASA App. Coverage was also available on TV through major networks like CBS and CNN, and via streaming platforms such as Amazon Prime, Roku, and FOX Weather. A recording of the broadcast is also available for replay.&lt;/p&gt;

&lt;p&gt;The expectations for the launch broadcast were massive, especially given how modern audiences consume live video, and how commercial space companies have pushed expectations higher. Audiences now expect cinematic quality from launches because companies like SpaceX normalized multi-camera live production with real-time telemetry overlays.&lt;/p&gt;

&lt;p&gt;Viewer feedback from Reddit and engineering communities criticized the Artemis coverage for inconsistent production quality and long stretches of filler content:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Viewers were disappointed with the production quality during both the launch and reentry phases. Feedback highlighted issues such as &lt;a href="https://www.reddit.com/r/SpaceXMasterrace/comments/1siijyk/i%5Fhave%5Fno%5Fidea%5Fhow%5Fthis%5Fhappened%5Fbut%5Fwoah%5Fwhat%5Fa/" rel="noopener noreferrer"&gt;manual camera tracking of the rocket&lt;/a&gt; that appeared slow and often out of focus, &lt;a href="https://www.reddit.com/r/SpaceXMasterrace/comments/1sa05z3/nasa%5Fbroadcast%5Fcoverage%5Fnexttonone/" rel="noopener noreferrer"&gt;excessive cuts to crowd reactions&lt;/a&gt; instead of showing the mission itself, &lt;a href="https://www.reddit.com/r/ArtemisProgram/comments/1sa1ez6/that%5Fwas%5Fa%5Fgreat%5Flaunch%5Fbut%5Fnasas%5Flaunch%5Fwebcast/" rel="noopener noreferrer"&gt;missed key moments&lt;/a&gt; like SRB separation, limited onboard camera footage, and low-quality visuals including oversaturated Orion camera feeds and lagging CG representations.&lt;/li&gt;
&lt;li&gt;Multiple users pointed out that &lt;a href="https://www.reddit.com/r/VIDEOENGINEERING/comments/1sa2zrw/yikes%5Fthe%5Fnasa%5Fartemis%5Fcoverage%5Fwas%5Fpretty%5Fbad/" rel="noopener noreferrer"&gt;feeds dropped or went black at crucial moments&lt;/a&gt;, including right before and during liftoff. This created confusion about whether issues were technical failures or production errors.&lt;/li&gt;
&lt;li&gt;Many criticized the broadcast pacing, especially during the countdown and pre-launch segments where engagement dropped due to lack of meaningful visuals or data overlays.&lt;/li&gt;
&lt;li&gt;Viewers were frustrated with delays between events and what was shown in the live stream. Several users pointed out noticeable lag in the live feed compared to real-time expectations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, the expectation was clear. Audiences wanted a true live experience with synchronized data, minimal delay, and a compelling broadcast that felt modern. Today, people expect high-quality video similar to what they see in video on demand replays of live broadcasts, even when the live video is coming from space. This has become the &lt;a href="https://www.red5.net/blog/streaming-at-the-speed-of-thought/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Live%20Streaming%20From%20Space%3A%20The%20Infrastructure%20Challenges%20Behind%20Live%20Video%20Beyond%20Earth" rel="noopener noreferrer"&gt;standard expectation for all live content&lt;/a&gt;, regardless of where it originates. However, live streaming, especially from space, is fundamentally different from streaming on Earth. It introduces a set of unique challenges that I will explain next.&lt;/p&gt;

&lt;h2&gt;
  
  
  Engineering Reality: Streaming Beyond Earth
&lt;/h2&gt;

&lt;p&gt;Historically, live video from orbit has already proven technically feasible. Systems like the &lt;a href="https://eol.jsc.nasa.gov/esrs/hdev/" rel="noopener noreferrer"&gt;ISS High Definition Earth Viewing cameras&lt;/a&gt; streamed live footage from space using commercial camera hardware, showing that consumer-grade technology can operate in orbit with proper engineering. &lt;/p&gt;

&lt;p&gt;SpaceX regularly live streams from low orbit in their rocket launches with multi-camera setups, onboard live video feeds, and real-time telemetry overlays that deliver a polished broadcast experience to viewers on Earth.&lt;/p&gt;

&lt;p&gt;However, future lunar missions introduce a different scale of challenge compared to low Earth orbit. &lt;a href="https://www.red5.net/blog/what-causes-video-streaming-delay-and-how-to-fix-it/#2-network-routing?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Live%20Streaming%20From%20Space%3A%20The%20Infrastructure%20Challenges%20Behind%20Live%20Video%20Beyond%20Earth" rel="noopener noreferrer"&gt;Latency increases dramatically due to distance&lt;/a&gt;, transmission windows are constrained, and relay satellites become part of the architecture. Bandwidth is limited because astronaut safety data always has priority. Hardware has to survive radiation and extreme environments. Certification cycles can take years, so systems often launch with technology that is already a decade old. That changes assumptions about synchronization and interactivity.&lt;/p&gt;

&lt;p&gt;One of the biggest architectural differences compared to terrestrial streaming is that space video systems must operate in intermittently connected environments. Unlike Earth networks where persistent connectivity is assumed, spacecraft often rely on scheduled transmission windows through relay satellites. That means buffering strategies, forward error correction, and delay-tolerant networking concepts become part of the video delivery stack. In many cases, reliability matters more than immediacy, which forces engineers to rethink traditional assumptions about latency optimization.&lt;/p&gt;

&lt;p&gt;To overcome those limitations, the industry is starting to explore entirely different transmission technologies. One emerging factor is optical communications: space agencies are investing heavily in laser-based transmission systems to increase bandwidth between spacecraft and Earth. We at Red5 hold two &lt;a href="https://www.red5.net/patents/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Live%20Streaming%20From%20Space%3A%20The%20Infrastructure%20Challenges%20Behind%20Live%20Video%20Beyond%20Earth" rel="noopener noreferrer"&gt;patents&lt;/a&gt; related to extraterrestrial streaming. The core idea involves transmitting video over long-distance optical links, such as line-of-sight laser communication, instead of traditional radio frequencies. This approach can dramatically increase bandwidth efficiency while reducing interference, which becomes critical when you are dealing with massive distances and constrained transmission windows.&lt;/p&gt;

&lt;p&gt;Another interesting challenge is compression efficiency. When line-of-sight laser transmission is blocked, bandwidth is scarce, or other circumstances limit transmission, every bit matters. Advances in codecs, adaptive bitrate strategies, and edge processing will play a major role in making high-quality video feasible beyond Earth orbit. There is also growing interest in performing AI-assisted processing at the edge, for example prioritizing regions of interest or dynamically adjusting quality based on mission context before transmission.&lt;/p&gt;
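&lt;p&gt;As a toy illustration of that region-of-interest idea (the region names and weights are invented for this example, not output from any real model), a scarce bit budget can be divided across regions by importance before transmission:&lt;/p&gt;

```python
def allocate_bits(regions, total_kbps):
    """Split a scarce bit budget across labeled regions by interest weight.

    `regions` is a list of (name, weight) pairs produced upstream, e.g. by
    an AI model that scores regions of interest before transmission.
    """
    total_weight = sum(weight for _, weight in regions)
    return {name: round(total_kbps * weight / total_weight)
            for name, weight in regions}

# A frame where the crew area matters three times as much as the backdrop:
budget = allocate_bits([("crew", 3), ("background", 1)], total_kbps=400)
# budget == {"crew": 300, "background": 100}
```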

&lt;p&gt;To make matters even more difficult, the speed of light itself limits transmission across tremendous distances. A one-way transmission from Mars, for example, takes anywhere from about 3 to 22 minutes depending on the planets’ positions, so real-time communication isn’t possible. This exact scenario was the fodder for last year’s &lt;a href="https://www.linkedin.com/posts/thechrisallen%5Fintroducing-red5-quantumstream-streaming-activity-7312851407974326273-v8sU/" rel="noopener noreferrer"&gt;April Fool’s joke&lt;/a&gt;, where we claimed to create negative latency streaming that indeed would have made real-time communication to and from Mars possible. You’d be surprised at how many people actually wanted access to that beta. &lt;/p&gt;
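&lt;p&gt;The floor on that delay is simple physics: distance divided by the speed of light, before any processing or relay hops are added. A quick check in Python (distances are approximate published figures):&lt;/p&gt;

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_s(distance_km: float) -> float:
    """Minimum one-way signal delay over a given distance."""
    return distance_km / C_KM_PER_S

moon = one_way_delay_s(384_400)               # ~1.3 s
mars_closest = one_way_delay_s(54_600_000)    # ~3 minutes
mars_farthest = one_way_delay_s(401_000_000)  # ~22 minutes
```

&lt;p&gt;Even at Mars’s closest approach, the physics alone rules out anything resembling interactive latency.&lt;/p&gt;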

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Live streaming from space has been technically feasible for years, but Artemis II showed how today’s NASA live streaming infrastructure actually performs at scale. While the experience did not always meet viewer expectations due to delays and production limitations, it is important to recognize that streaming from space operates under fundamentally different constraints than terrestrial live streaming.&lt;/p&gt;

&lt;p&gt;At the same time, many aspects of launch broadcast production from Earth can already be improved using modern tooling. &lt;a href="https://www.red5.net/blog/ai-in-live-streaming/#what-use-cases-can-benefit-from-using-ai-powered-capabilities-in-live-streaming?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Live%20Streaming%20From%20Space%3A%20The%20Infrastructure%20Challenges%20Behind%20Live%20Video%20Beyond%20Earth" rel="noopener noreferrer"&gt;AI-powered features&lt;/a&gt; such as automated object tracking for rocket launches, real-time transcription and translation, and moderation of user-generated content based on predefined rules can significantly enhance the viewing experience. Engagement can also be improved with chat overlays powered by &lt;a href="https://www.red5.net/blog/red5-cloud-integrates-pubnub-to-deliver-interactivity-intelligence-global-scalability-for-real-time-streaming/#what-this-integration-makes-possible?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Live%20Streaming%20From%20Space%3A%20The%20Infrastructure%20Challenges%20Behind%20Live%20Video%20Beyond%20Earth" rel="noopener noreferrer"&gt;real-time data streaming&lt;/a&gt;, as well as tighter synchronization of telemetry data, rocket trajectory, and weather conditions with live video.&lt;/p&gt;

&lt;p&gt;As extraterrestrial streaming evolves, combining these production advancements with space-grade infrastructure will help close the gap between what is technically possible and what audiences expect from a modern live broadcast from space.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Ad Insertion for MOQ (Media over QUIC): What Is Possible Today and What Comes Next</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Thu, 02 Apr 2026 13:21:26 +0000</pubDate>
      <link>https://dev.to/red5/ad-insertion-for-moq-media-over-quic-what-is-possible-today-and-what-comes-next-102j</link>
      <guid>https://dev.to/red5/ad-insertion-for-moq-media-over-quic-what-is-possible-today-and-what-comes-next-102j</guid>
      <description>&lt;p&gt;To continue my coverage of all things &lt;a href="https://www.red5.net/media-over-quic-moq/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;MOQ&lt;/a&gt;, today I want to touch on something that keeps coming up in MOQ discussions: ad insertion.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/blog/what-is-ultra-low-latency-why-does-it-matter/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Ultra-low latency&lt;/a&gt; delivery is only part of the equation. For MOQ to be viable in real production environments, monetization has to work too. That means clean ad signaling, measurable delivery, and architectures that scale without breaking synchronization.&lt;/p&gt;

&lt;h2&gt;
  
  
  IETF MOQ interim at Google’s Boulder campus
&lt;/h2&gt;

&lt;p&gt;This was one of the topics at the recent IETF MOQ interim at Google’s Boulder campus in February 2026. Watch the video below, where I discuss this event with my teammate &lt;a href="https://www.red5.net/author/paul-gregoire/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Paul Gregoire&lt;/a&gt;, Red5 Solutions Architect, who attended it. &lt;/p&gt;

&lt;p&gt;To better understand how these concepts apply in practice, here are the key definitions used in ad insertion workflows for MOQ:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Server-Side Ad Insertion&lt;/strong&gt; (SSAI): ads are stitched directly into the video stream on the server before the stream is delivered to viewers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Server-Guided Ad Insertion&lt;/strong&gt; (SGAI): the server signals ad opportunities and decides when and which ads to request and play.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Client-Side Ad Insertion&lt;/strong&gt; (CSAI): the video player on the viewer’s device requests, loads, and inserts ads during playback instead of receiving a prestitched stream.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Regional blackout&lt;/strong&gt;: a restriction that blocks or replaces live content for viewers in certain geographic areas due to licensing or local broadcast rights.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/gwendalsimon/" rel="noopener noreferrer"&gt;Gwendal Simon&lt;/a&gt; from &lt;a href="https://www.synamedia.com/" rel="noopener noreferrer"&gt;Synamedia&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/wilaw/" rel="noopener noreferrer"&gt;Will Law&lt;/a&gt; from &lt;a href="https://www.akamai.com/" rel="noopener noreferrer"&gt;Akamai Technologies&lt;/a&gt; presented how MOQ and the MOQ Transport Streaming Format (MSF) can carry SCTE-35 signaling for Server-Guided Ad Insertion (SGAI) as well as other control scenarios such as regional blackout enforcement.&lt;/p&gt;

&lt;p&gt;Figuring out how content works with advertising and blackout use cases in a MOQ environment is critical for the technology to support real-world media and entertainment applications. Because of that, the topic is receiving significant attention across the MOQ community. The problem is not fully solved yet, but progress is moving quickly as new approaches and demonstrations continue to emerge.&lt;/p&gt;

&lt;p&gt;Gwendal and Will presented what is possible today: a working architecture where ad decisioning systems publish SCTE-35 signaling as structured events on a dedicated Event Timeline track, while media and advertising streams remain separate and independently distributed over MOQ. Subscribers can follow these signals in real-time to switch between media and ad tracks, fetch ad content dynamically using identifiers such as MOQ URLs, and return to the primary program stream at the correct playback moment. &lt;/p&gt;
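&lt;p&gt;Conceptually, a subscriber following such an event timeline needs only splice times to decide which track to render at any given moment. Here is a deliberately simplified sketch of that control flow (this is not the actual MSF wire format or the SCTE-35 binary encoding, just the idea):&lt;/p&gt;

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SpliceEvent:
    """Simplified stand-in for an SCTE-35 cue on an event timeline track."""
    start: float     # presentation time where the ad break begins
    duration: float  # length of the break in seconds
    ad_track: str    # identifier of the ad track, e.g. a MOQ URL

def active_track(events: List[SpliceEvent], now: float,
                 program_track: str = "program") -> str:
    """Decide which track a subscriber should render at time `now`."""
    for ev in events:
        if ev.start <= now < ev.start + ev.duration:
            return ev.ad_track
    # Outside every signaled break, stay on the primary program stream.
    return program_track
```

&lt;p&gt;Because the signaling rides on its own track, the player can fetch ad content on demand and rejoin the program at the correct playback moment without the server ever stitching the streams together.&lt;/p&gt;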

&lt;p&gt;Learn more from the files they presented at the event.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/wp-content/uploads/2026/04/Ad-Insertion-in-MOQ-Interim-Meeting-Boulder.pdf?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Ad Insertion in MOQ — Interim Meeting Boulder&lt;/a&gt;&lt;a href="https://www.red5.net/wp-content/uploads/2026/04/Ad-Insertion-in-MOQ-Interim-Meeting-Boulder.pdf?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Download&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/wp-content/uploads/2026/04/SGAI-Over-MOQ%5F-SCTE35-Based-Event-Timeline-Type-Definition.pdf?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;SGAI Over MOQ_ SCTE35-Based Event Timeline Type Definition&lt;/a&gt;&lt;a href="https://www.red5.net/wp-content/uploads/2026/04/SGAI-Over-MOQ%5F-SCTE35-Based-Event-Timeline-Type-Definition.pdf?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Download&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The approach demonstrates how existing broadcast signaling models can operate over MOQ without embedding cues directly inside the video stream, allowing signaling and media delivery to scale independently while preserving the timing relationships needed for live playback. &lt;/p&gt;

&lt;h2&gt;
  
  
  Does Red5 Support Ad Insertion for MOQ?
&lt;/h2&gt;

&lt;p&gt;We are actively working on this area with partners like &lt;a href="https://showfer.com" rel="noopener noreferrer"&gt;Showfer Media&lt;/a&gt;, integrating ad workflows into real-time streaming pipelines so MOQ can move from experimental to production-ready systems. If you are interested in learning more, visit us at the &lt;a href="https://www.red5.net/blog/meet-red5-at-nab-show-2026/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;NAB Show 2026&lt;/a&gt;, where we will showcase MOQ and &lt;a href="https://www.red5.net/truetime/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;TrueTime Solutions™&lt;/a&gt; in action at the Nomad Media booth W2357 and AWS booth W1701. 
Reach out to us &lt;a href="https://www.red5.net/contact/?text=Hi,%20I%20would%20like%20to%20meet%20you%20at%20the%20NAB%20Show%202026&amp;amp;utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;here&lt;/a&gt; to schedule a meeting at NAB. Our schedule is filling up quickly, so it is best to plan ahead.&lt;/p&gt;

&lt;h2&gt;
  
  
  Become a Beta Tester of MOQ Streaming Powered by Red5 and CacheFly
&lt;/h2&gt;

&lt;p&gt;Beyond the live demos, we will introduce how &lt;a href="https://www.red5.net/red5-cloud-low-latency-live-streaming-platform/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;Red5 Cloud&lt;/a&gt; will deliver MOQ at scale through our partnership with &lt;a href="https://www.red5.net/partners/cachefly/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;CacheFly&lt;/a&gt;. In this workflow, live streams are ingested into Red5, processed through our video packaging layer, and delivered via CacheFly using either MOQ or HTTP-based protocols like &lt;a href="https://www.red5.net/hls-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;HLS&lt;/a&gt; and LL-HLS. This gives customers flexibility. You can deliver ultra-low latency streams with MOQ where real-time performance matters most, while still supporting traditional formats for broader device compatibility. It is not about replacing one protocol with another. It is about choosing what works best for your use case.&lt;/p&gt;
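&lt;p&gt;That flexibility ultimately reduces to a per-viewer decision at delivery time. A minimal sketch of the selection logic (the protocol labels and parameters are ours for illustration; this is not Red5 Cloud’s actual API):&lt;/p&gt;

```python
def choose_delivery(supports_moq: bool, needs_realtime: bool) -> str:
    """Pick a delivery path per viewer rather than per platform."""
    if supports_moq and needs_realtime:
        return "moq"      # ultra-low latency where it matters most
    if needs_realtime:
        return "ll-hls"   # low latency on clients without MOQ support
    return "hls"          # widest device compatibility otherwise
```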

&lt;p&gt;Our teams will be recruiting beta testers at NAB for our globally deployed MOQ network, which lets Red5 Cloud users leverage the CacheFly CDN for MOQ delivery. If you want in on this early or just want some more details on how it might work for your business, reach out to us using &lt;a href="https://www.red5.net/contact/?text=Hi%2C%20I%E2%80%99d%20like%20to%20join%20your%20globally%20deployed%20MOQ%20network%20beta&amp;amp;utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;this link&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Ad insertion for MOQ is evolving quickly as the industry works to make real-time streaming monetizable without sacrificing synchronization or scale. While approaches like SCTE-35–based signaling and SGAI over MOQ already show strong potential, the ecosystem is still maturing as partners continue building production-ready workflows. With ongoing collaboration and real-world testing, MOQ is moving closer to supporting reliable, scalable ad-supported streaming. For a deeper look at how MOQ compares to existing transport approaches, read our “&lt;a href="https://www.red5.net/blog/srt-vs-moqt/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Ad%20Insertion%20for%20MOQ%20(Media%20over%20QUIC)%3A%20What%20Is%20Possible%20Today%20and%20What%20Comes%20Next" rel="noopener noreferrer"&gt;SRT vs MOQT&lt;/a&gt;” blog.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Red5 TrueTime Meetings™ for News: From the Field to Broadcast in Real-Time</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Tue, 24 Mar 2026 13:40:29 +0000</pubDate>
      <link>https://dev.to/red5/red5-truetime-meetings-for-news-from-the-field-to-broadcast-in-real-time-136n</link>
      <guid>https://dev.to/red5/red5-truetime-meetings-for-news-from-the-field-to-broadcast-in-real-time-136n</guid>
      <description>&lt;p&gt;Today I want to talk about how &lt;a href="https://www.red5.net/solutions/broadcast-news/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;news organizations&lt;/a&gt; can bring remote reporters, field video, and user-generated content into l&lt;a href="https://www.red5.net/solutions/video-streaming-for-broadcast-production/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;ive production workflows&lt;/a&gt; without losing quality or control using Red5 TrueTime Meetings™.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Zoom, FaceTime, and Teams Do Not Work Well for News Organizations
&lt;/h2&gt;

&lt;p&gt;Traditional meeting tools like Zoom, FaceTime, and Teams were never built for broadcast. They are closed systems with limited control over routing, monitoring, and integration. News teams often end up stitching together multiple tools just to get someone on air. &lt;/p&gt;

&lt;p&gt;There are also concerns about outages and centralized service failures. In &lt;a href="https://nationalcioreview.com/articles-insights/extra-bytes/zoom-back-online-after-major-outage-affects-video-calls-worldwide/" rel="noopener noreferrer"&gt;April 2025&lt;/a&gt;, a global Zoom outage generated tens of thousands of user complaints and disrupted operations for organizations that rely on it for daily communication. For live news production, even a short outage can mean losing a critical moment on air.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Red5 TrueTime Meetings™ Can Help Evolve News Broadcasting
&lt;/h2&gt;

&lt;p&gt;This is where our open source &lt;a href="https://www.red5.net/truetime/meetings/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;TrueTime Meetings&lt;/a&gt;™ becomes interesting. Watch the video below to learn more about this solution. &lt;/p&gt;

&lt;p&gt;Red5 TrueTime Meetings™ &lt;strong&gt;runs on Red5 live streaming infrastructure&lt;/strong&gt;, so participants who join through a simple browser link are not just “meeting attendees.” Their streams can function as production-ready inputs that can be routed, recorded, monitored, or integrated into newsroom workflows.&lt;/p&gt;

&lt;p&gt;Perhaps more importantly, as an &lt;strong&gt;open source project&lt;/strong&gt;, TrueTime Meetings™ was designed for developers to take the parts of the code that they need and apply them to their own applications. &lt;/p&gt;

&lt;p&gt;For news environments, that enables several practical scenarios:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Remote reporters joining live from the field through a browser or mobile device.&lt;/li&gt;
&lt;li&gt;Citizen journalists contributing video during breaking news events.&lt;/li&gt;
&lt;li&gt;Guest interviews without relying on consumer conferencing platforms.&lt;/li&gt;
&lt;li&gt;Distributed newsroom coordination across locations.&lt;/li&gt;
&lt;li&gt;Pre-production green rooms before going live.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What makes this even more powerful is how it works alongside &lt;strong&gt;contribution workflows with partners like Zixi&lt;/strong&gt;. &lt;a href="https://www.red5.net/partners/zixi/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;Zixi&lt;/a&gt; is widely used in broadcast environments for reliable contribution and transport over IP. When combined with Red5, news organizations can monitor high-fidelity contribution feeds with end-to-end latencies under 250ms while also enabling browser-based contributors through Red5 TrueTime Meetings™. That means professional camera feeds, bonded cellular transmitters, and user-generated streams can all land in the same production environment.&lt;/p&gt;

&lt;p&gt;The integration also enables capabilities that matter directly to news operations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Real-time monitoring of contribution feeds with high signal integrity.&lt;/li&gt;
&lt;li&gt;Syndicating multiple live sources into multiview layouts for production teams.&lt;/li&gt;
&lt;li&gt;Commercial slating and content replacement during live workflows.&lt;/li&gt;
&lt;li&gt;Secure ingestion and rebroadcast of user-contributed streams.&lt;/li&gt;
&lt;li&gt;Lower operational costs compared to traditional satellite or fiber workflows.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One example that stands out is breaking news coverage. A newsroom can receive professional camera feeds through contribution infrastructure while simultaneously bringing in eyewitness footage from smartphones through Red5 TrueTime Meetings™. Both sources remain synchronized and production-ready.&lt;/p&gt;

&lt;p&gt;Another important factor is &lt;strong&gt;deployment flexibility&lt;/strong&gt;. Red5 TrueTime Meetings™ can run in &lt;a href="https://www.red5.net/red5-cloud-low-latency-live-streaming-platform/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;Red5 Cloud&lt;/a&gt; for rapid deployment or on-premises with &lt;a href="https://www.red5.net/red5-pro/low-latency-streaming-software/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;Red5 Pro&lt;/a&gt; when security, compliance, or infrastructure control are required. And of course, it’s &lt;a href="https://github.com/red5pro/red5-truetime-meetings" rel="noopener noreferrer"&gt;open source&lt;/a&gt;, which allows broadcasters to customize branding and workflows instead of forcing teams into generic meeting interfaces.&lt;/p&gt;

&lt;h2&gt;
  
  
  Try Red5 TrueTime Meetings™ today
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Option 1: Download the source from GitHub&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Get the open source code, customize the front end, and deploy in your own environment. This is the best path if you want full control and deeper product integration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/red5pro/red5-truetime-meetings" rel="noopener noreferrer"&gt;View on GitHub &amp;gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Option 2: Launch it in Red5 Cloud&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Run TrueTime Meetings™ on fully managed Red5 Cloud infrastructure. This is the fastest path if you want rapid deployment without managing servers yourself.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.red5.net/signup?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;Sign Up for Red5 Cloud &amp;gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Option 3: Deploy with Red5 Pro&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Prefer a self-managed setup? Red5 Pro users can download the source and deploy TrueTime Meetings™ in their own infrastructure, then extend it for their application and workflow needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/contact/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Red5%20TrueTime%20Meetings%E2%84%A2%20for%20News%3A%20From%20the%20Field%20to%20Broadcast%20in%20Real-Time" rel="noopener noreferrer"&gt;Schedule a consultation &amp;gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What I find most compelling is how convenient it is and how &lt;strong&gt;seamlessly it integrates into existing workflows and consumer applications&lt;/strong&gt;. That is what really accelerates getting new content into the news cycle.&lt;/p&gt;

&lt;p&gt;When reporters, guests, and contributors can join a live broadcast in seconds instead of minutes, &lt;strong&gt;response time improves&lt;/strong&gt;. In news, that speed often translates directly into audience engagement and competitive differentiation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Red5 TrueTime Meetings™ helps news organizations bring remote contributors into live production workflows without sacrificing quality, speed, or operational control. Instead of relying on consumer meeting platforms, broadcasters can use Red5 TrueTime Meetings™ to ingest browser-based video as production-ready inputs that integrate directly into newsroom infrastructure. This approach allows news teams to move faster, capture breaking moments, and deliver live coverage with professional broadcast reliability.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>SS4A Grant: Proven Video Streaming Infrastructure for Safer Road Initiatives</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Wed, 18 Mar 2026 19:36:59 +0000</pubDate>
      <link>https://dev.to/red5/ss4a-grant-proven-video-streaming-infrastructure-for-safer-road-initiatives-3mim</link>
      <guid>https://dev.to/red5/ss4a-grant-proven-video-streaming-infrastructure-for-safer-road-initiatives-3mim</guid>
      <description>&lt;p&gt;SS4A grant conversations have been coming up a lot lately in discussions I’ve had with teams that have either applied for or already received funding. What stands out to me is how much opportunity there is to modernize road safety using video streaming infrastructure and real-time intelligence. Today, it is no longer enough to rely on passive monitoring. In this article, I’ll explain how the SS4A Grant Program works, what technology is required, and how Red5’s real-time intelligence can transform traffic monitoring capabilities, video surveillance, and vehicle monitoring.&lt;/p&gt;

&lt;h2&gt;
  
  
  About Safe Streets and Roads for All (SS4A) Grant Program
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://www.transportation.gov/grants/SS4A" rel="noopener noreferrer"&gt;SS4A Grant Program&lt;/a&gt; is a federal initiative led by the U.S. Department of Transportation (DOT) that helps communities reduce roadway fatalities through a system approach to safety. It funds planning, infrastructure, and operational improvements that make streets safer for all users. &lt;/p&gt;

&lt;p&gt;At its core, SS4A supports data-driven safety strategies. Communities use funding to build action plans, deploy infrastructure, and implement technologies that improve outcomes across traffic and vehicle monitoring, and video surveillance systems. The goal is not just collecting more data but enabling real-time intelligence that helps agencies act faster and prevent incidents before they escalate.&lt;/p&gt;

&lt;p&gt;To apply for SS4A funding, organizations must carefully review the &lt;a href="https://www.transportation.gov/sites/dot.gov/files/2025-03/SS4A-FY25-NOFO.pdf" rel="noopener noreferrer"&gt;NOFO&lt;/a&gt;, the Notice of Funding Opportunity. This document outlines all requirements, eligibility criteria, deadlines, and instructions needed to submit a competitive application. It is not optional reading. It is effectively the checklist that determines whether your application will be considered.&lt;/p&gt;

&lt;p&gt;The NOFO explains two main grant types: Planning and Demonstration Grants and Implementation Grants. Planning and Demonstration Grants focus on developing or updating an action plan, running pilot programs, and testing strategies. Implementation grants fund actual deployment of projects identified in an approved action plan. Applicants must choose one path per application and must meet strict requirements. These include developing a comprehensive action plan with defined components such as safety analysis, stakeholder engagement, strategy selection, and measurable outcomes.&lt;/p&gt;

&lt;p&gt;Your application must include detailed documentation, including a plan template, supporting materials, and a clear description of how your project aligns with SS4A priorities. Deadlines are fixed and must be followed precisely. &lt;/p&gt;

&lt;h2&gt;
  
  
  What Do You Need From A Technology Standpoint
&lt;/h2&gt;

&lt;p&gt;Most cities today already have cameras, sensors, speed cameras, &lt;a href="https://www.red5.net/blog/drone-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;drones&lt;/a&gt;, and connected intersections generating massive amounts of data. The issue is not data collection. It is that today’s &lt;a href="https://www.red5.net/solutions/video-streaming-for-traffic-monitoring/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;traffic monitoring systems&lt;/a&gt; are delayed, siloed, difficult to scale across districts and departments, and rarely actionable in real time. On top of that, they often lack advanced AI and vision-language model analysis of live camera feeds that could automatically surface risks, detect patterns, and turn raw video into meaningful operational insight. What cities actually need is real-time roadway intelligence. That gap is exactly where modern streaming infrastructure can make a difference.&lt;/p&gt;

&lt;p&gt;Modern SS4A projects require video streaming infrastructure that can ingest live feeds from video surveillance systems, drone operations, and mobile devices without replacing existing infrastructure. Video must be processed in real time, run through AI models, and turned into insights that reach operators instantly.&lt;/p&gt;

&lt;p&gt;With the right approach, agencies can detect license plates, identify anomalies, and generate real-time threat intelligence from live feeds. Instead of reviewing footage after an incident, teams can act while events are unfolding.&lt;/p&gt;
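&lt;p&gt;As an illustration of acting while events unfold (the detection schema, labels, and threshold here are invented for this sketch, not a Red5 API), the job amounts to triaging per-frame model output into alerts worth an operator’s attention:&lt;/p&gt;

```python
def triage_detections(detections, watchlist, anomaly_threshold=0.8):
    """Turn raw per-frame detections into operator alerts.

    `detections` mimics typical vision-model output: dicts with a label
    plus either recognized text (plates) or a confidence score (anomalies).
    """
    alerts = []
    for det in detections:
        if det["label"] == "license_plate" and det.get("text") in watchlist:
            alerts.append(("watchlist_hit", det["text"]))
        elif det["label"] == "anomaly" and det.get("score", 0) >= anomaly_threshold:
            alerts.append(("anomaly", round(det["score"], 2)))
    return alerts
```

&lt;p&gt;The point is the timing: this filtering runs on the live stream, so the alert reaches a dispatcher while the event is still in progress rather than during a post-incident review.&lt;/p&gt;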

&lt;p&gt;This shift enables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Faster detection of accidents, congestion, and risks involving vulnerable road users.&lt;/li&gt;
&lt;li&gt;Real-time intelligence across departments using shared live video.&lt;/li&gt;
&lt;li&gt;Better coordination between emergency responders and traffic operators.&lt;/li&gt;
&lt;li&gt;Scalable vehicle monitoring across jurisdictions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without this foundation, SS4A-funded projects risk becoming another layer of disconnected systems rather than a unified safety solution.&lt;/p&gt;
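&lt;p&gt;The ingest-process-alert loop described above can be sketched in a few lines. This is a minimal, illustrative simulation, not part of any specific traffic-monitoring product: the event types, confidence threshold, and camera names are all hypothetical.&lt;/p&gt;

```python
# Minimal sketch of the ingest -> analyze -> alert loop. Event types,
# thresholds, and camera names are illustrative assumptions only.

def analyze_frame_events(events, alert_types=("accident", "wrong_way", "pedestrian_in_roadway")):
    """Return the subset of detection events that should trigger a real-time alert."""
    alerts = []
    for event in events:
        # Only high-confidence detections of alert-worthy types page an operator.
        if event["type"] in alert_types and event["confidence"] >= 0.8:
            alerts.append({"camera": event["camera"], "type": event["type"]})
    return alerts

# Simulated AI detections from one batch of live frames.
detections = [
    {"camera": "I-5-NB-12", "type": "congestion", "confidence": 0.95},
    {"camera": "I-5-NB-14", "type": "accident", "confidence": 0.91},
    {"camera": "Main-St-03", "type": "pedestrian_in_roadway", "confidence": 0.62},
]

print(analyze_frame_events(detections))
```

&lt;p&gt;In a real deployment, the detection list would come from an AI model watching live frames, and the alert list would be pushed to dispatchers while the event is still unfolding.&lt;/p&gt;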

&lt;h2&gt;
  
  
  How Red5 Can Help
&lt;/h2&gt;

&lt;p&gt;We provide &lt;a href="https://www.red5.net/products/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;video streaming infrastructure&lt;/a&gt; designed for real-time intelligence at scale. The platform ingests video from traffic cameras, drones, and even mobile devices using &lt;a href="https://www.red5.net/video-streaming-technology/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;standard protocols&lt;/a&gt; without replacing existing infrastructure, processes it in the cloud with AI (to detect and alert on anomalies), and delivers sub-second live video to operators and agencies that need to act immediately.&lt;/p&gt;

&lt;p&gt;When you move from delayed analytics to live situational awareness, several things change:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Risks can be detected sooner, including accidents, congestion, wildfire events, dangerous intersections, and hazards to vulnerable road users.&lt;/li&gt;
&lt;li&gt;Emergency response coordination improves because everyone sees the same live view, stolen vehicles can be identified in real time, and incidents can be verified faster before dispatch.&lt;/li&gt;
&lt;li&gt;Cross-agency collaboration becomes practical instead of theoretical.&lt;/li&gt;
&lt;li&gt;Safety outcomes can actually be measured for programs like the U.S. Department of Transportation’s Safe Streets and Roads for All initiative.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One thing that stands out about Red5’s approach is that it does not require ripping out existing infrastructure. Cities can use the cameras they already have and turn their footage into something operationally meaningful.&lt;/p&gt;

&lt;h2&gt;
  
  
  Our Customers’ Success Stories
&lt;/h2&gt;

&lt;p&gt;We have seen this in real-world deployments. &lt;/p&gt;

&lt;p&gt;For example, Red5 supported a &lt;a href="https://www.red5.net/case-studies/real-time-drone-streaming-in-san-diego/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;real-time drone streaming solution&lt;/a&gt; for the &lt;a href="https://www.sdsheriff.gov/" rel="noopener noreferrer"&gt;San Diego County Sheriff’s Department&lt;/a&gt;, enabling faster response during critical incidents. &lt;/p&gt;

&lt;p&gt;In another case, Red5 worked with &lt;a href="https://www.red5.net/partners/nomad/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;Nomad Media&lt;/a&gt; to power a &lt;a href="https://www.red5.net/case-studies/caltrans-district-7-reduced-video-surveillance-latency-by-29-seconds-using-red5/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;traffic monitoring solution&lt;/a&gt; for &lt;a href="https://dot.ca.gov/caltrans-near-me/district-7" rel="noopener noreferrer"&gt;Caltrans District 7&lt;/a&gt;, where sub-250ms latency video improves decision-making during fires, disasters, and large-scale incidents. We provide a unified interface for viewing and sharing live and recorded roadway video across agencies including the &lt;a href="https://www.chp.ca.gov/" rel="noopener noreferrer"&gt;California Highway Patrol&lt;/a&gt;. Faster visibility leads to faster decisions, and faster decisions save lives.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;SS4A Grant initiatives are pushing cities to rethink how they use data, moving from passive monitoring to real-time intelligence powered by video streaming infrastructure. Today, the gap between data collection and action is the biggest challenge in road safety. By adopting modern traffic monitoring software, video surveillance, and vehicle monitoring solutions, agencies can turn SS4A funding into measurable safety outcomes. If your organization has applied for or received SS4A funding, feel free to &lt;a href="https://www.red5.net/contact/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SS4A%20Grant%3A%20Proven%20Video%20Streaming%20Infrastructure%20for%20Safer%20Road%20Initiatives" rel="noopener noreferrer"&gt;reach out to us&lt;/a&gt;. &lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>What Can Real-Time In-Stadium Streaming Do for Live Sports Broadcasting and Event Production in 2026?</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Mon, 16 Mar 2026 18:00:30 +0000</pubDate>
      <link>https://dev.to/red5/what-can-real-time-in-stadium-streaming-do-for-live-sports-broadcasting-and-event-production-in-3hnd</link>
      <guid>https://dev.to/red5/what-can-real-time-in-stadium-streaming-do-for-live-sports-broadcasting-and-event-production-in-3hnd</guid>
      <description>&lt;p&gt;There’s a quiet revolution happening in stadiums and arenas. And it’s not just about 5G or giant LED screens. It’s about real-time video, interactive features, and creating unforgettable in-venue experiences that extend far beyond the walls of the stadium. In this study, you’ll learn what in-stadium streaming can do for live sports broadcasting and event production in large venues, why it is becoming a game-changer for both fans and businesses, and what innovative solutions Red5 and partners provide.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Is In-Stadium Streaming?
&lt;/h2&gt;

&lt;p&gt;At Red5, we’ve been working on revolutionizing &lt;a href="https://www.red5.net/solutions/stadium-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;stadium experiences&lt;/a&gt; with the following partners. &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Together, Red5 Pro‘s real-time streaming server software and &lt;a href="https://www.red5.net/partners/amino/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;Amino&lt;/a&gt;’s media players and device management capabilities power &lt;a href="https://www.red5.net/case-studies/red5-amino-transform-real-time-streaming-with-media-players/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;on-premise streaming&lt;/a&gt; for large-scale venues like stadiums. We are currently working on a large-scale pilot with a customer in Mexico that uses this setup to stream live sports to TVs across stadiums with synchronized, low-latency video on 15–20 screens in view. It scales to thousands of endpoints without heavy infrastructure, while reducing operational costs.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.red5.net/partners/the-famous-group/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;The Famous Group&lt;/a&gt;, on the other hand, is flipping the camera around by &lt;a href="https://www.red5.net/case-studies/the-famous-group-modernized-vixi-suite-fan-streaming-with-red5-at-large-scale-venues/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;streaming fan content from phones&lt;/a&gt; directly to those same digital signs in real time. Think live shoutouts, dance cams, and on-the-fly interactions that make the crowd part of the show.&lt;/li&gt;
&lt;li&gt;Another exciting partnership we’re working on is combining on-prem deployment with &lt;a href="https://www.red5.net/partners/osprey/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;Osprey&lt;/a&gt; encoders. With the frame-level encoding they provide and Red5 deployed on a server on-site, we’ve seen glass-to-glass latencies as low as 60ms. That matches (and often beats) the speed of sound, depending on where you are situated in the stadium. So the video on the screen matches what your ears hear, something that has been sorely missing at concerts and games alike. This is particularly important for live concerts, where lip-sync between the video playback and the sound in the arena has to be dialed in.&lt;/li&gt;
&lt;/ol&gt;
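&lt;p&gt;The speed-of-sound comparison above can be made concrete with a quick back-of-the-envelope calculation, using roughly 343 m/s for sound in air and the 60 ms glass-to-glass figure quoted above:&lt;/p&gt;

```python
# How far does stadium audio travel during a 60 ms glass-to-glass video delay?
# Speed of sound in air at about 20 C is roughly 343 m/s.
SPEED_OF_SOUND_M_PER_S = 343.0

def sound_travel_distance(latency_ms):
    """Distance sound covers in the given latency window, in meters."""
    return SPEED_OF_SOUND_M_PER_S * latency_ms / 1000.0

# A fan about 20.6 m from the speakers hears the live audio at the same
# moment a 60 ms video feed updates; anyone farther away sees the screen
# update before the sound arrives.
print(round(sound_travel_distance(60), 1))  # -> 20.6
```

&lt;p&gt;In other words, at 60 ms the video screens are effectively in sync with the PA system for most of the seating bowl, which is why lip-sync finally holds up.&lt;/p&gt;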

&lt;h3&gt;
  
  
  AI-Powered Capabilities Red5 Delivers
&lt;/h3&gt;

&lt;p&gt;Here are some of the AI-powered capabilities Red5 delivers for enhancing sports and event in-stadium streaming experiences.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frame-level detection:&lt;/strong&gt; Extracting video frames in under a second and handing them off to AI models. This makes it possible to flag large crowd gatherings and enable routing at mass events to ensure safety, track player movements in real time, and detect unauthorized access to restricted areas in a stadium.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio intelligence:&lt;/strong&gt; Using models like NVIDIA’s Parakeet for live transcription, translation, and generating searchable metadata from conversations or broadcasts. Imagine live sports commentators’ play-by-play calls being instantly transcribed and translated, or concert lyrics synced in real time for fans across multiple languages.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Smart search:&lt;/strong&gt; Generate searchable metadata from video or audio content and automatically tag key events, such as goals and penalties in sports, so they can be quickly located in the recording.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Custom advertising:&lt;/strong&gt; Align ads with the tone of the content to ensure higher relevancy and conversion rates. For example, you could display a sportswear commercial right after a major goal replay, or showcase upcoming tour dates immediately following a concert highlight.&lt;/li&gt;
&lt;/ul&gt;
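&lt;p&gt;The smart-search capability above boils down to emitting timestamped tags alongside the recording so key moments can be looked up instantly. A toy sketch of that index (the event names and data shapes here are illustrative, not a Red5 API):&lt;/p&gt;

```python
# Toy sketch of searchable metadata: map each AI-generated tag to the
# stream timestamps (in seconds) where it occurred in the recording.

def build_search_index(tagged_events):
    """Group (timestamp, tag) pairs into a tag -> list-of-timestamps index."""
    index = {}
    for timestamp, tag in tagged_events:
        index.setdefault(tag, []).append(timestamp)
    return index

# Events an AI model might emit while watching a match (illustrative).
events = [(312.4, "goal"), (1150.0, "penalty"), (2710.8, "goal")]
index = build_search_index(events)
print(index["goal"])  # every moment tagged "goal", ready for replay lookup
```

&lt;p&gt;A production system would persist this index next to the recording, so an operator can jump straight to every goal or penalty without scrubbing through hours of footage.&lt;/p&gt;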

&lt;h3&gt;
  
  
  How These Solutions Benefit Both Fans and Businesses
&lt;/h3&gt;

&lt;p&gt;What excites me most about real-time stadium streaming is the value it unlocks for fans and businesses alike:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enhance &lt;a href="https://www.red5.net/solutions/video-streaming-for-fan-engagement/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;fan engagement&lt;/a&gt; with multi-view angles that let attendees follow the game their way.&lt;/li&gt;
&lt;li&gt;Expand your audience and drive customer loyalty by adding new high-value capabilities. Deliver broadcast-quality video and perfectly synced audio with the live action. Offer personalized in-app audio with selectable commentators and languages.&lt;/li&gt;
&lt;li&gt;Monetize and earn more revenue through interactive overlays and &lt;a href="https://www.red5.net/whitepapers/red5-server-side-ad-insertion/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;targeted ads&lt;/a&gt; tailored to each fan’s location and profile.&lt;/li&gt;
&lt;li&gt;Reduce infrastructure costs by deploying on prem, on your own cloud account, or using &lt;a href="https://www.red5.net/red5-cloud-low-latency-live-streaming-platform/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;Red5 Cloud&lt;/a&gt;’s Pay-As-You-Grow service.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While &lt;a href="https://www.red5.net/solutions/sports-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;sports&lt;/a&gt; and music use cases are some of the most obvious, fan festivals, amusement parks, eSports &lt;a href="https://www.red5.net/case-studies/red5-cloud-zixi-real-time-monitoring-and-streaming-solution-for-live-event-production/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;events&lt;/a&gt;, and other use cases could benefit from this kind of infrastructure. And here’s where it gets even more exciting: this isn’t just for the world’s most connected stadiums.&lt;/p&gt;

&lt;p&gt;In many emerging markets, connectivity inside venues still lags far behind. Public internet can be slow or unreliable, and building centralized infrastructure often feels out of reach. But it does not have to be. By deploying Red5 on-prem, paired with private 5G or dedicated Wi-Fi networks, venues can bypass the limitations of external connectivity altogether. You get the benefits of real-time streaming, synchronized playback, and interactive experiences, all running locally on your own hardware.&lt;/p&gt;

&lt;p&gt;This approach is already proving effective in places like Mexico, where teams are using this model to deliver broadcast-quality experiences without needing massive cloud infrastructure or robust public internet access. It’s a way for stadiums in under-connected regions to leap ahead, not wait to catch up.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;It feels like the time is finally right, the networks are ready, and the tech is here for stadiums. And the demand for immersive, interactive, real-time experiences is only growing.&lt;/p&gt;

&lt;p&gt;In-stadium streaming is redefining how fans experience live events and how venues unlock new revenue opportunities. From synchronized, &lt;a href="https://www.red5.net/blog/what-is-ultra-low-latency-why-does-it-matter/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=What%20Can%20Real-Time%20In-Stadium%20Streaming%20Do%20for%20Live%20Sports%20Broadcasting%20and%20Event%20Production%20in%202026%3F" rel="noopener noreferrer"&gt;ultra-low latency&lt;/a&gt; video to interactive tools like kiss cams and fan shoutouts, the technology enhances engagement while lowering infrastructure costs. Whether in the world’s most connected arenas or in emerging markets, Red5 and its partners are proving that the future of live event broadcasting is already here.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>TrueTime Meetings™: Open Source Video Calling Built for Real-Time Streaming</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Fri, 13 Mar 2026 13:50:57 +0000</pubDate>
      <link>https://dev.to/red5/truetime-meetings-video-calling-built-for-real-time-streaming-11p4</link>
      <guid>https://dev.to/red5/truetime-meetings-video-calling-built-for-real-time-streaming-11p4</guid>
      <description>&lt;p&gt;The world has reached a turning point in internet video engagement where the ubiquitous need for instant access to interactive audio/video communications can’t be satisfied by the usual approaches to video conferencing. &lt;/p&gt;

&lt;p&gt;Good as they might be at supporting traditional video conferencing (and even there they leave a lot to be desired), the likes of Google Meet, Zoom, Microsoft Teams and Cisco Webex fall far short of what’s required as a new era in live video streaming takes hold. To put it bluntly, they’re simply not designed to accommodate a world where spontaneous face-to-face social engagement is becoming an intrinsic component of live streaming across the consumer, enterprise and institutional landscapes. See our latest &lt;a href="https://www.red5.net/whitepapers/trends-in-streamed-video-calling/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;white paper&lt;/a&gt; for an examination of the major trends driving the need for a new video communications paradigm.&lt;/p&gt;

&lt;p&gt;Of course, the more ubiquitous demand for infrastructure supporting such capabilities becomes, the more needs to be done to ensure that demand is met. Which is why, along with platforms like Red5’s Experience Delivery Network (&lt;a href="https://www.red5.net/blog/xdn-architecture-traditional-cdns-need-not-apply/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;XDN&lt;/a&gt;) Architecture that heavily rely on WebRTC and its enhancements for current transport needs, there’s an industry-wide standard informally known as &lt;a href="https://www.red5.net/media-over-quic-moq/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;MOQ&lt;/a&gt; nearing completion under the guidance of the Internet Engineering Task Force. With widescale industry engagement including full support from Red5, MOQ is set to drive faster emergence of a global marketplace connected by real-time interactive streaming. &lt;/p&gt;

&lt;p&gt;As next-gen streaming infrastructures take hold, it’s clear that from now on the live mass-market streaming and video calling components must operate as a single piece rather than remain locked in the bifurcation that persists today. This requires a video meeting platform built on a global real-time streaming architecture that can support point-and-click activation of fully synchronized, unscheduled, interactive A/V communications in any use case, reaching any number of users at 250ms or lower end-to-end latencies. &lt;/p&gt;

&lt;p&gt;We at Red5 met this challenge by introducing a set of &lt;a href="https://www.red5.net/open-source-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;open-source software&lt;/a&gt; modules comprising the &lt;a href="https://www.red5.net/truetime/meetings/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;TrueTime Meetings&lt;/a&gt;™ toolset. Now service and applications providers can make A/V communications an intrinsic, readily accessible part of live streaming UX at these performance levels in virtually any scenario.&lt;/p&gt;

&lt;p&gt;The options start with TrueTime Meetings™ default parameters supporting a more scalable, higher quality version of a conventional conferencing solution, with or without appointment requirements. Out of the box, TrueTime Meetings™ offers functionality comparable to Google Meet, with features like live transcription, meeting recording, virtual background replacement, and more. Those looking to customize a solution to meet their goals can leverage the flexibility of &lt;a href="https://github.com/red5pro/red5-truetime-meetings" rel="noopener noreferrer"&gt;this open source software stack&lt;/a&gt; to implement whatever approaches to real-time A/V communications fit their &lt;a href="https://www.red5.net/solutions/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;use cases&lt;/a&gt;. Relying on in-house and/or Red5 development teams, they can create their own ground-breaking communications environments from a deep well of open-sourced TrueTime Meetings™ functionalities together with any of the tools available with &lt;a href="https://www.red5.net/truetime/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;other TrueTime solutions&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;TrueTime Meetings™ tools are available like the other Red5 TrueTime toolsets at no added costs beyond the usage-based pricing that makes the &lt;a href="https://www.red5.net/red5-cloud-low-latency-live-streaming-platform/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 Cloud managed platform-as-a-service&lt;/a&gt; (PaaS) and the self-managed &lt;a href="https://www.red5.net/red5-pro/low-latency-streaming-software/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 Pro&lt;/a&gt; implementations of the company’s Experience Delivery Network (XDN) Architecture a cost-effective approach to &lt;a href="https://www.red5.net/blog/real-time-data-streaming-for-live-video/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;real-time interactive streaming&lt;/a&gt;. In addition, &lt;a href="https://www.red5.net/case-studies/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 customers&lt;/a&gt; have access to major advancements that are shaping the next generation in interactive video communications. These include support for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AI-assisted Large Language Model (LLM) applications ranging from real-time analytics and voice-to-text and text-to-voice language translation to execution of the creative capabilities driven by AI Vision Language Models (VLMs).&lt;/li&gt;
&lt;li&gt;A/V connectivity with use of &lt;a href="https://www.red5.net/h-265-hevc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;HEVC&lt;/a&gt; and, soon, &lt;a href="https://www.red5.net/av1/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;AV1&lt;/a&gt; to attain 4K quality levels and beyond;&lt;/li&gt;
&lt;li&gt;Spatially realistic audio and video experiences with immersive tie-ins to extended reality (XR) applications.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Unique Capabilities of the TrueTime Meetings™ Platform
&lt;/h2&gt;

&lt;p&gt;With TrueTime Meetings™ we’re addressing the need for a video meetings platform with global reach that can be easily configured to accommodate the vast range of socialization strategies that are becoming more and more expected in modern live video streaming experiences. &lt;/p&gt;

&lt;p&gt;While the need for capabilities akin to what’s available from conventional video conferencing platforms will persist, a rapidly expanding share of the interactive video communications strategies coming into play require what’s not available from the conventional platforms, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;adaptability to any strategy,&lt;/li&gt;
&lt;li&gt;unlimited scalability,&lt;/li&gt;
&lt;li&gt;ease of use supporting instantaneous appointment-free participation,&lt;/li&gt;
&lt;li&gt;A/V quality levels matched to current expectations, and&lt;/li&gt;
&lt;li&gt;support for the new technologies reshaping the communications landscape, from AI to innovations like LiDAR, spatial audio and the many permutations of AR and VR.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For the first time, we have made it possible to meet all these requirements with the freedom and cost benefits that come with using the open-source frontend code base underlying the tools and SDKs comprising TrueTime Meetings™. The license-free possibilities range from a default mode that adds unlimited scalability and high A/V quality for users requiring a conventional video conferencing platform to easily structured combinations of TrueTime Meetings™ elements in support of any use case.&lt;/p&gt;

&lt;p&gt;Critically, the tools and functionalities comprising TrueTime Meetings™ don’t entail any kind of forklift departure from the fundamental capabilities that have long characterized XDN Architecture as a foundation for real-time interactive streaming. Abundant &lt;a href="https://www.red5.net/docs/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; on our website describes the full range of XDN capabilities that go into enabling real-time streaming in any direction from any number of originating sources to any number of receivers with end-to-end latencies at or below 250ms over any distance. See, for example, these blogs on the &lt;a href="https://www.red5.net/blog/6-big-reasons-red5s-webrtc-platform-outshines-all-others/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;keys to the XDN performance advantage over all other real-time streaming platforms&lt;/a&gt;, the role Red5 is playing supporting &lt;a href="https://www.red5.net/blog/the-webrtc-revolution-how-red5-and-its-partners-are-powering-interactive-video/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;distributed collaboration in live production&lt;/a&gt;, and the capabilities achieved in collaboration with &lt;a 
href="https://www.red5.net/blog/how-the-red5-ecosystem-is-redefining-real-time-streaming-with-webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 partners&lt;/a&gt; whose solutions have been integrated to work with XDN Architecture.&lt;/p&gt;

&lt;p&gt;Mirroring the convenience that comes with building apps using our other &lt;a href="https://www.red5.net/products/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;toolsets&lt;/a&gt;, including the TrueTime MultiView™, Watch Party™, Studio™ and DataSync™ toolsets Red5 created to facilitate building specific applications on XDN infrastructures, TrueTime Meetings™ consists of a stack of open-source software tools with easy-to-follow guidance on a customer portal. The stack embodies enough functionality to take video communications anywhere a customer wants to go.&lt;/p&gt;

&lt;p&gt;All TrueTime Solutions™ are available to XDN customers at no extra costs, which in the case of TrueTime Meetings™ applies whether customers choose to utilize the tools as structured in the feature-rich video conferencing default mode or to use them to build interactive communications systems precisely tailored to their use cases. Red5 Cloud customers can customize their uses with their own branding through direct access to Meetings in the TrueTime Apps section on their Red5 Cloud dashboards. &lt;/p&gt;

&lt;p&gt;For those deploying in their own environments, Red5 Pro customers can put TrueTime Meetings™ to use by downloading whatever tools they need from the &lt;a href="https://github.com/red5pro/red5-truetime-meetings" rel="noopener noreferrer"&gt;TrueTime apps repository&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;In all cases, customized usage is facilitated through the &lt;a href="https://www.red5.net/docs/red5-cloud/development/sdks/conference-sdk/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 Pro Conference SDK&lt;/a&gt;, which provides a high-level API for building video calling applications while abstracting the complexities of WebRTC management, media handling, and connection management. Settings applied with custom builds can be modified, added or deleted without requiring a full application rebuild.&lt;/p&gt;

&lt;p&gt;Red5 Pro users can launch real-time streaming with customized applications enabled by TrueTime toolsets, including Meetings, using the free Java-coded &lt;a href="https://www.red5.net/red5-media-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Red5 Media Server&lt;/a&gt; with its simple plugin architecture, which has been installed by the likes of Amazon, the U.S. Department of Defense, Akamai, Harvard University and many other entities in over one million instances of XDN usage worldwide. Or they can use their own server software to accomplish the same things. &lt;/p&gt;

&lt;p&gt;Red5 Cloud and Red5 Pro customers can integrate into their existing workflows the pre-packaged TrueTime Meetings™ tools comprising the default video calling mode or they can integrate whichever tools they need to build their own custom calling applications. This helps to ensure that whatever approaches they take, the calling functions work in holistic alignment with their real-time streaming operations. &lt;/p&gt;

&lt;p&gt;TrueTime Meetings™ users also benefit from &lt;a href="https://www.red5.net/blog/ai-in-live-streaming/#how-ai-can-be-utilized-in-live-streaming?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;our latest support for AI&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;AI-assisted multilingual speech-to-text captioning and large language model (LLM) multilingual generation of speech from text are included in the TrueTime Meetings™ default mode with the Red5 Cloud and Red5 Pro Enterprise subscription tiers. They are also available on a negotiated per-user cost or no-cost basis with the Red5 Cloud &lt;a href="https://www.red5.net/red5-cloud-pricing/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Pay-as-You-Grow and Growth&lt;/a&gt; plans and the Red5 Pro &lt;a href="https://www.red5.net/pricing/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Developer Pro, Start-Up and Growth Pro&lt;/a&gt; plans. &lt;/p&gt;

&lt;p&gt;More broadly, use of TrueTime Meetings™ is enhanced by virtue of Red5’s integration of XDN Architecture with an &lt;a href="https://www.red5.net/blog/boundless-possibilities-revealed-in-ai-integrations-with-real-time-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;ever-expanding array of LLMs and VLMs&lt;/a&gt;, which are available to work with any Meeting use case. At the same time, customers can arrange for integration of any other AI models that employ open interfaces compatible with the industry standards used with Red5 APIs.&lt;/p&gt;

&lt;p&gt;Adding to the AI functionalities available to TrueTime Meetings™ users, for the first time anywhere we’re &lt;a href="https://www.red5.net/blog/ai-detection-is-set-to-transform-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;enabling virtually instant extraction of A/V frames&lt;/a&gt; at any frequency down to sub-second intervals from content delivered to Red5 Cloud infrastructure over any supported ingest, including &lt;a href="https://www.red5.net/webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;WebRTC&lt;/a&gt; / &lt;a href="https://www.red5.net/whip-and-whep/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;WHIP&lt;/a&gt;, &lt;a href="https://www.red5.net/srt-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;SRT&lt;/a&gt;, &lt;a href="https://www.red5.net/zixi-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Zixi&lt;/a&gt;, &lt;a 
href="https://www.red5.net/rtmp-server/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;RTMP&lt;/a&gt;, and &lt;a href="https://www.red5.net/rtsp-protocol/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;RTSP&lt;/a&gt;. This is vital to thorough analysis of payloads in surveillance operations and in any other real-time streaming scenarios involving granular enhancements to metadata, avoiding unwanted content elements and monitoring user behavior.&lt;/p&gt;

&lt;h3&gt;
  
  
  TrueTime Meetings™ Video Calling Default Mode
&lt;/h3&gt;

&lt;p&gt;For customers who simply want to be able to support a better version of conventional video conferencing systems, the TrueTime Meetings™ default mode is easily activated by responding to prompts on their dashboards. The automated video calling setup creates an easy-to-manage conferencing adjunct to any streaming scenario. &lt;/p&gt;

&lt;p&gt;The platform can be used as a standalone center for appointment-based group meetings or as a video communications overlay integrated for access as an appointment-free component of the customer’s real-time streaming UX. TrueTime Meetings™ can also be activated with a seamless transfer from an HTTP-streamed &lt;a href="https://www.red5.net/solutions/sports-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;live sports&lt;/a&gt; or &lt;a href="https://www.red5.net/case-studies/red5-cloud-zixi-real-time-monitoring-and-streaming-solution-for-live-event-production/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;other event&lt;/a&gt; to real-time XDN streaming to support viewer participation in video conversations during post-event programming.&lt;/p&gt;

&lt;p&gt;Whatever approach is taken in the use of XDN Architecture, TrueTime Meetings™ allows viewers, remote commentators or any other category of collaborators and participants to be brought into video conversations associated with webinars, trade shows, live sports and &lt;a href="https://www.red5.net/blog/truetime-meetings-for-news/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;newscasts&lt;/a&gt;, esports competitions, game shows, &lt;a href="https://www.red5.net/blog/red5-cloud-integrates-pubnub-to-deliver-interactivity-intelligence-global-scalability-for-real-time-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;live-shopping programs&lt;/a&gt; – the list is endless. And the platform can be used in audio-only group meeting applications as in the case of live-streamed talk radio programs and audio podcasts.&lt;/p&gt;

&lt;p&gt;Some of the features included in the core default video conferencing mode:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Scalability –&lt;/strong&gt; Multiuser participation with up to 30 on-screen users at any one time, plus managed access for any number of additional participants extending into the millions as Meetings users are rotated in and out of a given streaming session.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Participation Monitoring in Appointment-Free Applications –&lt;/strong&gt; Conferencing managers’ ability to track what’s happening on the platform is facilitated by automated participant updates when users enter or leave a session and by signals that let organizers know when someone turns a camera on or off or mutes or unmutes a microphone.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Video Quality Control –&lt;/strong&gt; The &lt;a href="https://www.red5.net/h264/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;H.264&lt;/a&gt; default encoding used with TrueTime Meetings™ can support any level of screen resolution through management of the bandwidth allocated to the A/V streams. Whereas conventional conferencing platforms typically top out at 720p, with some charging premium fees to enable HD 1080p, display resolutions in TrueTime Meetings™ sessions can surge to 1080p HD and beyond based on whatever bandwidth is made available by session managers. Side-by-side comparisons with conventional streaming show that at any given bandwidth allocation, image clarity in the Meetings display windows is much greater than the alternative. Moreover, the quality is persistent with the real-time streaming support provided by XDN architecture, which frequently isn’t the case with the freezes and fluctuations common to HTTP-based streaming.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio Control –&lt;/strong&gt; Along with benefitting from superior consistency in audio performance mirroring what’s achieved with video on the TrueTime Meetings™ real-time XDN streaming infrastructure, session managers can keep tabs on user-controlled audio levels to ensure satisfactory UX on the part of all participants. Exploiting the Meeting software client’s periodic registering of each participant’s audio level, managers can activate a “talking indicator” to display how audio is playing out in conference conversations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Support for Text and Emoji Messaging, Screen Sharing, Hand Raising and Meeting Recordings –&lt;/strong&gt; All of these functionalities are built into the default platform, eliminating the need to engage third-party services or in-house developers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Layout Flexibility –&lt;/strong&gt; Operating in auto mode, the TrueTime Meetings™ platform adjusts the display layout based on the participant count. Alternatively, session managers have other options allowing them to set up tiled layouts that allocate equal-sized video tiles to all participants or implement sidebar layouts where the main speaker is prominently displayed with others in a participant sidebar. They can also choose hands-on control over shifts in participant focus throughout the session.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;UI Styling –&lt;/strong&gt; The TrueTime Meetings™ software stack includes a broad range of styling and branding options that can be easily implemented on user dashboards.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Custom Use of TrueTime Meetings™
&lt;/h3&gt;

&lt;p&gt;Beyond the default mode, TrueTime Meetings™ can be configured in unique ways as an integral part of any real-time streaming scenario to enable usage formats that depart from the traditional video conferencing structure. Of course, all the capabilities enumerated above for the default mode can be applied in customized applications as well.&lt;/p&gt;

&lt;p&gt;Here is where TrueTime Meetings™ meets the full scope of challenges in the new live streaming era, unleashing the unlimited possibilities that emerge when interactive video communications and multicast streams share the same real-time infrastructure. In this hybrid environment, any viewer in a live-streamed video audience of any size extending into the millions can seamlessly join in interactive video communications as configured on the TrueTime Meetings™ platform. &lt;/p&gt;

&lt;p&gt;In commercial operations, ready-to-deploy face-to-face group interactions can be implemented with real-time XDN streaming in support of dispersed collaboration across a vast range of pursuits, from live sports productions, coordinated public safety engagements in emergency surveillance, engineering and architectural design, and much else in the workaday world to every type of education and training environment. Anyone anywhere can be connected into the global Red5 Cloud XDN for a role as commentator or influencer in any of these use cases. &lt;/p&gt;

&lt;p&gt;Adding to the cost-free versatility, our customers can work with other TrueTime Solutions™ integrated in their workflows to enhance their video calling with additional features related to multiviewing, watch parties, live production and use of telemetric data feeds in sports streaming and surveillance. Moreover, TrueTime Meetings™ users can integrate interactive video communications with the interactive data streaming capabilities enabled by the partnership between Red5 and global data platform operator &lt;a href="https://www.red5.net/partners/pubnub/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;PubNub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Aggregations of meeting participants can be displayed anywhere. For example, our customers might prefer to put the TrueTime Meetings™ participation grid on display in a venue setting that makes users part of what’s transpiring at the location. LED walls used with live-streamed game shows, live shopping, reality TV, sports shows with call-in formats, and much else can display large numbers of participants while prominently projecting speakers chosen by the host. &lt;/p&gt;

&lt;p&gt;Or maybe there’s no interest in creating multi-participant grids at all, as when customers simply want to accord a video presence to individuals speaking in Q&amp;amp;A sessions during corporate earnings calls and other types of meetings. Or, in another example of unitary displays involving big venues like sports stadiums, in-venue and remote smartphone users alike can participate in a mass shared experience where video generated from anyone’s phone at any moment can be displayed on the stadium screens, as is the case in multiple locations where &lt;a href="https://www.red5.net/case-studies/the-famous-group-modernized-vixi-suite-fan-streaming-with-red5-at-large-scale-venues/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;The Famous Group is bringing such experiences alive in partnership with Red5&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Looking at another use case: ever since the COVID-19 pandemic, there has been interest in extending virtualized functionality to remote participants in webinars and location-based trade shows, but early attempts at supporting such use cases proved too cumbersome and costly to take hold. Now it’s a different story for our XDN customers, who can use the open-source TrueTime Meetings™ tools to create multi-dimensional virtual extensions of their events, featuring things like private rooms for booked meetings between remote attendees and vendors. &lt;/p&gt;

&lt;h3&gt;
  
  
  The Spatial Audio Option
&lt;/h3&gt;

&lt;p&gt;An unmatched level of realism can be infused into group social interactions through our support for &lt;a href="https://www.red5.net/blog/spatial-audio-webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;spatial audio&lt;/a&gt;. In all situations involving virtualized group interactions, including virtual casinos, virtual sports bars and other online social meeting places, customers who utilize TrueTime Meetings™ can benefit from the value-added spatial audio solution whether or not XR headgear is involved.&lt;/p&gt;

&lt;p&gt;We have taken immersive spatial audio to a new level of versatility and scalability by exploiting unique characteristics of XDN Architecture to maintain real-time configurations of realistic audio experiences in 3D space no matter how many users are engaged or how complex the ambient environment might be. Individualized mixes of ambient sound are applied to realistically reflect each user’s position in the virtual space while personal conversational sound levels are raised with one-on-one communications.&lt;/p&gt;
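&lt;p&gt;As a rough illustration of how an individualized mix can be derived from position (a generic sketch only, not Red5’s Cauldron/Brews implementation), per-listener gain and panning might be computed from the distance and direction of each sound source in the shared virtual space:&lt;/p&gt;

```python
import math

def spatial_gains(listener, source, max_dist=20.0):
    """Toy per-listener mix: attenuate by distance, pan by direction.

    Positions are (x, y) coordinates in a shared virtual space. This is
    a generic illustration, not Red5's Cauldron/Brews implementation.
    """
    dx, dy = source[0] - listener[0], source[1] - listener[1]
    dist = math.hypot(dx, dy)
    gain = max(0.0, 1.0 - dist / max_dist)   # linear falloff with distance
    angle = math.atan2(dx, dy)               # left/right offset of the source
    pan = (math.sin(angle) + 1.0) / 2.0      # 0.0 = hard left, 1.0 = hard right
    return gain * (1.0 - pan), gain * pan    # (left_gain, right_gain)

# A source directly ahead of the listener is centered in the mix.
left, right = spatial_gains(listener=(0, 0), source=(0, 5))
```

&lt;p&gt;A production engine would apply smoother attenuation curves, head-related transfer functions for XR headgear, and conversational boosts for one-on-one exchanges; the point is simply that each participant receives an individualized mix computed from position.&lt;/p&gt;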

&lt;p&gt;A key element in the XDN toolset that makes this possible is our &lt;a href="https://www.red5pro.com/blog/what-is-cauldron-what-are-brews/" rel="noopener noreferrer"&gt;Cauldron transcoding engine&lt;/a&gt;, which supports blending of personally directed stream segments with the primary streams in real time. These segments, which we call Brews, can be added in the transcoding process without incurring delays usually caused by the need to translate coding to the languages understood by processors. &lt;/p&gt;

&lt;p&gt;As for instances involving XR headgear, these and other unique attributes together with the simultaneous &lt;a href="https://www.red5.net/blog/what-is-ultra-low-latency-why-does-it-matter/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;ultra-low latency&lt;/a&gt; connecting all participants in AR and VR use cases running on XDN Architecture ensure that TrueTime Meetings™ is readily available to support any socialized XR use case. That means advancements like spatial audio, light detection and ranging (LiDAR) used to add realistic dimensionality to visual displays, and any iteration of spatial computing technology can be effortlessly brought to bear as our customers adapt to next-gen operations in the new world of real-time interactive video streaming. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As the flexibility and scalability in interactive video communications enabled by our TrueTime Meetings™ take hold over the next few years, interactive video communications across all consumer and non-consumer market segments are likely to claim a much bigger portion, if not the lion’s share, of the $107.0-billion interactive streaming market &lt;a href="https://www.grandviewresearch.com/industry-analysis/interactive-streaming-market-report" rel="noopener noreferrer"&gt;Grand View Research projects&lt;/a&gt; will be in play by 2030. Leveraging the readily deployable capabilities facilitating implementations of the open-source TrueTime Meetings™ tools through Red5 Cloud or Red5 Pro, operators of live-streamed use cases can act now to bring the new era in video communications to life in their domains.&lt;/p&gt;

&lt;h2&gt;
  
  
  Try Red5 TrueTime Meetings™ today
&lt;/h2&gt;

&lt;p&gt;Option 1: &lt;/p&gt;

&lt;p&gt;Download the source from GitHub&lt;/p&gt;

&lt;p&gt;Get the open source code, customize the front end, and deploy in your own environment. This is the best path if you want full control and deeper product integration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/red5pro/red5-truetime-meetings" rel="noopener noreferrer"&gt;View on GitHub &amp;gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Option 2:&lt;/p&gt;

&lt;p&gt;Launch it in Red5 Cloud&lt;/p&gt;

&lt;p&gt;Run TrueTime Meetings™ on fully managed Red5 Cloud infrastructure through the TrueTime Apps section of your dashboard. This is the fastest path if you want to get up and running without managing your own servers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.red5.net/signup?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Sign Up for Red5 Cloud &amp;gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Option 3:&lt;/p&gt;

&lt;p&gt;Deploy with Red5 Pro&lt;/p&gt;

&lt;p&gt;Prefer a self-managed setup? Red5 Pro users can download the source and deploy TrueTime Meetings™ in their own infrastructure, then extend it for their application and workflow needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.red5.net/contact/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=TrueTime%20Meetings%E2%84%A2%3A%20Video%20Calling%20Built%20for%20Real-Time%20Streaming" rel="noopener noreferrer"&gt;Schedule a consultation &amp;gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>opensource</category>
    </item>
    <item>
      <title>SRT vs MOQT: Low-Latency Video Transport Comparison</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Tue, 10 Mar 2026 13:31:31 +0000</pubDate>
      <link>https://dev.to/red5/srt-vs-moqt-low-latency-video-transport-comparison-9fp</link>
      <guid>https://dev.to/red5/srt-vs-moqt-low-latency-video-transport-comparison-9fp</guid>
      <description>&lt;p&gt;Questions about SRT vs MOQT come up often when engineers evaluate low latency video transport options. As a Lead Real-Time Video Architect working on MOQ development at Red5, I ran into this question during architecture discussions and realized others will likely face the same comparison soon. This article is written from my perspective and has also been reviewed and verified by my teammate &lt;a href="https://www.red5.net/author/paul-gregoire/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;Paul Gregoire&lt;/a&gt;, Red5 Solutions Architect.&lt;/p&gt;

&lt;p&gt;If you want a quick summary, read the Key Takeaways section. If you want a deeper technical comparison, continue through the rest of the blog.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Takeaways
&lt;/h2&gt;

&lt;p&gt;SRT is a well-established, reliable protocol for video contribution, offering strict end-to-end latency controls. However, its payload-agnostic approach to packet dropping can introduce playback instability under severe congestion, and its single-stream architecture can still create Head-of-Line blocking conditions, limiting its applicability to modern SVC workflows.&lt;/p&gt;

&lt;p&gt;MOQT, which pairs flexible streaming formats with independent QUIC streams, provides equivalent latency controls while enabling granular, payload-aware data handling and discard strategies. Utilizing parallel streams, isolated packet-loss recovery, and priority-based delivery, it can safely drop late data and natively support SVC adaptation. The protocol’s architecture is highly optimized for resilient, low-latency, bandwidth-efficient media distribution.&lt;/p&gt;

&lt;h2&gt;
  
  
  Baseline for Comparison
&lt;/h2&gt;

&lt;p&gt;When comparing Secure Reliable Transport (&lt;a href="https://www.red5.net/srt-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;SRT&lt;/a&gt;) and Media over QUIC Transport (&lt;a href="https://www.red5.net/media-over-quic-moq/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;MOQT&lt;/a&gt;), it is important to establish equivalent architectural layers. Technically, SRT is a payload-agnostic transport protocol. However, in standard broadcast and streaming workflows, it is predominantly used to carry multiplexed MPEG-TS (Transport Stream) payloads. &lt;/p&gt;

&lt;p&gt;MOQT is an end-to-end media transport protocol designed to operate in conjunction with a wide range of application-layer streaming formats (current drafts define &lt;a href="https://datatracker.ietf.org/doc/draft-ietf-moq-msf/" rel="noopener noreferrer"&gt;MOQT Streaming Format (MSF)&lt;/a&gt; and &lt;a href="https://datatracker.ietf.org/doc/draft-ietf-moq-cmsf/" rel="noopener noreferrer"&gt;CMAF MSF (CMSF)&lt;/a&gt;). The streaming format and related container structure provide the necessary media metadata, including timestamps. Therefore, a functional comparison for video delivery is best framed as &lt;strong&gt;SRT + MPEG-TS&lt;/strong&gt; versus &lt;strong&gt;MOQT + CMSF&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; SRT and MOQT occupy different use-case spaces, with overlap primarily on the contribution side; SRT is generally not considered for distribution at scale to end consumers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Packet Scheduling and Multiplexing
&lt;/h2&gt;

&lt;p&gt;Under network congestion, transport protocols must manage how data is queued and transmitted through restricted bandwidth.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SRT Scheduling:&lt;/strong&gt; SRT processes packets in a primarily First-In, First-Out (FIFO) sequence over a single UDP connection. Because it does not natively inspect the payload, it cannot differentiate between critical media (like base video layers or audio) and supplemental media (like video enhancement layers) without custom application layer multiplexing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MOQT Scheduling:&lt;/strong&gt; MOQT incorporates a prioritization model utilizing various priority parameters per stream. This allows the sender and intermediate relays to identify the relative importance of different media components. Under bandwidth constraints, an MOQT relay can use this logic to selectively delay or drop lower-priority streams to ensure the timely delivery of higher-priority streams.&lt;/li&gt;
&lt;/ul&gt;
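&lt;p&gt;The contrast can be reduced to a toy model (stream names, sizes and priority numbers are made up; real MOQT priority semantics are richer): a payload-agnostic FIFO sender transmits in arrival order until bandwidth runs out, while a priority-aware relay fills the budget from the most important stream down:&lt;/p&gt;

```python
def fifo_send(queue, budget_kbit):
    """SRT-style: payload-agnostic, in-order; stop when bandwidth runs out."""
    sent = []
    for name, size, _prio in queue:
        if size > budget_kbit:
            break                    # in-order delivery cannot skip ahead
        sent.append(name)
        budget_kbit -= size
    return sent

def priority_send(queue, budget_kbit):
    """MOQT-style: most important first (lower number = higher priority)."""
    sent = []
    for name, size, prio in sorted(queue, key=lambda s: s[2]):
        if size <= budget_kbit:      # lower-priority streams may be dropped
            sent.append(name)
            budget_kbit -= size
    return sent

# (stream, size_kbit, priority): the enhancement layer arrived first,
# but audio and the base video layer matter more for playback.
queue = [("enhancement", 600, 2), ("base_video", 500, 1), ("audio", 100, 0)]

fifo_send(queue, budget_kbit=700)      # ['enhancement']: audio and base starve
priority_send(queue, budget_kbit=700)  # ['audio', 'base_video']
```

&lt;p&gt;The sketch only captures the core idea: payload awareness lets the less important data be sacrificed first when bandwidth is constrained.&lt;/p&gt;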

&lt;h2&gt;
  
  
  Packet Loss Recovery and Head-of-Line Blocking
&lt;/h2&gt;

&lt;p&gt;Both SRT and MOQT (via QUIC) use Automatic Repeat reQuest (ARQ) to recover lost packets. When a packet is dropped by the network, the receiver asks the sender to retransmit it. The operational difference lies in how this recovery impacts the rest of the data in transit.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SRT (Head-of-Line Blocking):&lt;/strong&gt; SRT transmits its payload sequentially over a single connection. If a packet is lost, the receiver must hold all subsequent packets in a buffer until the lost packet is retransmitted and successfully arrives. This creates Head-of-Line (HoL) Blocking: because all media (audio, video base layer, video enhancement layer) shares this single pipeline, a single lost network packet stalls delivery of the entire transport stream, increasing latency across the board. The stall is bounded, however: if the retransmission delay exceeds the configured latency buffer, SRT drops the packet entirely, allowing the stream to continue rather than waiting indefinitely.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MOQT (Independent Stream Recovery):&lt;/strong&gt; MOQT relies on QUIC’s multiplexed stream architecture. Because different media components (e.g., audio and video) are mapped to independent QUIC streams, packet loss is isolated. If a packet containing video data is lost, only the specific video stream waits for a retransmission. The audio stream, operating on a parallel QUIC stream, continues to deliver data to the application without interruption. This prevents a single network packet loss from stalling the entire media presentation.&lt;/li&gt;
&lt;/ul&gt;
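&lt;p&gt;The difference can be sketched with a toy in-order delivery model (illustrative only; real SRT and QUIC receivers are far more involved). On a single ordered stream, a gap holds back everything behind it; with independent streams, the loss is isolated to one stream:&lt;/p&gt;

```python
def deliverable(packets, lost):
    """Return payloads an application can read from one ordered stream.

    Delivery stops at the first gap: everything after a lost sequence
    number sits in the receive buffer (head-of-line blocked) until the
    retransmission arrives. Toy model; real SRT/QUIC receivers differ.
    """
    out = []
    for seq, payload in packets:
        if seq in lost:
            break  # gap: later packets wait behind the retransmission
        out.append(payload)
    return out

# Single SRT-like stream: audio and video interleaved in one sequence.
single = [(1, "audio-1"), (2, "video-1"), (3, "audio-2"), (4, "video-2")]
deliverable(single, lost={2})   # ['audio-1']: audio-2 is blocked by a video loss

# MOQT-like: audio and video ride independent streams, so loss is isolated.
audio = [(1, "audio-1"), (2, "audio-2")]
video = [(1, "video-1"), (2, "video-2")]
deliverable(audio, lost=set())  # ['audio-1', 'audio-2']: audio is unaffected
deliverable(video, lost={1})    # []: only video waits for the retransmission
```

&lt;p&gt;In practice SRT bounds the stall with its latency window, but within that window every stream in the multiplex shares the penalty.&lt;/p&gt;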

&lt;h2&gt;
  
  
  Handling Late Data under Congestion
&lt;/h2&gt;

&lt;p&gt;When network delays cause data to arrive past its intended playback deadline, the two &lt;a href="https://www.red5.net/video-streaming-technology/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;protocols&lt;/a&gt; use different mechanisms to discard that data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SRT (Network-Level Dropping):&lt;/strong&gt; SRT utilizes a configured latency buffer. If a packet cannot be delivered within this timeframe, SRT’s Too-Late Packet Drop (TLPKTDROP) mechanism discards it. Because SRT drops data at the network packet level without payload awareness, this can result in the delivery of partial media frames. In an MPEG-TS workflow, this fragmentation can lead to decoder errors or visual artifacts, potentially persisting until the next keyframe.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MOQT (Application-Aware Dropping):&lt;/strong&gt; MOQT relies on a feedback loop between the application’s Streaming Format and the QUIC transport layer. The application layer evaluates the Presentation Timestamp (PTS); if a frame exceeds its playback deadline, it instructs the transport layer to issue a QUIC STOP_SENDING frame. MOQT then discards the complete, semantic media unit via a RESET_STREAM operation. This preserves the structural integrity of the remaining video streams and avoids corrupting the decoder.&lt;/li&gt;
&lt;/ul&gt;
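&lt;p&gt;A minimal sketch of the application-aware side, using a hypothetical &lt;code&gt;Frame&lt;/code&gt; type rather than any actual MOQT object model: compare each frame’s presentation timestamp against the playback deadline and discard whole units when they are late:&lt;/p&gt;

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pts_ms: int   # presentation timestamp in milliseconds
    data: bytes

def filter_late(frames, now_ms, latency_budget_ms):
    """Drop complete frames whose PTS has passed the playback deadline.

    Discarding whole semantic units (as MOQT does via STOP_SENDING and
    RESET_STREAM) means the decoder never sees partial frames, unlike
    packet-level drops that can leave fragments behind. `Frame` is a
    hypothetical type for illustration, not part of any MOQT library.
    """
    deadline = now_ms - latency_budget_ms
    kept, dropped = [], []
    for f in frames:
        (kept if f.pts_ms >= deadline else dropped).append(f)
    return kept, dropped

frames = [Frame(1000, b"a"), Frame(1100, b"b"), Frame(1400, b"c")]
kept, dropped = filter_late(frames, now_ms=1500, latency_budget_ms=300)
# frames with PTS older than the 1200 ms deadline are discarded whole
```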

&lt;h2&gt;
  
  
  Support for Video Adaptation (ABR and SVC)
&lt;/h2&gt;

&lt;p&gt;Modern video delivery relies on Adaptive Bitrate (ABR) and Scalable Video Coding (SVC) to adjust to changing network conditions.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SRT and SVC:&lt;/strong&gt; Because SRT typically carries a single, multiplexed MPEG-TS stream, all SVC layers (base resolution and enhancement details) share the same transport queue. If the network drops a base layer packet while successfully delivering an enhancement layer packet, the enhancement data cannot be decoded, limiting the practical effectiveness of SVC over a standard SRT link.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MOQT and SVC/ABR Integration:&lt;/strong&gt; MOQT maps SVC layers to independent QUIC streams (Subgroups), facilitating two types of adaptation:

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Layer Dropping (SVC):&lt;/strong&gt; During transient network drops, MOQT relays autonomously discard low-priority enhancement Subgroups. The player experiences a temporary reduction in quality while maintaining uninterrupted playback.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Track Switching (ABR):&lt;/strong&gt; For sustained changes in network capacity, the client can issue a SUBSCRIBE command for a lower-bitrate MOQT Track. MOQT processes these switches at defined Group boundaries (Keyframes), providing clean transitions between quality tiers.&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;/ul&gt;
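&lt;p&gt;Layer dropping can be sketched as a simple capacity check over a base-first ladder (layer names and bitrates here are made up for illustration):&lt;/p&gt;

```python
def select_layers(layers, capacity_kbps):
    """Keep SVC layers in dependency order while cumulative bitrate fits.

    Because enhancement layers depend on the base layer, layers are
    considered base-first; congestion always trims from the top.
    """
    chosen, used = [], 0
    for name, kbps in layers:        # ordered base -> highest enhancement
        if used + kbps <= capacity_kbps:
            chosen.append(name)
            used += kbps
        else:
            break                    # higher layers depend on lower ones
    return chosen

ladder = [("base_360p", 800), ("enh_720p", 1500), ("enh_1080p", 2500)]
select_layers(ladder, capacity_kbps=4800)  # all three layers fit
select_layers(ladder, capacity_kbps=2500)  # base + 720p enhancement
select_layers(ladder, capacity_kbps=1000)  # base layer only
```

&lt;p&gt;This mirrors how a relay can keep playback uninterrupted at reduced quality during transient drops, while sustained capacity changes are better served by re-subscribing to a different Track at a Group boundary.&lt;/p&gt;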

&lt;h2&gt;
  
  
  Integration with Transcoding Workflows
&lt;/h2&gt;

&lt;p&gt;The integration of these protocols into &lt;a href="https://www.red5.net/blog/what-is-transcoding/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;transcoding&lt;/a&gt; pipelines involves a tradeoff between current ecosystem support and architectural efficiency.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SRT Ecosystem Maturity:&lt;/strong&gt; SRT combined with MPEG-TS is a mature, widely adopted standard. It possesses extensive support across legacy hardware encoders, software transcoders (e.g., FFmpeg), and existing cloud broadcast infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MOQT Processing Efficiency:&lt;/strong&gt; SRT’s monolithic payload requires transcoders to ingest, demultiplex, and decode the entire Transport Stream before processing. By contrast, MOQT’s architecture separates media into independent Tracks and Subgroups. This allows a modern transcoder to selectively ingest only the required streams (e.g., processing a 1080p base layer while actively ignoring higher-resolution streams), offering a more compute-efficient pipeline as the software ecosystem matures.&lt;/li&gt;
&lt;/ul&gt;
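&lt;p&gt;The ingest difference can be sketched with a toy cost model. This is a hypothetical illustration (the track names and functions are invented, not a real API): an SRT/MPEG-TS transcoder must receive the entire multiplex before it can work on any one stream, while an MOQT transcoder can subscribe only to the Tracks a given job needs.&lt;/p&gt;

```python
# Hypothetical sketch contrasting ingest models. SRT/MPEG-TS is monolithic:
# every elementary stream arrives whether needed or not. MOQT lets the
# transcoder SUBSCRIBE to individual Tracks. Names are illustrative only.

CATALOG = {
    "video-540p":  {"kind": "video", "bitrate_kbps": 1200},
    "video-1080p": {"kind": "video", "bitrate_kbps": 2500},
    "video-2160p": {"kind": "video", "bitrate_kbps": 6000},
    "audio-main":  {"kind": "audio", "bitrate_kbps": 128},
}

def srt_ingest_cost(catalog):
    """MPEG-TS over SRT: the transcoder receives every stream in the mux."""
    return sum(t["bitrate_kbps"] for t in catalog.values())

def moqt_ingest_cost(catalog, wanted):
    """MOQT: subscribe only to the Tracks this transcoding job requires."""
    return sum(catalog[name]["bitrate_kbps"] for name in wanted)

print(srt_ingest_cost(CATALOG))                                  # 9828 kbps
print(moqt_ingest_cost(CATALOG, ["video-1080p", "audio-main"]))  # 2628 kbps
```

&lt;p&gt;In this toy model, a job that only needs the 1080p layer plus audio ingests roughly a quarter of the bandwidth the monolithic mux would force on it, which is the compute and network saving the selective-ingest argument rests on.&lt;/p&gt;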

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In summary, the SRT vs MOQT comparison highlights the difference between a mature contribution protocol built around a single transport stream and a newer architecture designed for multiplexed, media-aware delivery. SRT remains widely used and reliable, while MOQT introduces transport-level capabilities that align better with modern scalable video workflows and adaptive streaming models.&lt;/p&gt;

&lt;p&gt;If you want to explore how MOQ compares with other real time delivery approaches, read our related blog on &lt;a href="https://www.red5.net/blog/moq-vs-webrtc/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=SRT%20vs%20MOQT%3A%20Low-Latency%20Video%20Transport%20Comparison" rel="noopener noreferrer"&gt;MOQ vs WebRTC&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>srt</category>
      <category>moq</category>
    </item>
    <item>
      <title>Why The Term “Smart City Surveillance” Needs An Update In The Era Of AI And 5G</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Wed, 04 Mar 2026 14:19:56 +0000</pubDate>
      <link>https://dev.to/red5/why-the-term-smart-city-surveillance-needs-an-update-in-the-era-of-ai-and-5g-1241</link>
      <guid>https://dev.to/red5/why-the-term-smart-city-surveillance-needs-an-update-in-the-era-of-ai-and-5g-1241</guid>
      <description>&lt;p&gt;Smart city surveillance is rapidly evolving beyond traditional monitoring, thanks to real-time streaming technologies, AI, and 5G. In this blog, based on &lt;a href="https://www.linkedin.com/posts/thechrisallen%5Fchrisallentalks-red5-livestreaming-activity-7350917602220625921-0Dvl/" rel="noopener noreferrer"&gt;my recent LinkedIn post&lt;/a&gt;, you’ll learn the definition of smart cities and how the concept is evolving in 2026, gain an overview of the market, see how AI is revolutionizing this space, and discover what technology capabilities are now available to transportation agencies, public safety departments, disaster response teams, and other groups using smart city solutions.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Is A Smart City?
&lt;/h2&gt;

&lt;p&gt;A smart city is an urban area that uses smart technologies like ICT, IoT, and &lt;a href="https://www.red5.net/solutions/smart-city-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;real-time streaming&lt;/a&gt; to enhance services and improve quality of life. With connected cameras, sensors, and software, cities can monitor feeds, detect incidents, and balance safety with privacy. These technologies integrate data-driven decision-making into everyday operations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Smart City Market Overview
&lt;/h3&gt;

&lt;p&gt;According to &lt;a href="https://www.statista.com/outlook/tmo/internet-of-things/smart-cities/united-states" rel="noopener noreferrer"&gt;Statista&lt;/a&gt;, the United States is at the forefront of smart city development, with cities like New York and San Francisco leading the way in implementing advanced technologies for efficient infrastructure and improved quality of life. Globally, the United States is projected to generate the highest revenue, with US$27.06 billion expected in 2025. Revenue is anticipated to grow at an annual rate (CAGR 2025–2029) of 7.27 percent, reaching a market volume of US$35.84 billion by 2029.&lt;/p&gt;

&lt;p&gt;The U.S. government is also putting real momentum behind smart city innovation. Congress passed the &lt;a href="https://www.congress.gov/bill/118th-congress/house-bill/9892/text" rel="noopener noreferrer"&gt;Smart Cities and Communities Act of 2024&lt;/a&gt; and followed with the &lt;a href="https://www.congress.gov/bill/119th-congress/house-bill/4649/text" rel="noopener noreferrer"&gt;2025 version&lt;/a&gt;, both designed to fund projects, create standards, and boost cybersecurity. The Department of Transportation is running the &lt;a href="https://www.transportation.gov/grants/SMART" rel="noopener noreferrer"&gt;SMART Grants Program&lt;/a&gt;, channeling resources into connected transportation and safety. On top of that, agencies like &lt;a href="https://www.nist.gov/news-events/news/2024/01/nist-requesting-public-input-published-strategic-plan-smart-cities-program" rel="noopener noreferrer"&gt;NIST&lt;/a&gt; and &lt;a href="https://www.nsf.gov/cise/smart-connected-communities" rel="noopener noreferrer"&gt;NSF&lt;/a&gt; are pushing research that takes these ideas well beyond big cities and into the communities where people actually live and work.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Future Of Smart Cities
&lt;/h2&gt;

&lt;p&gt;I’ve been thinking about that a lot lately as we’ve worked on projects that go well beyond the traditional “smart city” pitch deck.&lt;/p&gt;

&lt;p&gt;With modern AI and real-time video infrastructure combined with growing 5G connectivity, we’re no longer limited to passive surveillance or slow post-event analysis. Today, systems can detect critical conditions as they unfold, so humans can intervene faster, or in some cases, not intervene at all.&lt;/p&gt;

&lt;p&gt;One problem I have with the term “smart cities” is that so many of the use cases have nothing to do with cities at all. We’re not just talking about &lt;a href="https://www.red5.net/solutions/streaming-for-video-surveillance-and-public-safety/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;big city surveillance&lt;/a&gt;. These “smart city” systems are already being used by regional &lt;a href="https://www.red5.net/solutions/video-streaming-for-traffic-monitoring/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;transportation agencies&lt;/a&gt;, &lt;a href="https://www.red5.net/solutions/drone-public-safety/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;public safety departments&lt;/a&gt;, and disaster response teams, which cover rural areas too. “Smart cities” was probably the right term when it was created, but times have changed. With today’s low-cost cameras, 5G connectivity, and cloud-hybrid infrastructure, it’s possible to put intelligence at the edge everywhere.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Does AI Improve Smart City Surveillance?
&lt;/h2&gt;

&lt;p&gt;In these new deployments, &lt;a href="https://www.red5.net/blog/how-ai-based-video-analytics-enhances-video-surveillance/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;AI-based video analytics&lt;/a&gt; does a lot of the heavy lifting in video surveillance. The job of the operator becomes less about staring at feeds and more about responding to actionable insight. For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cameras detect crowd congestion in a stadium and reroute foot traffic.&lt;/li&gt;
&lt;li&gt;A traffic feed flags a stopped vehicle inside a tunnel and alerts rescue teams in real time.&lt;/li&gt;
&lt;li&gt;Edge-based systems detect early signs of wildfires or floods and trigger rapid response, potentially saving lives.&lt;/li&gt;
&lt;li&gt;Teams can even record and share key footage as on-demand video with law enforcement to aid in investigations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I talked about this &lt;a href="https://rtcon.live/speakers/chris-allen-wo9gd" rel="noopener noreferrer"&gt;at the RTC.On&lt;/a&gt; conference in Poland in September 2025. &lt;a href="https://rtcon.live/speakers/chris-allen-wo9gd" rel="noopener noreferrer"&gt;My talk&lt;/a&gt;, “The Future in Focus: AI and the Next Wave of Real-Time Video Intelligence”, explored how large language models (LLMs) and &lt;a href="https://www.red5.net/docs/red5-pro/development/api/mixer/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;Red5 Pro’s Brew API&lt;/a&gt; can be used to detect everything from crowd control issues and forest fires to firearms and inappropriate content in live user streams. I also walked through real-world architectures and what’s coming next in this space. If you’re interested in learning more, watch the recording on YouTube.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cameras Everywhere? Finding The Right Balance
&lt;/h2&gt;

&lt;p&gt;“What about Big Brother?” you might ask. “Should we have cameras like this everywhere?” This is honestly a difficult question. As the saying goes, every new technology can be used for good and for evil. Plus, it’s pretty clear that the cat is out of the bag, and we aren’t going back. I prefer to embrace the possibilities and do what we can to prevent the bad stuff while enabling life-saving use cases that make a difference. Cameras everywhere can actually be a good thing if we use them with the right intent, the right architecture, and the right safeguards.&lt;/p&gt;

&lt;p&gt;At Red5, we’ve built &lt;a href="https://www.red5.net/solutions/smart-city-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;streaming infrastructure for smart cities&lt;/a&gt; that supports all of this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Ultra-Low Latency Streaming.&lt;/strong&gt; We deliver sub-250 ms latency and can ingest video at the edge, feed it directly into GPU-powered AI models, and share results instantly, whether live or as on-demand content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cross-Department Sharing&lt;/strong&gt;. Seamlessly share video feeds across traffic management, emergency services, and public safety departments while maintaining secure access controls.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalable City-Wide Deployment&lt;/strong&gt;. Support thousands of cameras across urban environments while maintaining sub-250ms latency for real-time monitoring and response.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automated Alert System&lt;/strong&gt;. AI-powered detection automatically alerts relevant departments to incidents, from road hazards to emergency vehicle deployment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexible Infrastructure Integration&lt;/strong&gt;. Integrate with existing urban infrastructure and camera systems while enabling future expansion and technology adoption.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Smart cities are no longer just about urban infrastructure. With AI, 5G, ultra-low latency streaming, and edge-powered analytics, surveillance systems can now detect issues as they unfold and support faster, safer decision-making. &lt;a href="https://www.linkedin.com/in/thechrisallen/" rel="noopener noreferrer"&gt;Follow me on LinkedIn&lt;/a&gt; to keep up with my latest #ChrisAllenTalks posts and videos. Learn more about how &lt;a href="https://www.red5.net/blog/ai-in-live-streaming/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=Why%20The%20Term%20%E2%80%9CSmart%20City%20Surveillance%E2%80%9D%20Needs%20An%20Update%20In%20The%20Era%20Of%20AI%20And%205G" rel="noopener noreferrer"&gt;AI technology is changing live streaming&lt;/a&gt; in our next blog.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
    <item>
      <title>The Future of News Broadcasting Is Real-Time, Decentralized, and Participatory</title>
      <dc:creator>Maria Artamonova</dc:creator>
      <pubDate>Mon, 02 Mar 2026 14:37:49 +0000</pubDate>
      <link>https://dev.to/red5/the-future-of-news-broadcasting-is-real-time-decentralized-and-participatory-1ip2</link>
      <guid>https://dev.to/red5/the-future-of-news-broadcasting-is-real-time-decentralized-and-participatory-1ip2</guid>
      <description>&lt;p&gt;In this blog, based on &lt;a href="https://www.linkedin.com/feed/update/urn:li:ugcPost:7356014338698027008/" rel="noopener noreferrer"&gt;my recent LinkedIn post&lt;/a&gt;, you’ll learn how the future of news broadcasting is being shaped by real-time video in a world of misinformation, how decentralized and participatory models are redefining coverage, what streaming technologies leading newsrooms are adopting, and how these shifts set new standards for speed, authenticity, and audience engagement.  &lt;/p&gt;

&lt;p&gt;Through &lt;a href="https://www.red5.net/case-studies/red5-cloud-zixi-real-time-monitoring-and-streaming-solution-for-live-event-production/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=The%20Future%20of%20News%20Broadcasting%20Is%20Real-Time%2C%20Decentralized%2C%20and%20Participatory" rel="noopener noreferrer"&gt;our integration&lt;/a&gt;, we can connect directly to those existing streams. This makes it easy to add real-time multi-view layouts and give teams instant access to live feeds, all without requiring a complete overhaul of their current workflow.&lt;/p&gt;

&lt;p&gt;In major newsrooms, producers are using Red5’s &lt;a href="https://www.red5.net/truetime/multiview-for-production/?utm_campaign=19381420-Blog%20repost&amp;amp;utm_source=Referral&amp;amp;utm_medium=Dev.to&amp;amp;utm_term=The%20Future%20of%20News%20Broadcasting%20Is%20Real-Time%2C%20Decentralized%2C%20and%20Participatory" rel="noopener noreferrer"&gt;TrueTime MultiView&lt;/a&gt;™ to build real-time monitoring walls. These setups help teams track multiple breaking stories at once while keeping every feed in perfect sync. It’s a game changer for fast-paced editorial decision-making.&lt;/p&gt;

&lt;p&gt;Field reporting has gotten a boost too. With Red5’s adaptive bitrate streaming, news organizations are keeping clean, stable connections with on-site reporters, even in remote areas or on unreliable networks. Reporters stay connected, editors stay in control, and the audience gets coverage that feels truly live.&lt;/p&gt;

&lt;p&gt;It’s been rewarding to watch these teams move beyond patchwork tools and start building systems that actually support the speed and complexity of today’s news cycle.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;If you’re a product lead or technology decision-maker inside a traditional broadcast news organization, this is your moment. Real-time infrastructure and participatory models are no longer fringe experiments. They are quickly becoming the standard. Audiences expect immediacy, interactivity, and authenticity. If your stack still relies on delayed workflows and closed-loop production, you’re going to lose relevance fast.&lt;/p&gt;

&lt;p&gt;To stay competitive and credible, it’s time to embrace the shift. Real-time video, real audience participation, and real agility in your newsrooms. The future won’t wait.&lt;/p&gt;

</description>
      <category>livestreaming</category>
      <category>software</category>
      <category>learning</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
