Jonas Birmé for Eyevinn Video Dev-Team Blog

How to set up a lab environment for WebRTC broadcast streaming

In this blog post we are going to look at our lab environment for WebRTC-based broadcast streaming and how you can set up one of your own. In this scenario we are talking about one-to-many streaming based on WebRTC standards and the HTTP-based signaling protocols WHIP (WebRTC HTTP Ingest Protocol) and WHPP (WebRTC HTTP Playback Protocol).
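To give a feel for what HTTP-based signaling means in practice: in WHIP, the whole ingest handshake is a single HTTP POST where the client sends an SDP offer and receives the SDP answer in the response. A minimal sketch with curl (the endpoint URL matches the WHIP service used later in this post; offer.sdp is a placeholder for a locally generated SDP offer):

$ curl -i -X POST \
    -H "Content-Type: application/sdp" \
    --data-binary @offer.sdp \
    "https://<external-whip-endpoint>/api/v1/whip/sfu-broadcaster?channelId=test"

A successful response is a 201 Created with the SDP answer in the body and a Location header pointing to the created resource.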

The setup is based on open-source tools and components we have developed while building a proof-of-concept of how a WebRTC-based broadcast streaming pipeline can be divided into ingest, distribution and playback with standardized interfaces in between. The setup is built using the tools and components described below.

Our current lab setup consists of two pipelines: one using an internal Node.js WebRTC server based on libwebrtc bindings for Node.js (A), and one where an external WebRTC media server is used (B).

[Figure: Eyevinn WebRTC lab setup]

The source is a cheap camera producing an RTSP video stream, which ffmpeg repackages into two MPEG-TS streams that are pushed over unicast UDP to two separate WHIP clients. One WHIP client pushes video to pipeline A and the other pushes to pipeline B.
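As a sketch of that source step (the camera URL, hostnames and port are placeholders for your own environment), a single ffmpeg process can repackage the RTSP stream into two MPEG-TS outputs without transcoding:

$ ffmpeg -i rtsp://<camera-address>/stream \
    -map 0 -c copy -f mpegts udp://<whip-client-a-host>:9998 \
    -map 0 -c copy -f mpegts udp://<whip-client-b-host>:9998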

Pipeline A (internal WebRTC server)

In this pipeline, the SRTP streams carrying the media are transported through the WHIP and WHPP endpoint services, built on Node.js and libwebrtc bindings. This is not the most performant solution, but it enabled rapid prototyping and was a good starting point for the proof-of-concept. Also, since it is not an SFU, media is processed separately for each connected viewer, a setup that will not scale for production. For that we have built pipeline B.

Pipeline B (external WebRTC server)

Here we are using an external WebRTC media server and SFU that handles the SRTP streams and the actual media transport. The WHIP and WHPP endpoint services only handle the HTTP-based signaling protocols used to establish the ingest connection and a playback connection for each viewer. Currently the WHIP and WHPP endpoint services are tightly coupled (they share code), but we are working on splitting them into two services with an HTTP-based interface between them. The interface to control the WebRTC media server is in this case also HTTP-based. While the ingest and playback interfaces aim to be standardized, the communication between the services and the media server can be specific to the implementation.

Shared by both pipelines is an NGINX server that handles HTTPS termination and routing to the respective pipeline's WHIP and WHPP endpoints. This also allows the WHPP endpoints to be horizontally scaled, for both redundancy and load, although in the current setup this has not yet been done.

Set up your own pipeline

Now that we have described our current lab setup, let us jump into how you can use our tools and libraries to set up one of your own. In this tutorial we will describe the necessary steps for a pipeline using an external WebRTC media server.

WHIP service, WHPP service and SFU

Clone the GitHub project containing the WHIP and WHPP services:

$ git clone git@github.com:Eyevinn/whip.git

Build the WHIP and WHPP services:

$ cd whip/
$ npm install
$ npm run build

Build and run the SFU:

$ cd tools/
$ ./start_local_sfu.sh

Then start the WHIP and WHPP endpoint services:

$ BROADCAST_HOSTNAME=<external-whpp-endpoint> \
  BROADCAST_USE_HTTPS=true \
  BROADCAST_EXT_PORT=443 \
  WHIP_ENDPOINT_HOSTNAME=<external-whip-endpoint> \
  WHIP_ENDPOINT_USE_HTTPS=true \
  EXT_PORT=443 \
  USE_SFU=true \
  npm start

Then configure and run the NGINX server to route traffic upstream to port 8000 for the WHIP service and port 8001 for the WHPP (broadcast) service. For example:

upstream whip {
  server 127.0.0.1:8000;
}

upstream whpp {
  server 127.0.0.1:8001;
}

map $http_host $name {
  <external-whip-endpoint>:443 "whip";
  <external-whpp-endpoint>:443 "whpp";
}

server {
  listen 443 ssl;
  ssl_certificate <path/to/fullchain.pem>;
  ssl_certificate_key <path/to/privkey.pem>;

  location / {
    proxy_pass http://$name;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_redirect off;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-NginX-Proxy true;
    proxy_connect_timeout 600;
    proxy_send_timeout 600;
    proxy_read_timeout 600;
    send_timeout 600;
  }
}
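Before pointing any clients at the proxy, it can be worth a quick smoke test of the routing. A minimal check with curl (the channel ID test is just a placeholder; any HTTP response from the upstream service, even a 404 for a non-existing channel, confirms that NGINX is forwarding correctly):

$ curl -i https://<external-whpp-endpoint>/broadcaster/channel/test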

WHIP client

Clone the GitHub project containing our GStreamer-based WHIP client:

$ git clone git@github.com:Eyevinn/whip-mpegts.git

Build the tool according to the instructions in the README file, then start it with the following command:

$ ./mpeg-ts-client -a 127.0.0.1 -p 9998 -u "http://127.0.0.1:8000/api/v1/whip/sfu-broadcaster?channelId=<channelId>"

To push an MPEG-TS stream to this client, send it over UDP unicast to port 9998.
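If you do not have a camera at hand, a synthetic test stream works just as well for verifying the pipeline. A sketch using ffmpeg's built-in test sources (resolution, frame rate and bitrates are arbitrary choices):

$ ffmpeg -re \
    -f lavfi -i testsrc=size=1280x720:rate=30 \
    -f lavfi -i sine=frequency=440:sample_rate=48000 \
    -c:v libx264 -preset veryfast -tune zerolatency \
    -c:a aac -b:a 128k \
    -f mpegts udp://127.0.0.1:9998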

WHPP player

To play back the stream you can use the WHPP-compatible WebRTC player available at webrtc.player.eyevinn.technology, or any other WHPP-capable player out there. The address to your stream would be: https://<external-whpp-endpoint>/broadcaster/channel/<channelId>

And that's it! Now you have your own WebRTC-based pipeline for broadcast streaming in your lab. If you wish to add support for another WebRTC media server, we would be super happy to receive a contribution, or if you want us to do it, just drop an email to sales@eyevinn.se.

Regarding WHEP vs WHPP: when we started looking into dividing a WebRTC-based broadcast streaming (one-to-many) pipeline into ingest, distribution and playback, we quickly identified that no standard protocol was available on the playback side. To fill this gap, we developed and proposed a standard we called the WebRTC HTTP Playback Protocol (WHPP). We made it available for comments this spring and presented it at Streaming Tech Sweden 2022 in the beginning of June.

As some of you might have seen, a standard called the WebRTC HTTP Egress Protocol (WHEP) was drafted and made available in the beginning of August, which confirms our belief that there is a need to fill this gap.

From Eyevinn's perspective the name of this standard is not important, and we have provided the WHEP authors with feedback based on our experiences from our proof-of-concepts and WHPP. Whether it ends up being WHEP or WHPP, we will continue our work on our proof-of-concepts and keep contributing knowledge and open-source tools in this area back to the community.
