<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Janos Tolgyesi</title>
    <description>The latest articles on DEV Community by Janos Tolgyesi (@mrtj).</description>
    <link>https://dev.to/mrtj</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F500112%2F411c0aff-b4b5-4f6f-b264-36c31d49e730.jpeg</url>
      <title>DEV Community: Janos Tolgyesi</title>
      <link>https://dev.to/mrtj</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mrtj"/>
    <language>en</language>
    <item>
      <title>Remote View your Computer Vision Models Running on AWS Panorama</title>
      <dc:creator>Janos Tolgyesi</dc:creator>
      <pubDate>Mon, 09 May 2022 15:41:22 +0000</pubDate>
      <link>https://dev.to/mrtj/remote-view-your-computer-vision-models-running-on-aws-panorama-2imp</link>
      <guid>https://dev.to/mrtj/remote-view-your-computer-vision-models-running-on-aws-panorama-2imp</guid>
      <description>&lt;p&gt;Real-time smart video analytics application development and edge device deployment is a tricks task. In the last few years, the industry players built platforms to support this activity. Notable examples include &lt;a href="https://www.nvidia.com/en-us/autonomous-machines/intelligent-video-analytics-platform/"&gt;NVIDIA Metropolis&lt;/a&gt;, &lt;a href="https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html"&gt;Intel’s OpenVINO&lt;/a&gt;, and &lt;a href="https://aws.amazon.com/panorama/"&gt;AWS Panorama&lt;/a&gt;. While these solutions make some aspects of the video-analytics application development more straightforward, there are still many issues to deal with before deploying a video analytics application in production. This post introduces Telescope, the first in a &lt;a href="https://github.com/Neosperience/backpack"&gt;series of open-source tools&lt;/a&gt; to make developing AWS Panorama application simpler.&lt;/p&gt;

&lt;p&gt;AWS Panorama is a machine learning appliance and software framework that allows you to deploy video analytics applications on edge. For a thorough introduction and a step-by-step tutorial on deploying a Panorama application, refer to &lt;em&gt;&lt;a href="https://towardsdatascience.com/deploy-an-object-detector-model-at-the-edge-on-aws-panorama-9b80ea1dd03a"&gt;Deploy an Object-Detector Model at the Edge on AWS Panorama&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;The AWS Panorama framework eases many aspects of video analytics application development, including camera connection management, video decoding, frame extraction, model optimization and loading, display output management, and over-the-air deployment of your application. Nevertheless, some tasks are still challenging, for example diagnosing a model that does not work as expected.&lt;/p&gt;

&lt;h2&gt;Introducing Telescope&lt;/h2&gt;

&lt;p&gt;The only available way to get visual feedback on the correct functioning of a Panorama application is to physically connect a display to the HDMI port of the appliance. The display will show the output video stream of a single application deployed on the device. However, physically accessing the appliance is not always feasible. Telescope allows you to re-stream the output video of any Panorama application to an external service, for example, to &lt;a href="https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/what-is-kinesis-video.html"&gt;Amazon Kinesis Video Streams&lt;/a&gt;. This feature can be very convenient for monitoring an application remotely.&lt;/p&gt;

&lt;h2&gt;How does Telescope work?&lt;/h2&gt;

&lt;p&gt;Telescope instantiates a &lt;a href="https://gstreamer.freedesktop.org/documentation/application-development/introduction/basics.html"&gt;GStreamer pipeline&lt;/a&gt; with an &lt;a href="https://gstreamer.freedesktop.org/documentation/app/appsrc.html"&gt;&lt;code&gt;appsrc&lt;/code&gt;&lt;/a&gt; element at the head. An &lt;a href="https://docs.opencv.org/4.5.5/dd/d43/tutorial_py_video_display.html"&gt;OpenCV&lt;/a&gt; &lt;a href="https://docs.opencv.org/4.5.5/dd/d43/tutorial_py_video_display.html"&gt;&lt;code&gt;VideoWriter&lt;/code&gt;&lt;/a&gt; is configured to write to the &lt;code&gt;appsrc&lt;/code&gt;: instead of saving the consecutive frames to a video file, it streams them to the output sink. When opening the &lt;code&gt;VideoWriter&lt;/code&gt; instance, the user should specify the frame width and height and the frame rate of the output stream. You can set these parameters manually or let Telescope infer them from the input dimensions and the frequency at which you feed new frames to it. If you use this auto-configuration feature, some frames (by default 100) will be discarded at the beginning of the streaming because Telescope uses them to calculate frame-rate statistics and measure the frame dimensions. This is the “warmup” of Telescope. If you send frames with different sizes, Telescope will resize the input, but this comes with a performance penalty. You are also expected to send new frames to Telescope at the frequency specified in the frames-per-second parameter. If frames are sent at a different frequency, the video fragments get out of sync in Kinesis Video Streams, and you won’t be able to replay the video smoothly.&lt;/p&gt;
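&lt;p&gt;The warmup idea can be sketched as follows. This is an illustrative snippet, not the actual backpack API: the &lt;code&gt;WarmupProbe&lt;/code&gt; name and its methods are assumptions that only mirror the behavior described above.&lt;/p&gt;

```python
import time

# Illustrative sketch (not the backpack API) of the "warmup" phase:
# observe the first N frames to infer the frame rate and the frame
# dimensions before opening the output stream.
class WarmupProbe:
    def __init__(self, warmup_frames=100):
        self.warmup_frames = warmup_frames
        self.timestamps = []
        self.frame_size = None

    def observe(self, frame, now=None):
        """Record one frame; return True once enough frames were seen."""
        self.timestamps.append(time.monotonic() if now is None else now)
        # OpenCV images are numpy arrays shaped (height, width, channels)
        self.frame_size = (frame.shape[1], frame.shape[0])
        return len(self.timestamps) >= self.warmup_frames

    def estimated_fps(self):
        # Average rate over the whole warmup window
        elapsed = self.timestamps[-1] - self.timestamps[0]
        return (len(self.timestamps) - 1) / elapsed
```

&lt;p&gt;Once warmed up, the inferred frame rate and dimensions could be used to open the &lt;code&gt;VideoWriter&lt;/code&gt; with matching parameters.&lt;/p&gt;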

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7JspUtcw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9ptokgyxahjd4syvwvly.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7JspUtcw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9ptokgyxahjd4syvwvly.jpeg" alt="Cogwheels of a complicated mechanism" width="880" height="586"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@joshredd"&gt;Josh Redd&lt;/a&gt; on &lt;a href="https://unsplash.com/"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Telescope&lt;/code&gt; is an abstract base class that handles the state machine of the GStreamer pipeline. The concrete implementation &lt;code&gt;KVSTelescope&lt;/code&gt; sends frames to the &lt;a href="https://aws.amazon.com/kinesis/video-streams/"&gt;Amazon Kinesis Video Streams&lt;/a&gt; service. It is easy to extend &lt;code&gt;Telescope&lt;/code&gt; to support other services, especially if a GStreamer plugin already exists for that service.&lt;/p&gt;

&lt;h2&gt;Integrating Telescope into a Panorama application&lt;/h2&gt;

&lt;h3&gt;Configuring the Docker container&lt;/h3&gt;

&lt;p&gt;Telescope depends on custom-compiled external libraries. All of these libraries must be compiled and configured correctly in your application’s Docker container to make &lt;code&gt;Telescope&lt;/code&gt; work. They include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;GStreamer 1.0 installed with standard plugins pack, libav, tools, and development libraries;&lt;/li&gt;
&lt;li&gt;OpenCV 4.2.0, compiled with GStreamer support and Python bindings;&lt;/li&gt;
&lt;li&gt;numpy (it is typically installed by the base docker image of your Panorama application).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you want to use &lt;code&gt;KVSTelescope&lt;/code&gt;, the &lt;code&gt;Telescope&lt;/code&gt; implementation that streams the video to Amazon Kinesis Video Streams, you will also need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Amazon Kinesis Video Streams (KVS) Producer SDK compiled with GStreamer plugin support;&lt;/li&gt;
&lt;li&gt;Environment variable &lt;code&gt;GST_PLUGIN_PATH&lt;/code&gt; configured to point to the directory where the compiled binaries of the KVS Producer SDK GStreamer plugin are placed;&lt;/li&gt;
&lt;li&gt;Environment variable &lt;code&gt;LD_LIBRARY_PATH&lt;/code&gt; including the third-party open-source dependencies compiled by KVS Producer SDK;&lt;/li&gt;
&lt;li&gt;boto3 (it is typically installed by the base docker image of your Panorama application).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the examples folder of the Telescope source code repository, a &lt;a href="https://github.com/Neosperience/backpack/blob/master/examples/Dockerfile"&gt;sample Dockerfile&lt;/a&gt; shows how to install these libraries and Telescope correctly in any container. In most cases, you just need to copy the relevant sections from the sample into your application’s Dockerfile. Please note that the first time you build the Docker container, it might take up to one hour to compile all the libraries.&lt;/p&gt;

&lt;h3&gt;Setting up AWS IAM privileges for KVSTelescope&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;KVSTelescope&lt;/code&gt; uses the Amazon Kinesis Video Streams (KVS) Producer library, wrapped in a GStreamer sink element, to send the processed frames to the KVS service. KVS Producer needs AWS credentials. It can use the credentials associated with the Panorama Application Role, but you must explicitly configure it. &lt;code&gt;KVSTelescope&lt;/code&gt; needs privileges to execute the following actions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kinesisvideo:DescribeStream
kinesisvideo:GetDataEndpoint
kinesisvideo:PutMedia
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you want &lt;code&gt;KVSTelescope&lt;/code&gt; to create the Kinesis Video Stream automatically the first time it is used, you should also include the &lt;code&gt;kinesisvideo:CreateStream&lt;/code&gt; action. An example policy allowing &lt;code&gt;KVSTelescope&lt;/code&gt; to write data to Kinesis Video Streams could look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"kinesisvideo:DescribeStream"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"kinesisvideo:CreateStream"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"kinesisvideo:GetDataEndpoint"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"kinesisvideo:PutMedia"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Setting up credentials&lt;/h3&gt;

&lt;p&gt;There are two types of AWS security credentials: static and temporary. Static credentials never expire; in the case of a leak, you must invalidate them manually and reconfigure the applications that use them. For this reason, their use is strongly discouraged in a production environment. Examples of static AWS credentials include an IAM user’s Access Key ID and Secret Access Key pair.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JMUxQbTq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qrtceqvziyxy34tnreq6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JMUxQbTq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qrtceqvziyxy34tnreq6.jpeg" alt="A photo of a keychain" width="880" height="586"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@mangelinka"&gt;Angela Merenkova&lt;/a&gt; on &lt;a href="https://unsplash.com/"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Temporary credentials expire after a predefined period. In case of a leak, they can be used only within their expiration time, typically on the order of several hours. Temporary credentials can be renewed before they expire so their lifespan can be extended, or they can be exchanged for new ones with a later expiration time. This process needs additional coordination from the application using this type of credential. Examples of temporary credentials include the AWS Access Key ID, the Secret Access Key, and the Session Token.&lt;/p&gt;

&lt;p&gt;We offer different options to provide credentials with the &lt;code&gt;KVSCredentialsHandler&lt;/code&gt; subclasses provided in the &lt;code&gt;kvs&lt;/code&gt; module. If you want to use static credentials for testing purposes, &lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html"&gt;create an IAM user&lt;/a&gt; in your AWS account and &lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage-attach-detach.html"&gt;attach a policy&lt;/a&gt; like the one above to the user. Configure this user for programmatic access to AWS resources and get the user’s &lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html"&gt;AWS Access Key and Secret Key pair&lt;/a&gt;. Then create a &lt;code&gt;KVSInlineCredentialsHandler&lt;/code&gt; or &lt;code&gt;KVSEnvironmentCredentialsHandler&lt;/code&gt; instance in the application’s code to pass these credentials to the KVS Producer plugin, either directly in the GStreamer pipeline definition or as environment variables. However, as these credentials do not expire, using this configuration in a production environment is not recommended. Even in a development and testing environment, take appropriate security measures to protect these credentials: never hardcode them in the source code. Instead, use AWS Secrets Manager or a similar service to provide these parameters to your application.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;KVSTelescope&lt;/code&gt; can also use the &lt;a href="https://docs.aws.amazon.com/panorama/latest/dev/permissions-application.html"&gt;Panorama Application Role&lt;/a&gt; to pass the application's credentials to KVS Producer. These credentials are temporary, meaning that they expire within a couple of hours and must be renewed before expiration. The Producer library expects temporary credentials in a text file. &lt;code&gt;KVSFileCredentialsHandler&lt;/code&gt; manages the renewal of the credentials and periodically updates the text file with the new credentials. If you want to use this method, attach a policy similar to the example above to your Application Role. Always test that your Panorama application's KVS integration still works after &lt;code&gt;KVSFileCredentialsHandler&lt;/code&gt; has refreshed the credentials: let your application run for several hours and periodically check that it continues to stream the video to KVS. You will also find diagnostic information in your application's CloudWatch logs about when the credentials were renewed.&lt;/p&gt;
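&lt;p&gt;To illustrate what a file-based handler has to do, here is a minimal sketch. The credential file layout and the helper names shown here are assumptions for illustration; refer to the backpack source for the actual implementation.&lt;/p&gt;

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch of a file-based credentials handler. The single-line
# file layout ("CREDENTIALS access-key expiration secret-key session-token")
# and the function names are assumptions, not the backpack API.
def write_credentials_file(path, access_key, secret_key, session_token, expiration):
    line = "CREDENTIALS {} {} {} {}".format(
        access_key, expiration.isoformat(), secret_key, session_token)
    with open(path, "w") as f:
        f.write(line)

def needs_refresh(expiration, margin=timedelta(minutes=5)):
    # Renew well before the real expiry so KVS Producer never reads
    # stale credentials from the file.
    return datetime.now(timezone.utc) + margin >= expiration
```

&lt;p&gt;A handler built along these lines would periodically call &lt;code&gt;needs_refresh&lt;/code&gt; and rewrite the file with fresh temporary credentials before the old ones expire.&lt;/p&gt;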

&lt;h3&gt;Environment variables&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;KVSTelescope&lt;/code&gt; needs two environment variables so that GStreamer can find the KVS Producer plugin: &lt;code&gt;GST_PLUGIN_PATH&lt;/code&gt; and &lt;code&gt;LD_LIBRARY_PATH&lt;/code&gt;. They point to the folder of the KVS Producer GStreamer plugin and to its third-party dependencies, respectively. In the sample Dockerfile provided, the correct values of these variables are written to a small configuration file named &lt;code&gt;/panorama/.env&lt;/code&gt; in the container. You should pass the path of this file to &lt;code&gt;KVSTelescope&lt;/code&gt; or otherwise ensure that these environment variables contain the correct values.&lt;/p&gt;
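&lt;p&gt;Loading such a dotenv-style file boils down to a few lines of Python. This is a hedged sketch; the &lt;code&gt;load_env_file&lt;/code&gt; helper is illustrative and not part of the backpack API:&lt;/p&gt;

```python
import os

# Minimal sketch of loading a dotenv-style file such as /panorama/.env
# (written by the sample Dockerfile) so that GST_PLUGIN_PATH and
# LD_LIBRARY_PATH become visible to GStreamer.
def load_env_file(path):
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            # Skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, sep, value = line.partition("=")
            if sep:
                os.environ[key.strip()] = value.strip()
```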

&lt;h2&gt;Usage example&lt;/h2&gt;

&lt;p&gt;When initializing a &lt;code&gt;KVSTelescope&lt;/code&gt; instance, you should pass the AWS region name where your stream is created, the name of the stream, and a credentials handler instance. If you want to configure the frame rate and the dimensions of the frames manually, you should also set them here. When both parameters are specified, Telescope skips the warmup period and sends the first frame directly to KVS. When you are ready to send frames, call the &lt;code&gt;start_streaming&lt;/code&gt; method, which opens the GStreamer pipeline. After this method is called, you are expected to send new frames to the stream by calling the &lt;code&gt;put&lt;/code&gt; method periodically, at the frequency specified in the frame rate or inferred by &lt;code&gt;KVSTelescope&lt;/code&gt;. You can stop and restart streaming on the same &lt;code&gt;KVSTelescope&lt;/code&gt; instance.&lt;/p&gt;

&lt;p&gt;The following example uses the temporary credentials of the IAM Application Role assumed by your Panorama application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;panoramasdk&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;backpack.kvs&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;KVSTelescope&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;KVSFileCredentialsHandler&lt;/span&gt;

&lt;span class="c1"&gt;# You might want to read these values from 
# Panorama application parameters
&lt;/span&gt;&lt;span class="n"&gt;stream_region&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'us-east-1'&lt;/span&gt;
&lt;span class="n"&gt;stream_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'panorama-video'&lt;/span&gt;

&lt;span class="c1"&gt;# The example Dockerfile writes static 
# configuration variables to this file
# If you change the .env file path in the 
# Dockerfile, you should change it also here
&lt;/span&gt;&lt;span class="n"&gt;DOTENV_PATH&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'/panorama/.env'&lt;/span&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Application&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;panoramasdk&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;node&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nb"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="n"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="c1"&gt;# ...
&lt;/span&gt;        &lt;span class="n"&gt;credentials_handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;KVSFileCredentialsHandler&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;telescope&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;KVSTelescope&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;stream_region&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;stream_region&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;stream_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;stream_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;credentials_handler&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;credentials_handler&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;dotenv_path&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;DOTENV_PATH&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;# This call opens the streaming pipeline:
&lt;/span&gt;        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;telescope&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;start_streaming&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# called from video processing loop:
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;process_streams&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;streams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;inputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;video_in&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stream&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;streams&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;

            &lt;span class="c1"&gt;# Process the stream, for example with:
&lt;/span&gt;            &lt;span class="c1"&gt;# self.process_media(stream)
&lt;/span&gt;
            &lt;span class="c1"&gt;# TODO: eventually multiplex streams 
&lt;/span&gt;            &lt;span class="c1"&gt;# to a single frame
&lt;/span&gt;            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;telescope&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If everything works well, you can watch the restreamed video on the &lt;a href="https://console.aws.amazon.com/kinesisvideo/home"&gt;Kinesis Video Streams page&lt;/a&gt; of the AWS console. Of course, you can modify the image before sending it to Telescope, for example by drawing annotations on it based on the inference results of the deep learning model.&lt;/p&gt;

&lt;h2&gt;Annotations&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Annotations&lt;/em&gt; and &lt;em&gt;annotation drivers&lt;/em&gt; provide a unified way to draw annotations on different rendering backends. Currently, two annotation drivers are implemented:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;PanoramaMediaAnnotationDriver&lt;/code&gt; allows you to draw on a &lt;code&gt;panoramasdk.media&lt;/code&gt; object, and&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;OpenCVImageAnnotationDriver&lt;/code&gt; allows you to draw on an OpenCV image (numpy array) object.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--v_J1Qxhz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vkx808nr4cpf2pfceoky.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--v_J1Qxhz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vkx808nr4cpf2pfceoky.jpeg" alt="A photo of colorful pencils" width="880" height="615"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@lucasgwendt"&gt;Lucas George Wendt&lt;/a&gt; on &lt;a href="https://unsplash.com/"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The library can draw two types of annotations: labels and boxes.&lt;/p&gt;

&lt;h3&gt;Using annotations&lt;/h3&gt;

&lt;p&gt;Depending on the available backends, you can create one or more annotation driver instances at the beginning of the video frame processing loop. While processing a single frame, you are expected to collect all annotations to be drawn on the frame in a Python collection (for example, a &lt;code&gt;list&lt;/code&gt;). When the processing is finished, call the &lt;code&gt;render&lt;/code&gt; method on any number of drivers, passing the same collection of annotations. All coordinates used in annotations are normalized to the &lt;code&gt;[0; 1)&lt;/code&gt; range.&lt;/p&gt;
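&lt;p&gt;As an illustration of the normalized coordinate system, a driver drawing on a concrete frame has to scale each point to pixel coordinates. The &lt;code&gt;to_pixel&lt;/code&gt; helper below is illustrative, not part of the backpack API:&lt;/p&gt;

```python
# Map a normalized annotation point (both coordinates in [0; 1)) to
# integer pixel coordinates for a frame of the given size, as an
# OpenCV-backed driver must do before drawing.
def to_pixel(point, frame_width, frame_height):
    x, y = point
    return (int(x * frame_width), int(y * frame_height))
```

&lt;p&gt;Normalizing coordinates this way lets the same annotation collection be rendered on backends with different output resolutions.&lt;/p&gt;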

&lt;p&gt;Example usage:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;panoramasdk&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;backpack.annotation&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;Point&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;LabelAnnotation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;RectAnnotation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TimestampAnnotation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;OpenCVImageAnnotationDriver&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="n"&gt;PanoramaMediaAnnotationDriver&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Application&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;panoramasdk&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;node&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nb"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="n"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="c1"&gt;# self.telescope = ... 
&lt;/span&gt;        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;panorama_driver&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;PanoramaMediaAnnotationDriver&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cv2_driver&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;OpenCVImageAnnotationDriver&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# called from video processing loop:
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;process_streams&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;streams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;inputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;video_in&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stream&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;streams&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;annotations&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
                &lt;span class="n"&gt;TimestampAnnotation&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
                &lt;span class="n"&gt;RectAnnotation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="n"&gt;point1&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;Point&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; 
                    &lt;span class="n"&gt;point2&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;Point&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.9&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.9&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="p"&gt;),&lt;/span&gt;
                &lt;span class="n"&gt;LabelAnnotation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="n"&gt;point&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;Point&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; 
                    &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;'Hello World!'&lt;/span&gt;
                &lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="p"&gt;]&lt;/span&gt;
            &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;panorama_driver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;annotations&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="c1"&gt;# TODO: eventually multiplex streams to a 
&lt;/span&gt;            &lt;span class="c1"&gt;# single frame
&lt;/span&gt;            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cv2_driver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;render&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="n"&gt;annotations&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
                    &lt;span class="n"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;
                &lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="c1"&gt;# self.telescope.put(stream.image)
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Postface
&lt;/h2&gt;

&lt;p&gt;Even if Telescope can be a helpful tool, its usage raises two concerns that you should consider carefully. We discourage using Telescope in a production environment: it is a development aid and a debugging tool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PEY6nda4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vtzdgtuep77n7dxenbw0.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PEY6nda4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vtzdgtuep77n7dxenbw0.jpeg" alt="A photo of a mountain path with a table saying danger" width="880" height="495"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@greg_rosenke"&gt;Greg Rosenke&lt;/a&gt; on &lt;a href="https://unsplash.com/"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first concern is technical in nature. Currently, the application code in a Panorama app does not have direct access to the onboard GPU; thus, all video encoding done by Telescope runs on the device’s CPU. This can slow down the appliance: streaming a single output stream with Telescope could require anywhere between 10% and 30% of the device’s CPU capacity.&lt;/p&gt;

&lt;p&gt;The second concern is related to data protection. The Panorama appliance is designed to protect the video streams being processed. It has two Ethernet interfaces to separate the network of the video cameras (typically a closed-circuit local area network) from the device’s Internet access. With Telescope, however, you relay the video stream from the protected, closed-circuit camera network to the public Internet. You should carefully examine the data protection requirements of your application and the camera network before using Telescope. Moreover, you are responsible for keeping private all AWS credentials used by Telescope, as required by the &lt;a href="https://aws.amazon.com/compliance/shared-responsibility-model/"&gt;AWS Shared Responsibility Model&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where to go from here?
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;Telescope&lt;/code&gt; is part of &lt;code&gt;Backpack&lt;/code&gt;, a broader set of tools that aims to simplify software development on AWS Panorama. The other components will be presented in future posts, so stay tuned and follow us for updates.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Backpack&lt;/code&gt; is &lt;a href="https://github.com/Neosperience/backpack"&gt;open source and available on GitHub&lt;/a&gt;. You can build and run the &lt;a href="https://github.com/Neosperience/backpack/blob/master/examples/Dockerfile"&gt;example docker container&lt;/a&gt; on an ARM64-based system. For example, suppose you’ve already set up the &lt;a href="https://github.com/aws-samples/aws-panorama-samples/blob/main/docs/EnvironmentSetup.md"&gt;Panorama Test Utility&lt;/a&gt; on a t4g EC2 instance. In this case, you can build the container and launch a shell in it with the following commands on the instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;git clone https://github.com/Neosperience/backpack.git
&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;backpack/examples
&lt;span class="nv"&gt;$ &lt;/span&gt;docker build &lt;span class="nb"&gt;.&lt;/span&gt; &lt;span class="nt"&gt;-t&lt;/span&gt; my_backpack_example:1.0
&lt;span class="nv"&gt;$ &lt;/span&gt;docker run &lt;span class="nt"&gt;-it&lt;/span&gt; my_backpack_example:1.0 bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The container can also be built and run on a MacBook Pro with the M1 ARM-based chip.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Backpack&lt;/code&gt; library is extensively documented with docstrings. You can read the &lt;a href="https://s3.eu-west-1.amazonaws.com/github-ci.experiments.neosperience.com/Neosperience/backpack/docs/index.html"&gt;detailed API docs online&lt;/a&gt; or browse them with the &lt;code&gt;python3&lt;/code&gt; interpreter in the container shell:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;root@my_backpack_example:/# python3
&lt;span class="o"&gt;&amp;gt;&amp;gt;&amp;gt;&lt;/span&gt; import backpack.telescope, backpack.kvs, backpack.annotation
&lt;span class="o"&gt;&amp;gt;&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;help&lt;/span&gt;&lt;span class="o"&gt;(&lt;/span&gt;backpack.telescope&lt;span class="o"&gt;)&lt;/span&gt;
&lt;span class="o"&gt;&amp;gt;&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;help&lt;/span&gt;&lt;span class="o"&gt;(&lt;/span&gt;backpack.kvs&lt;span class="o"&gt;)&lt;/span&gt;
&lt;span class="o"&gt;&amp;gt;&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;help&lt;/span&gt;&lt;span class="o"&gt;(&lt;/span&gt;backpack.annotation&lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also try the other &lt;code&gt;Telescope&lt;/code&gt; implementations found in Backpack. For example, &lt;code&gt;RTSPTelescope&lt;/code&gt; allows you to host an RTSP server right in the application container and connect directly to your Panorama appliance with an RTSP client to monitor your application.&lt;/p&gt;
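&lt;p&gt;As a quick sketch of how you might consume such a stream (the port number and stream path below are illustrative placeholders; the actual values depend on how you configure &lt;code&gt;RTSPTelescope&lt;/code&gt; in your application), you could point any generic RTSP client, for example &lt;code&gt;ffplay&lt;/code&gt; from FFmpeg, at the appliance’s address on the camera network:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;ffplay rtsp://DEVICE_IP:8554/my-stream
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;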

&lt;p&gt;Let me know how you use &lt;code&gt;Telescope&lt;/code&gt; and which new features you would like to see, and stay tuned for future posts about the other components of &lt;code&gt;Backpack&lt;/code&gt;.&lt;/p&gt;



&lt;h2&gt;
  
  
  About the author
&lt;/h2&gt;

&lt;p&gt;Janos Tolgyesi is an AWS Community Builder working as a Machine Learning Solution Architect at Neosperience. He has worked with ML technologies for five years and with AWS infrastructure for eight years. He loves building things, be it a &lt;a href="https://www.neosperience.com/solutions/people-analytics/"&gt;video analytics application on the edge&lt;/a&gt; or a &lt;a href="https://www.neosperience.com/solutions/user-insight/"&gt;user profiler based on clickstream events&lt;/a&gt;. You can find him here on &lt;a href="https://dev.to/mrtj"&gt;dev.to&lt;/a&gt;, on &lt;a href="https://twitter.com/jtolgyesi"&gt;Twitter&lt;/a&gt;, &lt;a href="https://towardsdatascience.com/@janos.tolgyesi"&gt;Medium&lt;/a&gt;, and &lt;a href="http://linkedin.com/in/janostolgyesi"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The open-source Backpack project was supported by &lt;a href="https://www.neosperience.com/"&gt;Neosperience&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I would like to express my special thanks to &lt;a href="https://twitter.com/bianchiluca"&gt;Luca Bianchi&lt;/a&gt; for proofreading this article.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>machinelearning</category>
      <category>python</category>
      <category>computervision</category>
    </item>
  </channel>
</rss>
