<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Syed Rehan</title>
    <description>The latest articles on DEV Community by Syed Rehan (@redmancodes).</description>
    <link>https://dev.to/redmancodes</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1011303%2Fed2e2682-2310-4bd2-8b9c-a534f6af21e3.jpeg</url>
      <title>DEV Community: Syed Rehan</title>
      <link>https://dev.to/redmancodes</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/redmancodes"/>
    <language>en</language>
    <item>
      <title>Build and test automotive protobuf (binary) payload with AWS IoT</title>
      <dc:creator>Syed Rehan</dc:creator>
      <pubDate>Thu, 26 Jan 2023 13:22:53 +0000</pubDate>
      <link>https://dev.to/iotbuilders/build-and-test-automotive-protobuf-payload-with-aws-iot-4ip6</link>
      <guid>https://dev.to/iotbuilders/build-and-test-automotive-protobuf-payload-with-aws-iot-4ip6</guid>
      <description>&lt;p&gt;In this post we will look at how we can utilise binary payload such as Protobuf with AWS IoT, to get started and set the environment up we will use Cloud 9 environment and bootstrap to get started. We will utilise the newly released feature of the AWS IoT Rules engine (Decode function) and see how it can work well with protobuf (binary payload).&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Protobuf
&lt;/h3&gt;

&lt;p&gt;Protocol Buffers (protobuf) is an open-source data format used to serialize structured data in a compact, binary form. It's used for transmitting data over networks or storing it in files. Protobuf allows you to send data in small packet sizes and at a faster rate than other messaging formats.&lt;/p&gt;
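&lt;p&gt;To make the compactness claim concrete, here is a minimal Python sketch (an illustration I wrote for this post, not part of any Protobuf library) of the varint encoding Protobuf uses for integer fields. Small values fit in a single byte, whereas a JSON representation spends bytes on field names and digits:&lt;/p&gt;

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a Protobuf varint:
    7 bits of payload per byte, high bit set on all bytes but the last."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # more bytes follow
        else:
            out.append(byte)
            return bytes(out)

# A battery level of 75 fits in a single byte; 300 needs two.
print(encode_varint(75).hex())   # 4b
print(encode_varint(300).hex())  # ac02
```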

&lt;h3&gt;
  
  
  How does AWS IoT support Protobuf
&lt;/h3&gt;

&lt;p&gt;AWS IoT Core Rules support protobuf by providing the &lt;a href="https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-functions.html#iot-sql-decode-base64"&gt;decode(value, decodingScheme)&lt;/a&gt; SQL function, which allows you to decode protobuf-encoded message payloads to JSON format and route them to downstream services. &lt;/p&gt;

&lt;p&gt;Testing Protobuf with the AWS IoT Core Rules engine (flowchart):&lt;br&gt;
We will use protoc to compile our human-readable payload into a binary Protobuf message and send it to AWS IoT Core. We will also upload a FileDescriptorSet to Amazon S3, as AWS IoT Core Rules use this descriptor set to decode the payload.&lt;/p&gt;
&lt;h4&gt;
  
  
  Let's look at the flow of how it works:
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MmOgBIrX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z3zy7h76h4cor95t17q8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MmOgBIrX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z3zy7h76h4cor95t17q8.png" alt="Image description" width="880" height="506"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Environment setup:&lt;/p&gt;

&lt;p&gt;Let's use the Cloud9 template from &lt;a href="https://catalog.workshops.aws/getstartedwithawsiot/en-US/chapter2-lets-begin/10-step1"&gt;here&lt;/a&gt;. Pick the region closest to your location; I will use Ireland as it's closest to me.&lt;/p&gt;

&lt;p&gt;Once our environment is set up, we will need to install the open-source Protobuf tooling and compiler.&lt;/p&gt;
&lt;h2&gt;
  
  
  Acquire the Protobuf Compiler
&lt;/h2&gt;

&lt;p&gt;We will use the open-source Protobuf compiler (protoc) to build our binary payload. You can download and install the compiler from the GitHub releases page &lt;a href="https://github.com/protocolbuffers/protobuf/releases"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  Create a Protobuf Message Type
&lt;/h2&gt;

&lt;p&gt;Let's create a Protobuf message type file (.proto extension), aptly named automotive.proto, and go through the code to see what is going on:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;syntax = "proto3";

package automotive;

message Automotive {
    uint32 battery_level = 1;
    string battery_health = 2;
    double battery_discharge_rate = 3;
    uint32 wheel_rpm = 4;
    uint32 mileage_left = 5;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Within our 'Automotive' message, we have five fields: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Battery level&lt;/li&gt;
&lt;li&gt;Battery health&lt;/li&gt;
&lt;li&gt;Battery discharge rate&lt;/li&gt;
&lt;li&gt;Wheel RPM&lt;/li&gt;
&lt;li&gt;Mileage left on the battery &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are fields used by EV manufacturers I have worked with in the past: they give up-to-date data to the user, and they let the cloud provide consumers with the latest information on the state of their journey, such as an estimated range based on the discharge rate versus the journey length left.&lt;/p&gt;

&lt;p&gt;Now that we have this created, let's create 'File Descriptor Set' using our Proto file.&lt;/p&gt;
&lt;h2&gt;
  
  
  Create a FileDescriptorSet
&lt;/h2&gt;

&lt;p&gt;We will use the message definition file created earlier (automotive.proto) along with the protoc tooling to create a FileDescriptorSet file.&lt;/p&gt;

&lt;p&gt;Let's use the following command:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;protoc \  
--descriptor_set_out=&amp;lt;OUTPUT_PATH&amp;gt;/automotive_descriptor_set_out.desc \   
--proto_path=&amp;lt;PROTO_PATH&amp;gt; \
--include_imports &amp;lt;PROTO_PATH&amp;gt;/automotive.proto
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In this example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&amp;lt;OUTPUT_PATH&amp;gt; is the location where you’d like the FileDescriptorSet to be saved.&lt;/li&gt;
&lt;li&gt;&amp;lt;PROTO_PATH&amp;gt; is the directory where your automotive.proto file is stored.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Upload FileDescriptorSet to Amazon S3
&lt;/h2&gt;

&lt;p&gt;We will store the FileDescriptorSet in Amazon S3 so the Rules engine can use it to decode payloads. Upload the FileDescriptorSet to an S3 bucket in the account where you’ll be creating the Rule. &lt;/p&gt;
&lt;h3&gt;
  
  
  Attach Policy to Amazon S3 bucket
&lt;/h3&gt;

&lt;p&gt;Attach the following policy to the bucket, so the Rules engine service has permission to read the object and its metadata:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
                "Service": "iot.amazonaws.com"
            },
            "Action": "s3:Get*",
            "Resource": "arn:aws:s3:::&amp;lt;BUCKET_NAME&amp;gt;/&amp;lt;FILENAME&amp;gt;"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In this example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&amp;lt;BUCKET_NAME&amp;gt; is the name of the S3 bucket where the FileDescriptorSet was uploaded.&lt;/li&gt;
&lt;li&gt;&amp;lt;FILENAME&amp;gt; is the object key for the FileDescriptorSet in S3.&lt;/li&gt;
&lt;li&gt;Use the iot.amazonaws.com service principal, as shown in the policy above.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Note: The S3 bucket needs to be created in the same region where testing against rules will occur.&lt;/em&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Create AWS IoT Rule
&lt;/h2&gt;

&lt;p&gt;We will use an AWS IoT Rule to decode the payload and an Amazon S3 action to store the result, as many customers use Amazon S3 either as a data lake or as a conduit to a data lake further downstream.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create an IoT Rule with the following SQL expression:
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT VALUE decode(encode(*, 'base64'), 'proto', '&amp;lt;BUCKET_NAME&amp;gt;', '&amp;lt;FILENAME&amp;gt;', 'test_proto', 'TestProto') FROM '&amp;lt;TOPIC&amp;gt;'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In this example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&amp;lt;BUCKET_NAME&amp;gt; must match the bucket name we created earlier.&lt;/li&gt;
&lt;li&gt;&amp;lt;FILENAME&amp;gt; must match the filename from an earlier step.&lt;/li&gt;
&lt;li&gt;&amp;lt;TOPIC&amp;gt; should be replaced with the MQTT topic where you’ll publish messages into AWS IoT Core.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;
  
  
  Rule action
&lt;/h4&gt;

&lt;p&gt;In order to validate the SQL expression’s result, you’ll need an action associated with the Rule. In this example, I’m using an S3 action, which uploads the result to the same S3 bucket where the FileDescriptorSet is stored. (Using the same bucket for configuration and runtime data is not ideal, but it keeps this manual test simple.) The TopicRule payload (saved in a file called rule.json, in my case) for this setup looks like:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "sql": "SELECT VALUE decode(encode(*, 'base64'), 'proto', 'protobuf-docs-manual-test', 'automotive_descriptor_set_out.desc', 'automotive', 'Automotive') FROM 'test/proto'",
    "ruleDisabled": false,
    "awsIotSqlVersion": "2016-03-23",
    "actions": [
        {
            "s3": {
                "bucketName": "&amp;lt;BUCKET_NAME&amp;gt;",
                "key": "result",
                "roleArn": "&amp;lt;ROLE_ARN&amp;gt;"
            }
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In this example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&amp;lt;BUCKET_NAME&amp;gt; is the name of the S3 bucket where the FileDescriptorSet was uploaded.&lt;/li&gt;
&lt;li&gt;&amp;lt;ROLE_ARN&amp;gt; is a value from your account / configuration.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;
  
  
  Set up AWS IoT Rule
&lt;/h4&gt;

&lt;p&gt;You can set up an AWS IoT Rule in the AWS IoT Core console by navigating to Message Routing &amp;gt; Rules &amp;gt; Create rule.&lt;/p&gt;

&lt;p&gt;Below, I have created the rule with an action invoking the Amazon S3 bucket:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EBAJGFjM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kq09nxr2hfj6beql0sfg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EBAJGFjM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kq09nxr2hfj6beql0sfg.png" alt="Image description" width="880" height="486"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h5&gt;
  
  
  Use AWS CLI if you prefer:
&lt;/h5&gt;

&lt;p&gt;Alternatively, you can create the Rule with the following AWS CLI command:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws --region us-east-1 --endpoint-url https://us-east-1.iot.amazonaws.com iot create-topic-rule --rule-name &amp;lt;RULE_NAME&amp;gt; --topic-rule-payload file://rule.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;If you want to verify the Rule was created with the correct data, you can run the following AWS CLI command:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws --region us-east-1 --endpoint-url https://us-east-1.iot.amazonaws.com iot get-topic-rule --rule-name &amp;lt;RULE_NAME&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Create Binary payload
&lt;/h2&gt;

&lt;p&gt;You can use a programming language of your choice. In this instance, I'm using protoc, which can translate a human-readable message into a binary payload using the FileDescriptorSet created earlier.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create a payload file
&lt;/h3&gt;

&lt;p&gt;Let's create an ASCII text file as below, saving it as payload.txt:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;battery_level: 75,
battery_health: "good",
battery_discharge_rate: 18.00,
wheel_rpm: 4000,
mileage_left: 150
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Use the following command to create a binary version of this:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cat payload.txt | protoc --encode=automotive.Automotive --descriptor_set_in=automotive_descriptor_set_out.desc &amp;gt; binary_payload_auto
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Publish message and verify result
&lt;/h2&gt;

&lt;p&gt;We have completed all the steps needed and have the binary test payload created. The last remaining step is to publish the test payload into AWS IoT Core and validate that the message was decoded into the expected JSON result and stored in the Amazon S3 bucket.&lt;/p&gt;

&lt;p&gt;Replace &amp;lt;ENDPOINT&amp;gt; with the IoT data endpoint and &amp;lt;TOPIC&amp;gt; with the topic applicable to you, then run the following command to publish the test message into AWS IoT Core:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws --region us-east-1 --endpoint-url https://&amp;lt;ENDPOINT&amp;gt; iot-data publish --topic "&amp;lt;TOPIC&amp;gt;" --payload fileb://binary_payload_auto
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now the payload should be decoded and stored in the Amazon S3 bucket (the one we created earlier and set up as part of our AWS IoT Rule action); you can download and review it to verify.&lt;/p&gt;

&lt;p&gt;When the file is downloaded, we should be able to see the decoded payload:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"batteryLevel":75,"batteryHealth":"good","batteryDischargeRate":18.0,"wheelRpm":4000,"mileageLeft":150}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The Amazon S3 bucket should show the newly created file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lZr93s1I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cjedgu9cnxf0u72kom9c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lZr93s1I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cjedgu9cnxf0u72kom9c.png" alt="Image description" width="856" height="367"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;By following these steps, we can build a binary payload, send it automatically from any device connected to AWS IoT, and have the data decoded and stored in Amazon S3.&lt;/p&gt;
&lt;h2&gt;
  
  
  Further Questions?
&lt;/h2&gt;

&lt;p&gt;Reach out to me on &lt;a href="https://twitter.com/SyedCloud"&gt;Twitter&lt;/a&gt; or &lt;a href="https://www.linkedin.com/in/iamsyed/"&gt;LinkedIn&lt;/a&gt;; always happy to help!&lt;/p&gt;


&lt;div class="ltag__user ltag__user__id__1011303"&gt;
    &lt;a href="/redmancodes" class="ltag__user__link profile-image-link"&gt;
      &lt;div class="ltag__user__pic"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Umg1OI4G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://res.cloudinary.com/practicaldev/image/fetch/s--k6hAJU8j--/c_fill%2Cf_auto%2Cfl_progressive%2Ch_150%2Cq_auto%2Cw_150/https://dev-to-uploads.s3.amazonaws.com/uploads/user/profile_image/1011303/ed2e2682-2310-4bd2-8b9c-a534f6af21e3.jpeg" alt="redmancodes image"&gt;
      &lt;/div&gt;
    &lt;/a&gt;
  &lt;div class="ltag__user__content"&gt;
    &lt;h2&gt;
&lt;a class="ltag__user__link" href="/redmancodes"&gt;Syed Rehan&lt;/a&gt;Follow
&lt;/h2&gt;
    &lt;div class="ltag__user__summary"&gt;
      &lt;a class="ltag__user__link" href="/redmancodes"&gt;I work as Developer Advocate (DA) in AWS IoT service team. I have a keen desire to help builders out there one binary at a time solve their daily IoT and security problems. 

&lt;/a&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;



</description>
    </item>
  </channel>
</rss>
