<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kohei (Max) MATSUSHITA</title>
    <description>The latest articles on DEV Community by Kohei (Max) MATSUSHITA (@ma2shita).</description>
    <link>https://dev.to/ma2shita</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F411354%2Fd2d02364-8430-409d-9040-5ebb513d2e44.jpg</url>
      <title>DEV Community: Kohei (Max) MATSUSHITA</title>
      <link>https://dev.to/ma2shita</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ma2shita"/>
    <language>en</language>
    <item>
      <title>How to set up cross-account access to Amazon Kinesis Data Streams</title>
      <dc:creator>Kohei (Max) MATSUSHITA</dc:creator>
      <pubDate>Thu, 29 Feb 2024 11:24:11 +0000</pubDate>
      <link>https://dev.to/aws-heroes/how-to-set-up-cross-account-access-to-amazon-kinesis-data-streams-4io0</link>
      <guid>https://dev.to/aws-heroes/how-to-set-up-cross-account-access-to-amazon-kinesis-data-streams-4io0</guid>
      <description>&lt;h2&gt;
  
  
  Overview - Amazon Kinesis Data Streams and separated AWS accounts
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/kinesis/data-streams/"&gt;Amazon Kinesis Data Streams&lt;/a&gt; (hereafter referred to as KDS) is a managed data processing service designed for the real-time collection of high traffic of data and facilitating its transfer to subsequent AWS services. It is particularly suited for handling streaming data, such as logs, where order matters, making it a commonly used service for IoT data collection. For example, it can be specified as the data export destination for &lt;a href="https://aws.amazon.com/monitron/"&gt;Amazon Monitron&lt;/a&gt;, which allows for predictive maintenance of industrial equipment through machine learning.&lt;/p&gt;

&lt;p&gt;KDS serves as an intermediary between data producers and data consumers, as shown in the architecture below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fndnh9jtivsaz91bwdhrv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fndnh9jtivsaz91bwdhrv.png" alt="Typical Architecture using Amazon Kinesis Data Streams" width="696" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When fully leveraging AWS Cloud, it's not uncommon to operate separate AWS accounts for purposes of distinction and cost management. In such cases, there might be a need to process streaming data collected by KDS in another AWS account.&lt;/p&gt;

&lt;p&gt;This blog introduces the procedure for sharing a data stream with another AWS account and referencing it from AWS Lambda (specifying the Lambda function as a trigger) using &lt;a href="https://docs.aws.amazon.com/streams/latest/dev/controlling-access.html"&gt;the resource-based policy of Amazon Kinesis Data Streams&lt;/a&gt; (&lt;a href="https://aws.amazon.com/jp/about-aws/whats-new/2023/11/amazon-kinesis-data-streams-cross-account-access-aws-lambda/"&gt;update as of November 2023&lt;/a&gt;). A data stream, in this context, is akin to a pipeline through which data flows.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture and Setup
&lt;/h2&gt;

&lt;p&gt;The architecture is as follows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseudfjcmr2nmcs71yylq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseudfjcmr2nmcs71yylq.png" alt="The architecture for cross-account access" width="788" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Points for Setup
&lt;/h3&gt;

&lt;p&gt;Here are the key points for setup:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The execution role of the Lambda function on the data processing side (Account B) requires the &lt;code&gt;AWSLambdaKinesisExecutionRole&lt;/code&gt; policy.&lt;/li&gt;
&lt;li&gt;For the KDS data stream on the data producer side (Account A), the resource-based policy should specify the ARN of the Lambda function's execution role on the data processing side as the Principal, and allow the Actions &lt;code&gt;kinesis:DescribeStream&lt;/code&gt;, &lt;code&gt;kinesis:DescribeStreamSummary&lt;/code&gt;, &lt;code&gt;kinesis:GetRecords&lt;/code&gt;, &lt;code&gt;kinesis:GetShardIterator&lt;/code&gt;, and &lt;code&gt;kinesis:ListShards&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
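&lt;p&gt;Putting these together, the resource-based policy on the Account A data stream ends up looking like the following JSON (a sketch using the example account IDs, region, and role name that appear in the verification log later in this article; substitute your own values):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::888800008888:role/service-role/kds-reader1-role-q6zcv9kq"
            },
            "Action": [
                "kinesis:DescribeStream",
                "kinesis:DescribeStreamSummary",
                "kinesis:GetRecords",
                "kinesis:GetShardIterator",
                "kinesis:ListShards"
            ],
            "Resource": "arn:aws:kinesis:us-east-1:999900009999:stream/kds-sharing-example1"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;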

&lt;p&gt;Regarding the second point in particular, note that &lt;code&gt;kinesis:DescribeStream&lt;/code&gt; is easy to miss when using the dialog in the management console: it must be added manually with the JSON editor (as observed in February 2024).&lt;/p&gt;

&lt;p&gt;The following official documents might also be helpful:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/streams/latest/dev/controlling-access.html#sharing-data-streams"&gt;Sharing your data stream with another account&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/streams/latest/dev/resource-based-policy-examples.html#Resource-based-policy-examples-lambda"&gt;Sharing access with cross-account AWS Lambda functions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Steps
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The steps alternate between the two accounts: data producer side (Account A) → data processing side (Account B) → data producer side → data processing side. Make sure not to mix up which account you are working in.&lt;/li&gt;
&lt;li&gt;Everything must be in the same region; sharing is not possible across different regions (e.g., if the data stream is in us-west-2 and the Lambda function is in ap-northeast-1). If you wish to send data to a different region, consider the architecture using Amazon EventBridge mentioned in the epilogue.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Step 1) On the data producer side - Account A
&lt;/h4&gt;

&lt;h5&gt;
  
  
  1-1. Create a data stream in &lt;a href="https://console.aws.amazon.com/kinesis"&gt;Amazon Kinesis&lt;/a&gt; (e.g., &lt;code&gt;kds-sharing-example1&lt;/code&gt;)
&lt;/h5&gt;

&lt;p&gt;See here for the creation method (&lt;a href="https://docs.aws.amazon.com/streams/latest/dev/tutorial-stock-data-kplkcl-create-stream.html"&gt;Step1: Create Data Stream&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgevvlsv6lfzub2096ryl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgevvlsv6lfzub2096ryl.png" alt="Create Data Stream" width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For testing or small data volumes, the "Provisioned" capacity mode with "1" provisioned shard is sufficient.&lt;/p&gt;
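&lt;p&gt;If you prefer the AWS CLI, an equivalent creation command (a sketch assuming the same stream name and one provisioned shard) is:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws kinesis create-stream \
  --stream-name kds-sharing-example1 \
  --shard-count 1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;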

&lt;p&gt;NOTE: While this guide assumes the creation of a new data stream, an existing data stream can also be repurposed.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 2) On the data processing side - Account B
&lt;/h4&gt;

&lt;h5&gt;
  
  
  2-1. Create a Lambda function in &lt;a href="https://console.aws.amazon.com/lambda/home#/functions"&gt;AWS Lambda&lt;/a&gt; (e.g., &lt;code&gt;kds-reader1&lt;/code&gt;).
&lt;/h5&gt;

&lt;p&gt;See here for the creation method (&lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/getting-started.html#getting-started-create-function"&gt;Create a Lambda function with the console&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1w5xyu8lez6u5jx2eld.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1w5xyu8lez6u5jx2eld.png" alt="Create a Lambda function" width="593" height="649"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The source code is as follows (Python 3). It simply emits the &lt;code&gt;event&lt;/code&gt; to Amazon CloudWatch Logs, which is sufficient for operational verification. After modifying the source code as below, click "Deploy" to deploy.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;NOTE: A test event template containing Amazon Kinesis Data Streams sample data is available and can be used to test the function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjsqyvp8qmnet3m6shi61.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjsqyvp8qmnet3m6shi61.png" alt="Test data for Lambda function" width="334" height="227"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h5&gt;
  
  
  2-2. Attach the &lt;code&gt;AWSLambdaKinesisExecutionRole&lt;/code&gt; policy to the execution role of &lt;code&gt;kds-reader1&lt;/code&gt;.
&lt;/h5&gt;

&lt;p&gt;After viewing the details of &lt;code&gt;kds-reader1&lt;/code&gt;, go to "Configuration" &amp;gt; "Permissions" and click the role name assigned as the execution role to view the role's settings (in the figure below, click &lt;code&gt;kds-reader1-role-mp67l1v2&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmofjpf8qmw0qw0b1ydyn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmofjpf8qmw0qw0b1ydyn.png" alt="Cofiguration &amp;gt; Permissions on Lambda Function" width="598" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to "Add Permission" &amp;gt; "Attach Policy" for the permission policy to display the list of policies to attach.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn262g551rv7h6etgjxvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn262g551rv7h6etgjxvy.png" alt="Attach Policy on AWS IAM role" width="800" height="177"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select &lt;code&gt;AWSLambdaKinesisExecutionRole&lt;/code&gt; from the list of "Other Permission Policies" and then click "Add Permission".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrafoa1di02iqc15bx04.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrafoa1di02iqc15bx04.png" alt="AWSLambdaKinesisExecutionRole" width="800" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Policy addition is complete.&lt;br&gt;
Check that &lt;code&gt;AWSLambdaKinesisExecutionRole&lt;/code&gt; has been added to the list of attached policies as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgleumsy050qi3e6ao3oa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgleumsy050qi3e6ao3oa.png" alt="list of allowed policies" width="656" height="302"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;NOTE: We assume that the role for executing Lambda functions is an IAM role that is automatically created and appended when a Lambda function is created. Existing IAM roles can also be used.&lt;/p&gt;
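<p>The same attachment can also be sketched with the AWS CLI (the role name here is the auto-generated one from this article's example; use the one from your environment):</p>

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws iam attach-role-policy \
  --role-name kds-reader1-role-q6zcv9kq \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaKinesisExecutionRole
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;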
&lt;h5&gt;
  
  
  2-3. Note the ARN of the IAM role.
&lt;/h5&gt;

&lt;p&gt;The ARN of this IAM role (in this example, the role named &lt;code&gt;kds-reader1-role-q6zcv9kq&lt;/code&gt;) will be used in the next step.&lt;/p&gt;
&lt;h4&gt;
  
  
  Step 3) Back on the data producer side - Account A
&lt;/h4&gt;
&lt;h5&gt;
  
  
  3-1. Configure the resource-based policy for &lt;code&gt;kds-sharing-example1&lt;/code&gt; in &lt;a href="https://console.aws.amazon.com/kinesis"&gt;Amazon Kinesis&lt;/a&gt;
&lt;/h5&gt;

&lt;p&gt;After viewing the details of &lt;code&gt;kds-sharing-example1&lt;/code&gt;, go to "Data stream sharing" &amp;gt; "Create Policy".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa4ekgpnn2kgj2qpou99w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa4ekgpnn2kgj2qpou99w.png" alt="Create Policy" width="800" height="418"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Policy Details, select the Visual Editor, check "Data stream sharing throughput read Access," enter the ARN of the IAM role you wrote down earlier in "Specify Principal(s)," and click "Create Policy."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5znkdjqjq6uewvlkg16.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5znkdjqjq6uewvlkg16.png" alt="Visual Editor" width="800" height="628"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;NOTE: The principal should be the ARN of the IAM role; specifying an ARN other than the IAM role's ARN (e.g., the ARN of a Lambda function or data stream) will result in an error and the policy cannot be created.&lt;/p&gt;

&lt;p&gt;When the resource-based policy appears, click "Edit" to display the JSON editor. Here, add &lt;code&gt;"kinesis:DescribeStream",&lt;/code&gt; to the list of Actions as shown below. Finally, click "Save Changes".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdxpeijqd35gric765lk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdxpeijqd35gric765lk.png" alt="Add a privilege" width="374" height="177"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;NOTE: If the same privileges have already been added at the visual editor stage, the above editing process is not necessary.&lt;/p&gt;

&lt;p&gt;Configuration of the resource-based policy is complete.&lt;br&gt;
Check that the ARN of the IAM role is set as the Principal and that the five permissions are listed under Action, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzlbpx642jq9hqg33cq6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzlbpx642jq9hqg33cq6.png" alt="Resource-based policy" width="736" height="571"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h5&gt;
  
  
  3-2. Note the ARN of &lt;code&gt;kds-sharing-example1&lt;/code&gt;.
&lt;/h5&gt;

&lt;p&gt;The ARN of the &lt;code&gt;kds-sharing-example1&lt;/code&gt; data stream will be used in the next step.&lt;/p&gt;
&lt;h4&gt;
  
  
  Step 4) Back on the data processing side - Account B
&lt;/h4&gt;

&lt;h5&gt;
  
  
  4-1. Set up a trigger for &lt;code&gt;kds-reader1&lt;/code&gt; in &lt;a href="https://console.aws.amazon.com/lambda/home#/functions"&gt;AWS Lambda&lt;/a&gt;
&lt;/h5&gt;

&lt;p&gt;After viewing the details of &lt;code&gt;kds-reader1&lt;/code&gt;, go to "Configuration" &amp;gt; "Triggers" &amp;gt; "Add trigger".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9o76m16fiu0m14o50j9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9o76m16fiu0m14o50j9.png" alt="Add a Trigger" width="800" height="348"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Trigger configuration, select Kinesis under "Select a source". Then set the ARN of &lt;code&gt;kds-sharing-example1&lt;/code&gt; as the "Kinesis stream". Leave the other items as they are and click "Add".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstbxi3twy3jdkbc4m9gz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstbxi3twy3jdkbc4m9gz.png" alt="Trigger Configuration" width="641" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;NOTE 1: When the text box receives focus, a "No item" message is displayed. This is expected and can be ignored.&lt;/p&gt;

&lt;p&gt;NOTE 2: If you get an API error when clicking "Add", check the following two things: (1) the resource-based policy permissions on the data stream side (Account A), in particular that &lt;code&gt;kinesis:DescribeStream&lt;/code&gt; is included; (2) the permissions of the Lambda function's execution role, in particular that the &lt;code&gt;AWSLambdaKinesisExecutionRole&lt;/code&gt; policy is attached.&lt;/p&gt;

&lt;p&gt;Trigger configuration is complete.&lt;br&gt;
You can see that Kinesis has been added as a trigger (input source) of &lt;code&gt;kds-reader1&lt;/code&gt;, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fydqbwnjj95krj591i0yc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fydqbwnjj95krj591i0yc.png" alt="Configuration Function" width="609" height="414"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  How to check
&lt;/h3&gt;

&lt;p&gt;To verify, send data to the data stream (&lt;code&gt;kds-sharing-example1&lt;/code&gt; in this example) on the data producer side (Account A) and check the Amazon CloudWatch Logs output on the data processing side (Account B).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cloudshell/latest/userguide/welcome.html#how-to-get-started"&gt;AWS CloudShell&lt;/a&gt; on the data generator side(Account A) sends data to the data stream via AWS CLI.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws &lt;span class="nt"&gt;--cli-binary-format&lt;/span&gt; raw-in-base64-out &lt;span class="se"&gt;\&lt;/span&gt;
  kinesis put-record &lt;span class="nt"&gt;--stream-name&lt;/span&gt; kds-sharing-example1 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--partition-key&lt;/span&gt; DUMMY1 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--data&lt;/span&gt; &lt;span class="s1"&gt;'{"this_is": "test record"}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the result of the command execution is as follows, the data transmission has succeeded.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "ShardId": "shardId-000000000000",
    "SequenceNumber": "49649718468451075013017298672854645152715037125279481858"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If a log entry like the following appears for &lt;code&gt;kds-reader1&lt;/code&gt; in CloudWatch Logs on the data processing side (Account B), the setup was successful.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{'Records': [{'kinesis': {'kinesisSchemaVersion': '1.0', 'partitionKey': 'DUMMY1', 'sequenceNumber': '49649683864473953433843496278632352517587667908684677122', 'data': 'eyJ0aGlzX2lzIjogInRlc3QgcmVjb3JkIn0=', 'approximateArrivalTimestamp': 1709100911.667}, 'eventSource': 'aws:kinesis', 'eventVersion': '1.0', 'eventID': 'shardId-000000000000:49649683864473953433843496278632352517587667908684677122', 'eventName': 'aws:kinesis:record', 'invokeIdentityArn': 'arn:aws:iam::888800008888:role/service-role/kds-reader1-role-q6zcv9kq', 'awsRegion': 'us-east-1', 'eventSourceARN': 'arn:aws:kinesis:us-east-1:999900009999:stream/kds-sharing-example1'}]}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Epilogue - Architecture with Amazon EventBridge
&lt;/h2&gt;

&lt;p&gt;This article introduced sharing data streams using Amazon Kinesis Data Streams resource-based policies. This allows, for example, the Amazon Monitron data introduced at the beginning of this article to be consumed by other AWS accounts, giving you more flexibility in how you operate your AWS accounts.&lt;/p&gt;

&lt;p&gt;Another possible architecture for using a Kinesis data stream from another AWS account is to send the data through an Amazon EventBridge event bus.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi6h9koe8k6z4sv4r5d0c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi6h9koe8k6z4sv4r5d0c.png" alt="The architecture using Amazon EventBridge " width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The advantage of this approach is that it can be configured entirely with managed services and without code. It also works across regions; although Amazon EventBridge incurs additional charges, the architecture is well worth considering.&lt;/p&gt;

&lt;p&gt;Here are some URLs to help you create this architecture.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes-kinesis.html"&gt;Amazon Kinesis stream as a source&lt;/a&gt; (&lt;a href="https://aws.amazon.com/jp/about-aws/whats-new/2023/11/amazon-kinesis-data-streams-eventbridge-pipes-console/"&gt;Update information&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-cross-account.html"&gt;Sending and receiving Amazon EventBridge events between AWS accounts&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-cross-region.html"&gt;Sending and receiving Amazon EventBridge events between AWS Regions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I personally believe that being able to create a configuration that matches the skills you have and the kind of operation you want to achieve is the best part of these building blocks.&lt;/p&gt;

&lt;p&gt;Beyond this configuration, it is worth trying whichever setup suits your situation, along with the new features that are sure to come in the future!&lt;/p&gt;

&lt;p&gt;[EoT]&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Predictive Maintenance of Industrial machine with Amazon Monitron and What is “ISO 20816” ?</title>
      <dc:creator>Kohei (Max) MATSUSHITA</dc:creator>
      <pubDate>Thu, 28 Dec 2023 02:56:36 +0000</pubDate>
      <link>https://dev.to/aws-heroes/predictive-maintenance-of-industrial-machine-with-amazon-monitron-and-what-is-iso-20816--52k9</link>
      <guid>https://dev.to/aws-heroes/predictive-maintenance-of-industrial-machine-with-amazon-monitron-and-what-is-iso-20816--52k9</guid>
      <description>&lt;p&gt;In this article, learn about &lt;a href="https://aws.amazon.com/monitron/"&gt;Amazon Monitron&lt;/a&gt;, an AWS service that enables predictive maintenance for industrial machinery, and the &lt;a href="https://www.iso.org/standard/63180.html"&gt;ISO 20816&lt;/a&gt; standard used in the service.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Predictive Maintenance in Industry&lt;/strong&gt;: The article highlights the importance of predictive maintenance in the industrial sector, emphasizing the challenges of maintaining factory automation systems and other industrial machines, which are crucial for reliable operations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Monitron and ISO 20816&lt;/strong&gt;: Amazon Monitron, an AWS service, is introduced as a solution for predictive maintenance. It integrates sensors, machine learning models, and follows the ISO 20816 standard for machine vibration, to efficiently detect anomalies in industrial machinery.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Application of ISO 20816 in Amazon Monitron&lt;/strong&gt;: ISO 20816 is an international standard for evaluating machine vibration, used by Amazon Monitron to classify machines and assess their conditions. This system simplifies setting up predictive maintenance, even for machines without initial sensor systems.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Please see the following Medium article for details:&lt;br&gt;
&lt;a href="https://ma2shita.medium.com/predictive-maintenance-of-industrial-machine-with-amazon-monitron-and-what-is-iso-20816-9715f2d597b5"&gt;https://ma2shita.medium.com/predictive-maintenance-of-industrial-machine-with-amazon-monitron-and-what-is-iso-20816-9715f2d597b5&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>machinelearning</category>
      <category>iot</category>
    </item>
    <item>
      <title>AWS CDK stack for building Amazon EventBridge Pipes (SQS to CloudWatch Logs)</title>
      <dc:creator>Kohei (Max) MATSUSHITA</dc:creator>
      <pubDate>Fri, 10 Mar 2023 06:12:12 +0000</pubDate>
      <link>https://dev.to/aws-heroes/aws-cdkv2-stack-for-building-amazon-eventbridge-pipes-sqs-to-cloudwatch-logs-52n9</link>
      <guid>https://dev.to/aws-heroes/aws-cdkv2-stack-for-building-amazon-eventbridge-pipes-sqs-to-cloudwatch-logs-52n9</guid>
      <description>&lt;p&gt;I have released an AWS CDK(v2) stack that builds the following Amazon EventBridge Pipes.&lt;/p&gt;

&lt;p&gt;This stack provides a quick experience in building Amazon EventBridge Pipes.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/ma2shita" rel="noopener noreferrer"&gt;
        ma2shita
      &lt;/a&gt; / &lt;a href="https://github.com/ma2shita/cdk-eventbridge-pipes-simplelogger" rel="noopener noreferrer"&gt;
        cdk-eventbridge-pipes-simplelogger
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      An example AWS CDKv2 stack that builds Amazon EventBridge Pipes logging messages sent to an SQS queue to CloudWatch Logs.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;AWS CDKv2 example: SQS to CloudWatch Logs (w/ Enrichment by Lambda) on Amazon EventBridge Pipes&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/386b637ba8bc3a657683299faf13a5f1c1116fa32d4a762d42598d62df9f2bbc/68747470733a2f2f646f63732e676f6f676c652e636f6d2f64726177696e67732f642f652f32504143582d3176544f574861693045554169347933717748496245417a2d44576e39716d43524c31717036504b386f596c356f756e6339367556535a774b4a416a62443061435548707430356130625566486b45472f7075623f773d39323126683d333435"&gt;&lt;img src="https://camo.githubusercontent.com/386b637ba8bc3a657683299faf13a5f1c1116fa32d4a762d42598d62df9f2bbc/68747470733a2f2f646f63732e676f6f676c652e636f6d2f64726177696e67732f642f652f32504143582d3176544f574861693045554169347933717748496245417a2d44576e39716d43524c31717036504b386f596c356f756e6339367556535a774b4a416a62443061435548707430356130625566486b45472f7075623f773d39323126683d333435" alt="architecture"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;An example AWS CDKv2 stack for building Amazon EventBridge Pipes that log messages sent to an SQS queue to CloudWatch Logs.&lt;br&gt;
It also includes an enrichment step (a pass-through) implemented by a Lambda function.&lt;/p&gt;
&lt;p&gt;(ja) A sample AWS CDKv2 stack that builds Amazon EventBridge Pipes logging messages sent to an SQS queue to CloudWatch Logs.&lt;br&gt;
It also includes enrichment by a Lambda function (whose content is a no-op pass-through).&lt;/p&gt;
&lt;p&gt;Created by:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Queue (1)&lt;/li&gt;
&lt;li&gt;Log Group (1)&lt;/li&gt;
&lt;li&gt;Lambda function (1)&lt;/li&gt;
&lt;li&gt;Pipe (1)&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Requirements&lt;/h2&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;AWS CDKv2 env. (&lt;strong&gt;AWS Cloud9 (It's very easy)&lt;/strong&gt;)
&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;cdk bootstrap&lt;/code&gt; must be done&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Build &amp;amp; Deploy&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;Build:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;git clone https://github.com/ma2shita/cdk-eventbridge-pipes-simplelogger.git
cd cdk-eventbridge-pipes-simplelogger/
npm install
cdk ls
# =&amp;gt; EventBridgePipesSimpleLoggerStack
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;
If npm or aws-cdk reports that a newer version is available, update them with &lt;code&gt;npm install -g --force npm aws-cdk&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Deploy:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;cdk deploy
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;
The target region is set to us-west-2. This is specified in &lt;code&gt;bin/event_bridge_pipes_simple_logger.ts&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Destroy:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;cdk destroy
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;How it works&lt;/h2&gt;

&lt;/div&gt;
&lt;p&gt;Send a message to the SQS queue and it will be recorded to CloudWatch Logs…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/ma2shita/cdk-eventbridge-pipes-simplelogger" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;If you are on AWS Cloud9, you can run &lt;code&gt;cdk bootstrap&lt;/code&gt; immediately, because the environment already has the cdk command installed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Contains "enrichment" Lambda functions
&lt;/h2&gt;

&lt;p&gt;"Enrichment" can invoke Lambda function and Step Function, reducing the amount of code compared to a stand-alone Lambda implementation.&lt;/p&gt;

&lt;p&gt;This CDKv2 stack contains a Lambda function that is invoked by "Enrichment". It is &lt;a href="https://github.com/ma2shita/cdk-eventbridge-pipes-simplelogger/blob/main/lambda/lambda_function.py" rel="noopener noreferrer"&gt;"do nothing" code&lt;/a&gt;, though, so please treat it as a skeleton.&lt;br&gt;
How to customize it is described below.&lt;/p&gt;
&lt;h2&gt;
  
  
  Customization Points
&lt;/h2&gt;
&lt;h3&gt;
  
  
  On the "Enrichment" function
&lt;/h3&gt;

&lt;p&gt;The "enrichment" function allows editing of the payload. For example, you can lookup items from DynamoDB by the input then  append the results to the payload.&lt;/p&gt;

&lt;p&gt;For the specifications of the Lambda function used for "Enrichment", see here.&lt;br&gt;
&lt;a href="https://dev.to/ma2shita/aws-lambda-function-specifications-in-enrichment-in-amazon-eventbridge-pipes-5anb"&gt;AWS Lambda function specifications in "Enrichment" in Amazon EventBridge Pipes&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  On the "Target" of EventBridge Pipes
&lt;/h3&gt;

&lt;p&gt;Once you have changed the "target", remember to grant the Pipes execution role write permission for the destination service.&lt;br&gt;
In particular, &lt;code&gt;events:InvokeApiDestination&lt;/code&gt; is required if you specify an "API destination", which can send to an external API.&lt;/p&gt;
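&lt;p&gt;For example, a policy statement along these lines could be attached to the Pipes execution role (the resource ARN is a placeholder; fill in your own region, account, and API destination name):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "events:InvokeApiDestination",
      "Resource": "arn:aws:events:REGION:ACCOUNT_ID:api-destination/YOUR_DESTINATION/*"
    }
  ]
}
```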
&lt;h2&gt;
  
  
  Conclusion and "CDKv2 for IoT 1-Click"
&lt;/h2&gt;

&lt;p&gt;I'd love to hear your feedback!&lt;br&gt;&lt;br&gt;
And, an AWS CDK(v2) stack for creating AWS IoT 1-Click projects is also available. Button data can be sent to Amazon EventBridge Pipes.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/ma2shita" rel="noopener noreferrer"&gt;
        ma2shita
      &lt;/a&gt; / &lt;a href="https://github.com/ma2shita/cdk-iot1click-project-to-sqs" rel="noopener noreferrer"&gt;
        cdk-iot1click-project-to-sqs
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
Example of an AWS CDKv2 project that builds an AWS IoT 1-Click project and a Lambda function that sends to an existing SQS queue.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;AWS CDKv2 example: IoT 1-Click to exists SQS queue&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/dec12bacb347eb8596b3ffb1b4fdc1411e52bff1a3b2136b399766d1039f8644/68747470733a2f2f646f63732e676f6f676c652e636f6d2f64726177696e67732f642f652f32504143582d3176544c6566444f594e32655044524b6c706d6a65777832382d41335767566c565f336c71456c47776a444d5134645370454c44365164677443726b766b445344745452477245524a72484d70465a4d2f7075623f773d36343526683d323035"&gt;&lt;img src="https://camo.githubusercontent.com/dec12bacb347eb8596b3ffb1b4fdc1411e52bff1a3b2136b399766d1039f8644/68747470733a2f2f646f63732e676f6f676c652e636f6d2f64726177696e67732f642f652f32504143582d3176544c6566444f594e32655044524b6c706d6a65777832382d41335767566c565f336c71456c47776a444d5134645370454c44365164677443726b766b445344745452477245524a72484d70465a4d2f7075623f773d36343526683d323035" alt="architecture"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Example of an AWS CDKv2 project that builds an AWS IoT 1-Click project and a Lambda function that sends to an existing SQS queue.&lt;/p&gt;
&lt;p&gt;Created by:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Lambda function (1)&lt;/li&gt;
&lt;li&gt;IoT 1-Click project (1)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;NOT creates:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;SQS queue&lt;/li&gt;
&lt;li&gt;IoT 1-Click placement&lt;/li&gt;
&lt;li&gt;IoT 1-Click device registration&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Requirements&lt;/h2&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;An AWS CDKv2 environment (&lt;strong&gt;AWS Cloud9&lt;/strong&gt; makes this very easy)
&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;cdk bootstrap&lt;/code&gt; must have been run&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;SQS queue
&lt;ul&gt;
&lt;li&gt;queue's ARN&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/jp/iot-1-click/devices/" rel="nofollow noopener noreferrer"&gt;AWS IoT 1-Click device&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Build &amp;amp; Deploy&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;Build:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;git clone https://github.com/ma2shita/cdk-iot1click-project-to-sqs.git
cd cdk-iot1click-project-to-sqs/
npm install
cdk ls --context destSqsQueueArn=&amp;lt;SQS_QUEUE_ARN&amp;gt;
#=&amp;gt; Iot1ClickProjectStack
# e.g.) cdk ls --context destSqsQueueArn=arn:aws:sqs:REGION:ACCOUNT:sqs-queue-name
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;Deploy:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;cdk deploy --context destSqsQueueArn=&amp;lt;SQS_QUEUE_ARN&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;
The target region is set to us-west-2. This is specified in &lt;code&gt;bin/iot1click_project.ts&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Destroy:&lt;/p&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;&lt;pre class="notranslate"&gt;&lt;code&gt;cdk destroy --context destSqsQueueArn=&amp;lt;SQS_QUEUE_ARN&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;How it works&lt;/h2&gt;

&lt;/div&gt;
&lt;p&gt;Register your devices in IoT 1-Click, then add a device to the project (created with this CDK).&lt;br&gt;
Pressing the button sends the message…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/ma2shita/cdk-iot1click-project-to-sqs" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;So, it would be nice if IoT 1-Click supported sending directly to SQS. :-)&lt;/p&gt;

&lt;p&gt;EoT&lt;/p&gt;

</description>
      <category>aws</category>
      <category>awscdk</category>
      <category>eventdriven</category>
    </item>
    <item>
      <title>AWS Lambda function specifications in "Enrichment" in Amazon EventBridge Pipes</title>
      <dc:creator>Kohei (Max) MATSUSHITA</dc:creator>
      <pubDate>Wed, 08 Mar 2023 05:40:29 +0000</pubDate>
      <link>https://dev.to/ma2shita/aws-lambda-function-specifications-in-enrichment-in-amazon-eventbridge-pipes-5anb</link>
      <guid>https://dev.to/ma2shita/aws-lambda-function-specifications-in-enrichment-in-amazon-eventbridge-pipes-5anb</guid>
      <description>&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Input to the "Enrichment" function&lt;/strong&gt;&lt;br&gt;
The Lambda function's first argument, "event", receives an array of payloads from the source AWS service.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Output from the "Enrichment" function&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;return ...&lt;/code&gt; is passed to the "target" of the EventBridge Pipes.&lt;/p&gt;
&lt;h2&gt;
  
  
  Explanation
&lt;/h2&gt;

&lt;p&gt;This section is based on a Lambda function (shown below) that receives an event from the source or filter and passes it to the target without doing anything (pass-through).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Input to the function of "Enrichment"
&lt;/h3&gt;

&lt;p&gt;The first argument "event" of the Lambda function is passed as a payload from the source or filter. The second argument "context" can also be used.&lt;/p&gt;

&lt;p&gt;For example, if you set Amazon SQS as the source and send a message whose MessageBody is &lt;code&gt;{"v":1}&lt;/code&gt; to the queue, the output of &lt;code&gt;print(event)&lt;/code&gt; (in Python) will look like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;messageId&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;8c8b72a2-85f0-47be-bf9f-81edff5197e9&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;receiptHandle&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;.....&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;body&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;{"v":1}&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;attributes&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ApproximateReceiveCount&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;1&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;SentTimestamp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;1672477371472&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;SenderId&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;AID................KM&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ApproximateFirstReceiveTimestamp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;1672477371473&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;messageAttributes&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{},&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;md5OfMessageAttributes&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;md5OfBody&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;455c0c10d425837001a72a84239ac362&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;eventSource&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;aws:sqs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;eventSourceARN&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;arn:aws:sqs:REGION:............:SQS_NAME&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;awsRegion&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;REGION&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first level of "event" is an array. Therefore, within the Lambda function you can fetch, for example, &lt;code&gt;messageId&lt;/code&gt; as &lt;code&gt;event[0]['messageId']&lt;/code&gt; (in Python).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;messageId&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;"event" is an array. Therefore, I thought multiple payloads would be entered. So far, no multiple payloads have been entered.&lt;br&gt;
For example, even if 3 data were queued in SQS at once using &lt;code&gt;send-message-batch&lt;/code&gt;, the Lambda function specified as Enrichment was executed 3 times individually.&lt;/p&gt;
&lt;h3&gt;
  
  
  Output from the function of "Enrichment"
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;return ...&lt;/code&gt; is passed to the target of the EventBridge Pipes.&lt;/p&gt;

&lt;p&gt;The following code is a Lambda function that passes &lt;code&gt;{"foo": "bar"}&lt;/code&gt; to the "target".&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;foo&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bar&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If "empty" is returned, the target will not be executed. Empty means &lt;code&gt;{}&lt;/code&gt;, &lt;code&gt;[]&lt;/code&gt; and &lt;code&gt;""&lt;/code&gt;. The following code is a Lambda function(Python3) that does not execute "target".&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Implicit body data parsing
&lt;/h3&gt;

&lt;p&gt;One point to note when processing data with Enrichment is &lt;a href="https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes-input-transformation.html#input-transform-implicit"&gt;Implicit body data parsing&lt;/a&gt;.&lt;br&gt;
If Enrichment is not used, Implicit body data parsing is applied and can be referenced in the input transformer on the target side as &lt;code&gt;&amp;lt;$.body.foo&amp;gt;&lt;/code&gt;.&lt;br&gt;
If Enrichment is used, Implicit body data parsing is not applied.&lt;br&gt;
Therefore, when using Enrichment, it is better to fully format the payload in the Enrichment Lambda function before passing it on to the target, which keeps the input transformer simple.&lt;/p&gt;
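&lt;p&gt;A minimal sketch of that idea, reusing the sample SQS event from above (the output fields &lt;code&gt;value&lt;/code&gt; and &lt;code&gt;region&lt;/code&gt; are illustrative assumptions):&lt;/p&gt;

```python
# Because implicit body data parsing is skipped when Enrichment is used,
# parse the SQS body yourself and return the final payload shape, so the
# target needs no input transformer at all.
import json

def lambda_handler(event, context):
    record = event[0]                    # Pipes delivers an array of payloads
    body = json.loads(record["body"])    # e.g. '{"v":1}' -> {"v": 1}
    return {"value": body["v"], "region": record["awsRegion"]}
```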
&lt;h2&gt;
  
  
  My strategy for implementing Enrichment Lambda functions
&lt;/h2&gt;

&lt;p&gt;First of all, if Enrichment is to be used, it is better to implement it without target input transformers, so that all of the formatting logic is concentrated in one place.&lt;/p&gt;

&lt;p&gt;Let's consider &lt;a href="https://api.slack.com/methods/chat.postMessage"&gt;Slack's &lt;code&gt;chat.postMessage&lt;/code&gt; API&lt;/a&gt;. It requires a &lt;code&gt;channel&lt;/code&gt; and a &lt;code&gt;text&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The first approach implements the formatting for the API inside the function. The second approach passes only the necessary information to the target, and the formatting for the API itself is done in the input transformer.&lt;/p&gt;
&lt;h3&gt;
  
  
  Approach 1: "Rewriting."
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;payload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;channel&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;XXX&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;payload&lt;/span&gt;
&lt;span class="c1"&gt;# =&amp;gt; {
#  'channel': 'XXX',
#  'text': "STRING...."
#}
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;No input transformer is needed on the "target".&lt;/p&gt;
&lt;h3&gt;
  
  
  Approach 2: "Adding."
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;_enrichment&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;slack&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;channel&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;XXX&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="c1"&gt;# =&amp;gt; {
#  (original event fields),
#  '_enrichment': {
#    'slack': {
#      'channel': 'XXX',
#      'text': 'STRING....'
#     }
#  }
#}
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Input transformer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "channel": &amp;lt;$._enrichment.slack.channel&amp;gt;,
  "text": &amp;lt;$._enrichment.slack.text&amp;gt;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Amazon EventBridge Pipes also allows an AWS Lambda function to be specified as the target. Therefore, some may argue that there is no need to use a Lambda function in Enrichment at all.&lt;/p&gt;

&lt;p&gt;However, eliminating calls to external APIs from within the Lambda function has the advantage of keeping the Lambda function simple.&lt;/p&gt;

&lt;p&gt;EoT&lt;/p&gt;

</description>
      <category>lambda</category>
      <category>aws</category>
    </item>
    <item>
      <title>AWS IoT EduKit is what? What can we learn from it?</title>
      <dc:creator>Kohei (Max) MATSUSHITA</dc:creator>
      <pubDate>Sat, 13 Feb 2021 01:13:21 +0000</pubDate>
      <link>https://dev.to/aws-heroes/aws-iot-edukit-is-what-what-can-we-learn-from-it-2a39</link>
      <guid>https://dev.to/aws-heroes/aws-iot-edukit-is-what-what-can-we-learn-from-it-2a39</guid>
      <description>&lt;p&gt;AWS IoT EduKit is a new device from AWS that was launched by AWS re:Invent 2020. Introduce what do we learn from the kit.&lt;/p&gt;

&lt;p&gt;Two benefits and two learning points:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Eliminates the need to create client certificates and operate a CA for IoT Core connectivity&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Eliminates the need for measures to prevent leakage of the private key&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How to register to IoT Core with the client certificate in ATECC608A Trust&amp;amp;GO&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How to use the private key in the ATECC608A Trust&amp;amp;GO when connecting to IoT Core&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Please see the following blog for details.&lt;/p&gt;


&lt;div class="ltag__link"&gt;
  &lt;a href="https://ma2shita.medium.com/aws-iot-edukit-is-what-what-can-we-learn-from-it-96039238d712" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AqoP7uoA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:88:88/1%2AYVhVRmfl9BOBOAzX-m2jYQ.jpeg" alt='Kohei "Max" Matsushita'&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://ma2shita.medium.com/aws-iot-edukit-is-what-what-can-we-learn-from-it-96039238d712" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;AWS IoT EduKit is what? What can we learn from it? | Medium&lt;/h2&gt;
      &lt;h3&gt;Kohei "Max" Matsushita ・ &lt;time&gt;Feb 12, 2021&lt;/time&gt; ・ 
      &lt;div class="ltag__link__servicename"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YjpYcCMa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev.to/assets/medium-f709f79cf29704f9f4c2a83f950b2964e95007a3e311b77f686915c71574fef2.svg" alt="Medium Logo"&gt;
        ma2shita.Medium
      &lt;/div&gt;
    &lt;/h3&gt;
&lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>aws</category>
      <category>iot</category>
    </item>
  </channel>
</rss>
