<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: kapil kumawat</title>
    <description>The latest articles on DEV Community by kapil kumawat (@kapil_kumawat).</description>
    <link>https://dev.to/kapil_kumawat</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F810893%2Fb83f9079-7f24-49e4-8e22-4fad284529a8.png</url>
      <title>DEV Community: kapil kumawat</title>
      <link>https://dev.to/kapil_kumawat</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kapil_kumawat"/>
    <language>en</language>
    <item>
      <title>Manage Amazon S3 Buckets Using Event Notifications With AWS SQS</title>
      <dc:creator>kapil kumawat</dc:creator>
      <pubDate>Mon, 09 May 2022 05:35:53 +0000</pubDate>
      <link>https://dev.to/kapil_kumawat/manage-amazon-s3-buckets-using-event-notifications-with-aws-sqs-2hp2</link>
      <guid>https://dev.to/kapil_kumawat/manage-amazon-s3-buckets-using-event-notifications-with-aws-sqs-2hp2</guid>
      <description>&lt;p&gt;Recently I came across a requirement where the admin team needed to be notified whenever a user deleted an object from an S3 bucket. If you have a similar requirement, where a team should be notified whenever an object is created, deleted, or modified, this is where event notifications come into the picture.&lt;/p&gt;

&lt;p&gt;You can use the AWS S3 Event Notifications feature to receive notifications at the following destinations when certain events happen in an AWS S3 bucket.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Amazon Simple Notification Service (Amazon SNS) topics&lt;/li&gt;
&lt;li&gt;Amazon Simple Queue Service (Amazon SQS) queues&lt;/li&gt;
&lt;li&gt;AWS Lambda functions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this article, I will configure an AWS SQS queue as the destination to track delete events from my S3 bucket.&lt;/p&gt;

&lt;p&gt;Configure an AWS SQS queue to track delete events from an S3 bucket&lt;br&gt;
&lt;em&gt;Step 1&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Log in to the AWS Console.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 2&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to S3 and create a new bucket if one does not already exist. I have already created a bucket named ‘mydemobucket198’.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5hItZU_z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ebb4okdbo21vz5rgbrhw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5hItZU_z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ebb4okdbo21vz5rgbrhw.png" alt="Image description" width="880" height="190"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 3&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to the Properties tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eKE4nrsx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xivr3ebqun5nijnywuys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eKE4nrsx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xivr3ebqun5nijnywuys.png" alt="Image description" width="847" height="148"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 4&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Scroll down the page until the Event notifications section appears, then click the Create event notification button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0B9eKfWv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ab1mi0w97c45ltxd880.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0B9eKfWv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ab1mi0w97c45ltxd880.png" alt="Image description" width="880" height="181"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 5&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A window appears where you fill in the details to create a new event notification. I have entered ‘s3_notify’ as the event name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_-UXU1l0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7feqmvsih2l68mbbpcj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_-UXU1l0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7feqmvsih2l68mbbpcj.png" alt="Image description" width="815" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 6&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A list appears with the event types that are currently supported, such as object creation, object removal, etc. I checked Object removal for this article, so whenever any user deletes an S3 object I will get a notification.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LPF--Wez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b9y51egjvi6fe1cydpj9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LPF--Wez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b9y51egjvi6fe1cydpj9.png" alt="Image description" width="771" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 7&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to the Destination section, where you can choose a destination to publish the event to.&lt;/p&gt;

&lt;p&gt;I have chosen SQS queues, so notifications will be sent to an SQS queue that a server can read later and act on accordingly.&lt;/p&gt;

&lt;p&gt;Choose an SQS queue if one already exists, or create a new one. As I don't have any existing SQS queues, I will create a new one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SeQLQXsY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/86k3h1b5688tk7uaibc5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SeQLQXsY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/86k3h1b5688tk7uaibc5.png" alt="Image description" width="803" height="550"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 8&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to the Amazon SQS service and click Create queue.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 9&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Enter ‘s3_delete_notify’ as the SQS queue name and keep all settings at their defaults.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mfWIDVtK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mbng9hnpnb3zj4hl2e1m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mfWIDVtK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mbng9hnpnb3zj4hl2e1m.png" alt="Image description" width="880" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 10&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Once the SQS queue is created, go to its Access policy section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AkRCDIWD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nfgc8ggtqqwngl2rft8q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AkRCDIWD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nfgc8ggtqqwngl2rft8q.png" alt="Image description" width="880" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 11&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We need to modify the access policy so that the S3 bucket can send a message when any object gets deleted.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 12&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Click Edit and paste the access policy code below:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Id": "Policy1651140347168",
  "Statement": [
    {
      "Sid": "Stmt1651140341677",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:ap-south-1:464473132183:s3_delete_notify"
    }
  ]
}&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Don't forget to replace the ARN value with your SQS queue's ARN.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bx4G-61H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r1hb2g3avtfw7hozc6de.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bx4G-61H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r1hb2g3avtfw7hozc6de.png" alt="Image description" width="880" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 13&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Refresh the page and select the SQS queue that we just created in the steps above. Click Save changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xJHjOhRI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mygbrcnm593fg4mc8eat.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xJHjOhRI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mygbrcnm593fg4mc8eat.png" alt="Image description" width="844" height="620"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 14&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to the S3 bucket and upload some files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RDSPCB6F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jr359go8ntu65hbjkicv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RDSPCB6F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jr359go8ntu65hbjkicv.png" alt="Image description" width="880" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 15&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Select any file and click the Delete button. I selected photo-1.jpeg and deleted it.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 16&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Go to the SQS queue ‘s3_delete_notify’ and click the Send and receive messages button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QOTXrzxw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4nnywx142r0mqqowgu2a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QOTXrzxw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4nnywx142r0mqqowgu2a.png" alt="Image description" width="880" height="124"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 17&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Click the Poll for messages button to retrieve messages.&lt;/p&gt;
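&lt;p&gt;Instead of polling in the console, a server can consume these notifications with boto3. The sketch below (assuming boto3 is installed and credentials are configured; the function names are mine) uses long polling, extracts the deleted object keys from each message body, and then deletes the message so it is not redelivered.&lt;/p&gt;

```python
import json

def deleted_objects(message_body: str):
    """Extract (event name, object key) pairs from an S3 event message body."""
    body = json.loads(message_body)
    return [
        (record["eventName"], record["s3"]["object"]["key"])
        for record in body.get("Records", [])
    ]

def poll_once(queue_url: str) -> None:
    import boto3  # assumes boto3 is installed and credentials are configured
    sqs = boto3.client("sqs")
    # WaitTimeSeconds enables long polling, reducing empty responses
    resp = sqs.receive_message(
        QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10
    )
    for msg in resp.get("Messages", []):
        for event, key in deleted_objects(msg["Body"]):
            print(event, key)
        # Delete the message so it is not redelivered after the
        # visibility timeout expires
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```

&lt;p&gt;The initial test message that AWS publishes has no Records field, so it simply yields no object keys.&lt;/p&gt;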

&lt;p&gt;&lt;em&gt;Step 18&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;As you can see, the selected message ID relates to the S3 object that was deleted.&lt;/p&gt;

&lt;p&gt;There is another message ID, which AWS created as an initial test when the queue was set up.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--soI-t-yt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g4a6moivhyfb19mww5hr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--soI-t-yt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g4a6moivhyfb19mww5hr.png" alt="Image description" width="880" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Step 19&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;After you click a Message ID, you can see the details in the Message body. In our case, we can see the S3 bucket name along with the name of the deleted S3 object, photo-1.jpeg.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YcfuYSyy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wnjh9meqsqhtp1og9b2p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YcfuYSyy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wnjh9meqsqhtp1og9b2p.png" alt="Image description" width="822" height="493"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Using S3 Event Notifications, you can enable notifications for events like creating, deleting, and modifying S3 objects, with Amazon SNS, Amazon SQS, and AWS Lambda as destinations.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>s3</category>
    </item>
    <item>
      <title>Migrating Limited Objects With Pre-Defined Prefix Using S3 Batch Replication</title>
      <dc:creator>kapil kumawat</dc:creator>
      <pubDate>Fri, 29 Apr 2022 08:53:12 +0000</pubDate>
      <link>https://dev.to/kapil_kumawat/migrating-limited-objects-with-pre-defined-prefix-using-s3-batch-replication-57g8</link>
      <guid>https://dev.to/kapil_kumawat/migrating-limited-objects-with-pre-defined-prefix-using-s3-batch-replication-57g8</guid>
      <description>&lt;p&gt;In my previous article, we learned about replicating existing objects between different AWS S3 buckets using S3 Batch Replication. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/kapil_kumawat/replicate-existing-objects-using-aws-s3-batch-replication-4g37"&gt;Replicate Existing Objects Using AWS S3 Batch Replication&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this article, we will learn how to migrate a limited set of objects with a pre-defined prefix using S3 Batch Replication, which can be set up at the time of configuring the replication rule.&lt;/p&gt;

&lt;p&gt;Consider a use case where I don't want to replicate all objects from an existing S3 source bucket to a new target bucket, but only certain files with specific names. That is also now possible with the new update to AWS S3 replication of existing objects.&lt;/p&gt;

&lt;p&gt;How to get started&lt;br&gt;
&lt;strong&gt;Step 1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;From the AWS S3 source bucket, suppose you would like to migrate the objects whose names start with ‘house’, as shown below -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HelXgs-O--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mpltj3tf0njvaiqxta79.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HelXgs-O--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mpltj3tf0njvaiqxta79.png" alt="Image description" width="880" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the Management page and choose the Create replication rule option.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Enter a replication rule name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qmgVK2H3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/89z6lgi1a7a0dwaadu5a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qmgVK2H3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/89z6lgi1a7a0dwaadu5a.png" alt="Image description" width="839" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choose the option ‘Limit the scope of this rule using one or more filters’.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the Prefix field, enter the prefix value ‘house’ to limit the scope.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--u5W3JmEW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aq3wxz4aml6ewaveycda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--u5W3JmEW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aq3wxz4aml6ewaveycda.png" alt="Image description" width="803" height="594"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choose your destination bucket from the S3 bucket list. You can replicate objects across different AWS Regions (Cross-Region Replication) or within the same AWS Region (Same-Region Replication).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 7&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choose an IAM role. With the default setting, AWS S3 creates a new IAM role with sufficient permissions to complete this task. I chose an existing IAM role that I had used earlier for replication.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Iod_hEqe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j12oiuqarty16gri31lw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Iod_hEqe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j12oiuqarty16gri31lw.png" alt="Image description" width="802" height="592"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 8&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once you click Save, a Batch Operations window will appear showing all the details that you configured in the previous steps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 9&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Under Status, you will be asked to confirm that you want to run this job.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--utJMsUge--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/a4fm6ay9jot2b0ghh5i2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--utJMsUge--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/a4fm6ay9jot2b0ghh5i2.png" alt="Image description" width="880" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 10&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Click on Run Job button.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 11&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The job will now be in a running state, and you will see the status change to Completed once the replication job finishes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9K-zmnz9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/15a9ieoaxcqsvdv0b0ux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9K-zmnz9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/15a9ieoaxcqsvdv0b0ux.png" alt="Image description" width="880" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 12&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Navigate to the destination AWS S3 bucket, and you will find only the files with the specified prefix.&lt;/p&gt;

&lt;p&gt;In my case, I used the prefix ‘house’ and can see only those objects after replication.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yAk16WpO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5mgv6wyo5bwhseua0c8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yAk16WpO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5mgv6wyo5bwhseua0c8r.png" alt="Image description" width="880" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Using S3 Batch Replication, we can also replicate a specific set of objects by using the prefix option when configuring the replication rule.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>awscommunity</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Replicate Existing Objects Using AWS S3 Batch Replication</title>
      <dc:creator>kapil kumawat</dc:creator>
      <pubDate>Thu, 17 Feb 2022 06:23:40 +0000</pubDate>
      <link>https://dev.to/kapil_kumawat/replicate-existing-objects-using-aws-s3-batch-replication-4g37</link>
      <guid>https://dev.to/kapil_kumawat/replicate-existing-objects-using-aws-s3-batch-replication-4g37</guid>
      <description>&lt;p&gt;With the new updates, you can replicate existing AWS S3 objects and synchronize AWS S3 buckets using the new replication features.&lt;/p&gt;

&lt;p&gt;Until now, AWS S3 replication did not support replicating existing objects, but you can now do it using AWS S3 Batch Replication. This is different from live replication, which continuously and automatically replicates new objects across S3 buckets located in different AWS accounts or AWS Regions.&lt;/p&gt;

&lt;p&gt;Amazon Web Services S3 Replication is a low-cost, fully managed feature that automatically replicates S3 objects between buckets in the same AWS Region using S3 &lt;strong&gt;Same-Region Replication (SRR)&lt;/strong&gt; or across different AWS Regions using S3 &lt;strong&gt;Cross-Region Replication (CRR)&lt;/strong&gt;. &lt;strong&gt;S3 Batch Replication&lt;/strong&gt; backfills newly created buckets with existing objects, can migrate data across different accounts, and can retry objects that failed to replicate in an existing replication run.&lt;/p&gt;

&lt;p&gt;You can configure S3 Batch Replication using the AWS SDKs, the AWS S3 Console, or the AWS Command Line Interface (CLI).&lt;/p&gt;

&lt;p&gt;Here is the replication process diagram from the AWS site -&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9Y7xWXVL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rk7s6ko4kfeotgjigf4t.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9Y7xWXVL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rk7s6ko4kfeotgjigf4t.JPG" alt="Image description" width="880" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AWS S3 Batch Replication can help you do the following - &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Replicate existing objects&lt;/strong&gt; - S3 Batch Replication can be used to replicate objects that were added to buckets before any replication rules were configured.&lt;br&gt;
&lt;strong&gt;Replicate objects that previously failed to replicate&lt;/strong&gt; - You can retry replicating objects that previously failed to replicate for any reason.&lt;br&gt;
&lt;strong&gt;Replicate objects that were already replicated&lt;/strong&gt; - You might need to store multiple copies of your data in separate AWS accounts or different AWS Regions. S3 Batch Replication can replicate existing objects to newly added destinations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;How to get started&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
I have already created two AWS S3 buckets (replication-bucket1 and replication-bucket2) in the us-east-1 region for this demo.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before configuring the replication rule, you need to enable Bucket Versioning on the source and target buckets if it is not already enabled; otherwise, you will receive an error message like this on the replication configuration page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--25BhfjIh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/07vg73lvw9zbypwp8ktm.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--25BhfjIh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/07vg73lvw9zbypwp8ktm.JPG" alt="Image description" width="825" height="155"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the Properties page and enable Bucket Versioning.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fbc3ONTp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/twx6ipn8kwodhlc51k9l.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fbc3ONTp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/twx6ipn8kwodhlc51k9l.JPG" alt="Image description" width="827" height="577"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the S3 bucket list and select a source bucket (replication-bucket1) that contains the objects to replicate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the Management page and choose the Create replication rule option.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Enter a replication rule name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--e3H7CE2I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9682r9bixoejp92xuo2g.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--e3H7CE2I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9682r9bixoejp92xuo2g.JPG" alt="Image description" width="857" height="492"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After filling in the required details and creating the rule, you will get a prompt asking whether you want to replicate existing objects.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4fvHs_Hj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gm7zq05x8luuyqtz7o4z.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4fvHs_Hj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gm7zq05x8luuyqtz7o4z.JPG" alt="Image description" width="605" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 7&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you click Yes, you will be redirected to a Create Batch Operations job page. Choose the default option to automatically run the job when it's ready.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ThRnP_75--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8s4nwrfgkn5dl39ievb5.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ThRnP_75--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8s4nwrfgkn5dl39ievb5.JPG" alt="Image description" width="838" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 8&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It is recommended to choose the Generate completion report option, which will produce a report containing the results of the replication job. If you don't have a suitable IAM role for this, keep the default setting and AWS S3 will create a new IAM role with sufficient permissions to run this Batch Operations job.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dTaGxYrT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yl3g6mz999qeu2p4f52p.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dTaGxYrT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yl3g6mz999qeu2p4f52p.JPG" alt="Image description" width="804" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 9&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once you save this job, you can check its status on the Batch Operations page. The job status will keep changing, from configuring -&amp;gt; in progress -&amp;gt; completed, during this process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uOUju-ob--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hzzuvn2djgoispymbexk.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uOUju-ob--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hzzuvn2djgoispymbexk.JPG" alt="Image description" width="880" height="394"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 10&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can go to your destination bucket and confirm that the new objects have been replicated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MonOgIT1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jr4c0v9p2kbpk0oeetfu.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MonOgIT1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jr4c0v9p2kbpk0oeetfu.JPG" alt="Image description" width="880" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Using the new replication feature, it is easy to replicate existing S3 objects between S3 buckets in the same AWS Region, in different AWS Regions, or in different accounts.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>architecture</category>
    </item>
  </channel>
</rss>
