<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Bira</title>
    <description>The latest articles on DEV Community by Bira (@bthiban).</description>
    <link>https://dev.to/bthiban</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1139190%2Ffe969dd8-207a-4413-a561-7a63223fd48f.jpg</url>
      <title>DEV Community: Bira</title>
      <link>https://dev.to/bthiban</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/bthiban"/>
    <language>en</language>
    <item>
      <title>How to Mount an S3 Bucket into an ECS Container</title>
      <dc:creator>Bira</dc:creator>
      <pubDate>Sat, 26 Aug 2023 02:57:05 +0000</pubDate>
      <link>https://dev.to/bthiban/how-to-mount-an-s3-bucket-into-an-ecs-container-246l</link>
      <guid>https://dev.to/bthiban/how-to-mount-an-s3-bucket-into-an-ecs-container-246l</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;In this blog post, we will show you how to mount an S3 bucket into an ECS container. This allows you to access the files in the S3 bucket from within the container.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create an S3 bucket.&lt;/li&gt;
&lt;li&gt;Create an IAM role that allows the ECS container to access the S3 bucket.&lt;/li&gt;
&lt;li&gt;Create an ECS task definition that mounts the S3 bucket into the container.&lt;/li&gt;
&lt;li&gt;Deploy the ECS task definition.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Detailed Steps:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;To create an S3 bucket, you can use the AWS Console, the AWS CLI, or the AWS SDKs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To create an IAM role, you can use the AWS Console, the AWS CLI, or the AWS SDKs. The IAM role must have the following permissions: &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    - s3:ListBucket
    - s3:GetObject
    - s3:PutObject
&lt;/code&gt;&lt;/pre&gt;
&lt;/blockquote&gt;
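
&lt;p&gt;As a sketch, the policy document attached to that role might look like the following (the bucket name is a placeholder; note that s3:ListBucket applies to the bucket ARN while the object actions apply to the objects inside it):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;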

&lt;ul&gt;
&lt;li&gt;To create an ECS task definition, you can use the AWS Console, the AWS CLI, or the AWS SDKs. The task definition must specify the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    - The container image that you want to use.
    - The mount point for the S3 bucket.
    - The IAM role that you created in step 2.
&lt;/code&gt;&lt;/pre&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;To deploy the ECS task definition, you can use the AWS Console, the AWS CLI, or the AWS SDKs.&lt;/li&gt;
&lt;/ul&gt;
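
&lt;p&gt;For example, registering and running the task definition with the AWS CLI might look like this (the file, cluster, and family names are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ecs register-task-definition --cli-input-json file://task-definition.json
aws ecs run-task --cluster my-cluster --task-definition my-task-family
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;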

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here is an example of an ECS task definition that mounts an S3 bucket. This is a sketch for the EC2 launch type and assumes the rexray/s3fs Docker volume plugin is installed on your container instances (Docker volume drivers are not available on Fargate); the account ID, role name, and bucket name are placeholders:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "taskDefinition": {
    "family": "my-task-family",
    "containerDefinitions": [
      {
        "name": "my-container",
        "image": "my-container-image",
        "mountPoints": [
          {
            "containerPath": "/data",
            "sourceVolume": "my-s3-volume",
            "readOnly": true
          }
        ]
      }
    ],
    "volumes": [
      {
        "name": "my-s3-volume",
        "host": {
          "sourcePath": "/mnt/my-s3-bucket"
        },
        "dockerVolumeConfiguration": {
          "driver": "s3fs",
          "options": {
            "s3Url": "https://s3.amazonaws.com/my-bucket",
            "accessKeyId": "my-access-key-id",
            "secretAccessKey": "my-secret-access-key"
          }
        }
      }
    ]
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, the container my-container uses the image my-container-image and mounts the volume at the /data path. The volume is named after the S3 bucket because the rexray/s3fs driver maps Docker volume names to bucket names. The container's S3 credentials come from the taskRoleArn (the IAM role you created in step 2) rather than from hard-coded access keys, which should never be embedded in a task definition.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this blog post, we showed you how to mount an S3 bucket into an ECS container. This allows you to access the files in the S3 bucket from within the container.&lt;/p&gt;

&lt;p&gt;I hope this helps!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>containers</category>
      <category>docker</category>
      <category>s3</category>
    </item>
    <item>
      <title>Inserting test data into DynamoDB local</title>
      <dc:creator>Bira</dc:creator>
      <pubDate>Fri, 18 Aug 2023 03:18:49 +0000</pubDate>
      <link>https://dev.to/bthiban/inserting-test-data-into-dynamodb-local-5mn</link>
      <guid>https://dev.to/bthiban/inserting-test-data-into-dynamodb-local-5mn</guid>
      <description>&lt;p&gt;DynamoDB is a fully managed NoSQL database service that offers high performance, scalability, and durability. It is a popular choice for storing large amounts of data that need to be accessed quickly.&lt;/p&gt;

&lt;p&gt;One way to insert data into DynamoDB is to use a CSV file. This can be a convenient way to load data into DynamoDB, especially if you have a large amount of data to load.&lt;/p&gt;

&lt;p&gt;In this blog post, we will show you how to insert test data into DynamoDB local using Python and CSV.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before you begin, you will need the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;A Python development environment
The boto3 library
A CSV file with your test data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Step 1: Create a DynamoDB Local instance
&lt;/h2&gt;

&lt;p&gt;To start, you need to create a DynamoDB Local instance. This can be done by following the instructions in the DynamoDB Local documentation: &lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html"&gt;https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html&lt;/a&gt;.&lt;/p&gt;
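
&lt;p&gt;One quick way to do this, assuming you have Docker installed, is to run the official DynamoDB Local image, which listens on port 8000:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 8000:8000 amazon/dynamodb-local
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;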
&lt;h2&gt;
  
  
  Step 2: Create a DynamoDB table
&lt;/h2&gt;

&lt;p&gt;Once you have created a DynamoDB Local instance, you need to create a DynamoDB table. This can be done with DynamoDB Admin (&lt;a href="https://www.npmjs.com/package/dynamodb-admin"&gt;https://www.npmjs.com/package/dynamodb-admin&lt;/a&gt;) or NoSQL Workbench (&lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/workbench.html"&gt;https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/workbench.html&lt;/a&gt;).&lt;/p&gt;
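
&lt;p&gt;Alternatively, you can create the table from the AWS CLI. The sketch below assumes a table named test_table with a string partition key called id; adjust the names to match your CSV data:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws dynamodb create-table \
    --table-name test_table \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST \
    --endpoint-url http://localhost:8000
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;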
&lt;h2&gt;
  
  
  Step 3: Generate data using Excel
&lt;/h2&gt;

&lt;p&gt;You can easily generate the required data in Excel or Google Sheets, download it as a CSV file, and then import it into DynamoDB.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--amvj45SJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/exjnseg83i539ppsuebi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--amvj45SJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/exjnseg83i539ppsuebi.png" alt="Creating dummy data using excel" width="800" height="297"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Step 4: Load the CSV file into DynamoDB
&lt;/h2&gt;

&lt;p&gt;Now that you have created a DynamoDB table, you can load the CSV file into DynamoDB. The following code loads the CSV file csv/user.csv into the test_table table:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import csv

# Create a DynamoDB client
dynamodb = boto3.resource('dynamodb', endpoint_url='http://localhost:8000')

# Open the CSV file
with open('./csv/user.csv', 'r') as csvfile:

    # Create a reader object
    reader = csv.reader(csvfile, delimiter=',')

    # Get the header
    header = next(reader, None)

    # Iterate over the rows in the CSV file
    for row in reader:

        # Create a dictionary to represent the item
        item = {}
        for i, column in enumerate(header):
            item[column] = row[i]
            print(column, row[i])

        # Put the item in the DynamoDB table
        dynamodb.Table('test_table').put_item(Item=item)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 5: Verify the data in DynamoDB
&lt;/h2&gt;

&lt;p&gt;Once you have loaded the CSV file into DynamoDB, you can verify the data by querying the test_table table using the tools mentioned in Step 2.&lt;/p&gt;
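
&lt;p&gt;A quick check is also possible from the command line by scanning the table against the local endpoint:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws dynamodb scan --table-name test_table --endpoint-url http://localhost:8000
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;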

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xpNIx-ZJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lz9qrt6e5g32j8mwumna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xpNIx-ZJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lz9qrt6e5g32j8mwumna.png" alt="Viewing from DynamoDB Admin" width="800" height="251"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this blog post, we showed you how to insert test data into DynamoDB local using Python and CSV. We hope this helps you get started with DynamoDB.&lt;/p&gt;

&lt;p&gt;For more information on DynamoDB, please visit the DynamoDB documentation: &lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/"&gt;https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>dynamodb</category>
      <category>python</category>
      <category>docker</category>
    </item>
  </channel>
</rss>
