Muhammed Ashraf

Parsing & Loading Data from S3 to DynamoDB with Lambda Function

Many scenarios require you to work with JSON-formatted data that you want to extract, process, and then save into a table for future use.

In this article, we will discuss loading JSON-formatted data from an S3 bucket into a DynamoDB table using a Lambda function.

Prerequisites

  1. An IAM user with permissions to upload objects to S3
  2. A Lambda execution role with permissions to read from S3 and write to DynamoDB (a sample policy sketch follows)
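
As a minimal sketch, the execution role's inline policy could look something like this (the bucket name and table name are placeholders; in practice you would also attach the AWS-managed AWSLambdaBasicExecutionRole policy so the function can write CloudWatch logs):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-unique-bucket-name/*"
    },
    {
      "Effect": "Allow",
      "Action": "dynamodb:PutItem",
      "Resource": "arn:aws:dynamodb:*:*:table/DemoTable"
    }
  ]
}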

Architecture & Components

The architecture below uses three AWS services:

  1. S3 bucket
  2. Lambda Function
  3. DynamoDB Table

(Architecture diagram: S3 bucket → Lambda function → DynamoDB table)

A brief description of each service as a refresher:

  • S3 Bucket: Scalable, secure, high-performance object storage; it will serve as the storage service for our data.
  • Lambda Function: Serverless compute service that lets you run code without worrying about infrastructure. It is easy to set up and supports many programming languages; we will use it to run our code and deploy our logic.
  • DynamoDB: Serverless NoSQL database that stores data in tables; we will use it to store the data processed by the Lambda function.

Flow

  1. A user uploads a JSON file to the S3 bucket through the console or CLI, which calls the PutObject API behind the scenes.
  2. Once the object is uploaded successfully, an S3 event is triggered to invoke the Lambda function, which loads and processes the file (see the event sketch after this list).
  3. Lambda processes the data and loads it into the DynamoDB table.
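
For reference, the S3 event that invokes the function looks roughly like this, trimmed to the fields our code actually reads (the bucket and key values here are placeholders):

{
  "Records": [
    {
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "your-unique-bucket-name" },
        "object": { "key": "users.json" }
      }
    }
  ]
}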

Implementation Steps

We will walk through the steps and configuration needed to deploy the architecture above.

1- Create a Lambda function with the below configuration

Author from scratch
Function name: ParserDemo
Runtime: Python 3.1x

Leave the rest as default.
After the Lambda function is created, you will need to modify the timeout configuration and execution role as below:

(Screenshots: Lambda timeout configuration and execution role settings)
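
If you prefer the CLI, the timeout can also be adjusted with a command along these lines (30 seconds is an arbitrary example value):

aws lambda update-function-configuration \
    --function-name ParserDemo \
    --timeout 30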

I wrote the following Python code to perform the logic:

import json
import boto3

s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):
    # Get the bucket name and object key from the S3 event that triggered the function
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    object_key = event['Records'][0]['s3']['object']['key']
    print(f"Bucket: {bucket_name}, Key: {object_key}")

    response = s3_client.get_object(
        Bucket=bucket_name,
        Key=object_key
    )

    # Read the streamed body into bytes, decode it into a string, then parse the JSON
    json_data = response['Body'].read()
    string_formatted = json_data.decode('utf-8')
    dict_format_data = json.loads(string_formatted)

    # Insert the data into DynamoDB
    table = dynamodb.Table('DemoTable')
    if isinstance(dict_format_data, list):  # the file contains multiple records
        for record in dict_format_data:
            table.put_item(Item=record)
    elif isinstance(dict_format_data, dict):  # the file contains a single record
        table.put_item(Item=dict_format_data)
    else:
        raise ValueError("Unsupported format")  # raise an error if nothing matched

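The function accepts either a single JSON object or a list of objects. A hypothetical input file might look like the following (the field names besides UserId are just examples; the only real requirement is that each record includes the table's partition key, UserId, which we define in step 3):

[
  { "UserId": "1", "Name": "Jane", "City": "London" },
  { "UserId": "2", "Name": "Omar", "City": "Cairo" }
]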

2- Create an S3 bucket

BucketName: use a unique name

Leave the rest of the configuration as default.
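
The bucket can also be created from the CLI (the name below is a placeholder; S3 bucket names must be globally unique):

aws s3 mb s3://your-unique-bucket-name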

Add the created S3 bucket as a trigger to the Lambda function as below:

(Screenshots: adding the S3 trigger to the Lambda function)
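
The console handles this wiring for you; a rough CLI equivalent involves granting S3 permission to invoke the function and then attaching the notification (the bucket name, account ID, and region in the ARNs are placeholders):

aws lambda add-permission \
    --function-name ParserDemo \
    --statement-id s3-invoke \
    --action lambda:InvokeFunction \
    --principal s3.amazonaws.com \
    --source-arn arn:aws:s3:::your-unique-bucket-name

aws s3api put-bucket-notification-configuration \
    --bucket your-unique-bucket-name \
    --notification-configuration '{
      "LambdaFunctionConfigurations": [{
        "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:ParserDemo",
        "Events": ["s3:ObjectCreated:*"]
      }]
    }'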

3- Create a table in DynamoDB with the below configuration

Table Name: DemoTable
Partition Key: UserId
Table Settings: Customized
Capacity Mode: Provisioned

To save costs, configure the provisioned read/write capacity units with a low value (1 or 2 units).

(Screenshots: DynamoDB table creation and provisioned capacity settings)
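
For completeness, an equivalent table could be created from the CLI like this (assuming UserId is a string):

aws dynamodb create-table \
    --table-name DemoTable \
    --attribute-definitions AttributeName=UserId,AttributeType=S \
    --key-schema AttributeName=UserId,KeyType=HASH \
    --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1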

Now the setup is ready. You can test it by uploading a file to the S3 bucket; you will then find items created in the DynamoDB table for the records in the uploaded file.
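
For example, assuming a local file named users.json like the sample shown earlier, a quick test from the CLI could be:

aws s3 cp users.json s3://your-unique-bucket-name/
aws dynamodb scan --table-name DemoTable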

CloudWatch Logs for Lambda Function

(Screenshot: CloudWatch logs showing the Lambda invocation)

DynamoDB Items

(Screenshot: items created in the DynamoDB table)

I hope you found this interesting. Please let me know if you have any comments.

References

S3 API
DynamoDB API
boto3 practice for AWS services
