Radzion Chachura

Posted on • Originally published at radzion.com

Populate DynamoDB with BatchWriteItem & TypeScript

Watch on YouTube

Let's populate a DynamoDB table using BatchWriteItem.

I have a list of coupon codes for my product, and I want to move them into a DynamoDB table.

import uuid from "uuid/v4"

import { tableName } from "../src/shared/db/tableName"
import { documentClient } from "../src/shared/db"

// BatchWriteItem accepts at most 25 put/delete requests per call
const batchWriteItemMax = 25

// number of coupon codes to generate
const appSumoCodesNumber = 10000

const populateAppSumoCodes = async () => {
  const codes = Array.from({ length: appSumoCodesNumber }, () => uuid())
  const items = codes.map((id) => ({ id }))

  const itemsBatchesNumber = Math.ceil(items.length / batchWriteItemMax)
  await Promise.all(
    Array.from({ length: itemsBatchesNumber }, (_, i) => i).map(
      async (batchIndex) => {
        const batchItems = items.slice(
          batchWriteItemMax * batchIndex,
          batchWriteItemMax * (batchIndex + 1)
        )

        await documentClient
          .batchWrite({
            RequestItems: {
              [tableName.appSumoCodes]: batchItems.map((Item) => ({
                PutRequest: {
                  Item,
                },
              })),
            },
          })
          .promise()
      }
    )
  )
}

populateAppSumoCodes()

First, I generate the codes with uuid and convert them into a list of items, each with an id attribute.

We can't just upload all of them in one request: BatchWriteItem accepts a maximum of 25 items per call.

We need to calculate how many batches we'll have. For example, if there are 60 items, we'll insert 3 batches: 25, 25, and 10.
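To make the arithmetic concrete, here is the same Math.ceil calculation with the constants from the script (the batchesFor helper name is mine, not from the post):

```typescript
// BatchWriteItem caps each call at 25 requests, so the batch count
// is the item count divided by 25, rounded up.
const batchWriteItemMax = 25

const batchesFor = (itemCount: number) =>
  Math.ceil(itemCount / batchWriteItemMax)

console.log(batchesFor(60)) // 3 batches: 25 + 25 + 10
console.log(batchesFor(10000)) // 400 batches
```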

After that, we create an array of batch indices with the length of itemsBatchesNumber and iterate over it with Promise.all, so all batches are written concurrently.

To populate the table, we slice each batch from the list of items and pass it to the batchWrite method of the DynamoDB documentClient.
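One caveat the script above ignores: even a successful batchWrite call can return UnprocessedItems when the table is throttled, and those items are silently dropped. A minimal sketch of a retry loop, assuming an AWS SDK v2 DocumentClient-style client; the helper name and retry limit are my assumptions, not from the post:

```typescript
// Sketch: re-submit UnprocessedItems until the batch fully lands.
// `BatchWriteClient` mimics the AWS SDK v2 DocumentClient.batchWrite shape.
type BatchWriteResult = { UnprocessedItems?: Record<string, unknown[]> }
type BatchWriteClient = {
  batchWrite: (params: object) => { promise: () => Promise<BatchWriteResult> }
}

const putBatchWithRetry = async (
  client: BatchWriteClient,
  table: string,
  items: object[],
  maxAttempts = 3
) => {
  let requests: Record<string, unknown[]> = {
    [table]: items.map((Item) => ({ PutRequest: { Item } })),
  }
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { UnprocessedItems } = await client
      .batchWrite({ RequestItems: requests })
      .promise()
    if (!UnprocessedItems || Object.keys(UnprocessedItems).length === 0) {
      return // everything was written
    }
    requests = UnprocessedItems // retry only the rejected requests
  }
  throw new Error("batch not fully written after retries")
}
```

In production you'd also back off between attempts, since immediate retries against a throttled table tend to be rejected again.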

Now let's set environment variables and run the migration with ts-node.
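The invocation might look like this; the script path and region are assumptions, and the credentials are placeholders to fill in with your own:

```shell
# Hypothetical path and region; AWS credentials elided.
AWS_REGION=eu-central-1 \
AWS_ACCESS_KEY_ID=... \
AWS_SECRET_ACCESS_KEY=... \
npx ts-node scripts/populateAppSumoCodes.ts
```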
