Gilbert Chris
Building a Scalable Backend with DynamoDB Streams — My Journey Before the AWS User Group Pwani Talk

When I agreed to give an online presentation to the AWS User Group Pwani, I knew I didn’t want to show slides full of theory. I wanted something real. Something I could build, demo, and explain — a project that actually does something.

So I decided to create a small but scalable backend system using:

API Gateway

AWS Lambda

DynamoDB

DynamoDB Streams

*Architecture: API Gateway → Lambda → DynamoDB → DynamoDB Streams → Lambda*

Everything worked perfectly during the live session (thank God 😅),
but the journey of building it before the talk…
that’s where the real learning happened.

This blog is about that journey — what I built, what surprised me, and the gems I picked up along the way.

Let’s dive in.

🌱 Why This Project?

Before the presentation, I wanted to answer one question:

“How can you build a backend that can scale to millions of users without managing servers?”

AWS has many ways to answer that question, but DynamoDB Streams stood out because:

they’re simple but powerful

they’re event-driven

they let your backend react to data changes in real time

If you’ve ever wanted to track changes in your database — who registered, who updated their profile, who deleted their account — Streams are the perfect tool.

🧠 DynamoDB Streams — Explained Like You're 10

Think of your DynamoDB table as a notebook.

Every time someone writes in it:

adds a new line

edits a line

erases a line

DynamoDB Streams captures that event like a security camera.

Then it whispers to your Lambda function:

“Hey, someone added a new user!”

“Hey, someone updated their email!”

“Hey, someone deleted an item!”

Your Lambda can react instantly.

That’s the magic of event-driven architecture.
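Concretely, each "whisper" arrives as a record carrying the changed item in DynamoDB's typed attribute format (`{ S: "..." }`, `{ N: "..." }`). Here's a sketch with a made-up INSERT record and a tiny unmarshaller for the string/number cases, just to show the shape:

```javascript
// A trimmed-down INSERT record, shaped like what DynamoDB Streams
// delivers to Lambda (field values here are made up for illustration).
const record = {
  eventName: "INSERT",
  dynamodb: {
    Keys: { username: { S: "al" } },
    NewImage: {
      username: { S: "al" },
      name: { S: "Alice" },
      email: { S: "alice@gmail.com" },
    },
  },
};

// Stream images use DynamoDB's typed format, so each attribute is
// wrapped in a type descriptor. This minimal helper unwraps the
// string (S) and number (N) cases used above.
function unmarshall(image) {
  const out = {};
  for (const [key, value] of Object.entries(image)) {
    if ("S" in value) out[key] = value.S;
    else if ("N" in value) out[key] = Number(value.N);
  }
  return out;
}

const user = unmarshall(record.dynamodb.NewImage);
console.log(user.name); // "Alice"
```

In a real function you'd typically use `unmarshall` from `@aws-sdk/util-dynamodb` instead of hand-rolling this, but the idea is the same.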

🛠️ Step 1: Building the Backend API

I started with the essentials:

✔ API Gateway endpoint → /register
✔ Lambda function → validates & stores user data
✔ DynamoDB table → Users
✔ Duplicate checks → no repeated emails/usernames

When a user registers, the API saves:

```json
{
  "name": "Alice",
  "email": "alice@gmail.com",
  "username": "al"
}
```

and returns:

```json
{ "message": "User registered successfully" }
```

Simple. Clean. Serverless.
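Most of the register Lambda is glue around DynamoDB's `PutItem`; the part worth sketching is the validation step. A minimal version (the function name and rules here are my own illustration, not the exact code from the talk):

```javascript
// Hypothetical validation step the /register Lambda runs before
// writing to the Users table. The rules are illustrative.
function validateUser(body) {
  const errors = [];
  if (!body.name || body.name.trim() === "") {
    errors.push("name is required");
  }
  // Loose email shape check: something@something.something
  if (!body.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(body.email)) {
    errors.push("email is invalid");
  }
  if (!body.username || body.username.length < 2) {
    errors.push("username must be at least 2 characters");
  }
  return { valid: errors.length === 0, errors };
}
```

For the duplicate check, rather than doing a separate read, the `PutItem` call itself can enforce uniqueness atomically with a condition expression like `attribute_not_exists(username)` — the write fails if the key already exists.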

🔄 Step 2: Turning on DynamoDB Streams

Next, I enabled Streams on the Users table with:

Stream view type: NEW_AND_OLD_IMAGES

This means:

for INSERT → the stream records the new item

for MODIFY → it records both the old and new images

for REMOVE (a delete) → it records the old item

Then I connected the Stream to a new Lambda:

📌 LogUserChangesFunction

This Lambda would “listen” to any change on the table.
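If you prefer the CLI to the console, the same two steps look roughly like this (table and function names taken from above; the stream ARN placeholder is yours to fill in):

```shell
# Enable Streams on the Users table with old + new images
aws dynamodb update-table \
  --table-name Users \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES

# Wire the stream to the processor Lambda
aws lambda create-event-source-mapping \
  --function-name LogUserChangesFunction \
  --event-source-arn <your-stream-arn> \
  --starting-position LATEST
```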

⚙️ Step 3: Building the Stream Processor Lambda

Here’s the Lambda handler:

```javascript
export const handler = async (event) => {
  console.log("=== DynamoDB Stream Event Received ===");
  for (const record of event.Records) {
    const type = record.eventName;
    try {
      if (type === "INSERT") {
        console.log("USER ADDED:", JSON.stringify(record.dynamodb.NewImage));
      } else if (type === "MODIFY") {
        console.log("USER UPDATED - OLD:", JSON.stringify(record.dynamodb.OldImage));
        console.log("USER UPDATED - NEW:", JSON.stringify(record.dynamodb.NewImage));
      } else if (type === "REMOVE") {
        console.log("USER DELETED:", JSON.stringify(record.dynamodb.OldImage));
      } else {
        console.log("UNKNOWN EVENT:", type, record);
      }
    } catch (err) {
      console.error("Error processing record:", err, record);
      throw err;
    }
  }
  return { statusCode: 200 };
};
```

This tiny function became my “activity logger” for the entire database.

🧪 Step 4: Testing — Where the Learning Really Happened

This is where the real story lives.

📌 Mistake 1: Expecting Streams to Show Old Data

I added a new user named Alice before I turned on Streams.

Later, when I checked logs…
no event.

Because:

☑️ Streams only capture events that happen after you enable them.
❌ They do NOT show past historical changes.

📌 Mistake 2: Updating an Item Is Not an Insert

When I updated Eve’s username from toto → kichwa, the Stream fired a MODIFY event — not a second INSERT — and the logs showed:

```
USER UPDATED - OLD: ...
USER UPDATED - NEW: ...
```

Perfect: the old and new images, side by side.

📌 Mistake 3: IAM Is the Silent Gatekeeper

At first, I got errors like:

“Cannot access stream. Ensure GetRecords, ListStreams permissions…”

The stream itself keeps capturing changes, but Lambda can’t read from it until IAM is correct.

Give Lambda:

```
dynamodb:GetRecords
dynamodb:GetShardIterator
dynamodb:DescribeStream
dynamodb:ListStreams
```

After fixing that, everything worked flawlessly.
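In policy form, these are the same stream-reading actions AWS bundles into the managed AWSLambdaDynamoDBExecutionRole policy. An inline version scoped to one table's streams might look like this (region, account ID, and ARN pattern are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:ListStreams"
      ],
      "Resource": "arn:aws:dynamodb:REGION:ACCOUNT_ID:table/Users/stream/*"
    }
  ]
}
```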

📈 Why This Architecture Is Actually Scalable

This simple system becomes extremely powerful because each part scales automatically:

1️⃣ API Gateway

Handles thousands of requests per second.

2️⃣ Lambda

Scales by spawning new instances based on traffic.

3️⃣ DynamoDB

Scales horizontally and supports millions of reads/writes per second.

4️⃣ Streams

Process events asynchronously without slowing down the main API.

Together, they form a backend that can grow from 10 users to 10 million without refactoring.

No servers.
No patching.
No worrying about capacity.

Just pure scalability.

🧩 What I Learned (The Real Takeaways)
✔ Serverless is not about writing less code

It’s about letting the platform handle scaling.

✔ Real scalability comes from loose coupling

Streams turn your system into Lego blocks that talk to each other.

✔ IAM is everything

You can write perfect code and still get blocked by one missing permission.

✔ Logs are your best debugging tool

Every DynamoDB event tells a story.

✔ Event-driven architecture is addictive

Once you start using Streams… you start seeing events everywhere.

🎤 Final Thoughts

When I finally gave the presentation to AWS User Group Pwani, everything worked smoothly — no errors, no surprises, just a clean demo.

But the real win was everything I learned before that moment.

I didn’t just build an API.
I built a scalable, event-driven backend using real AWS production tools.

More importantly:

I gained a deeper understanding of how modern backends should behave — reactive, decoupled, scalable, and effortless to maintain.

If you're curious about serverless, DynamoDB Streams is one of the best ways to start.
Simple idea → big power.

And who knows — your next small experiment might become your next talk or your next startup.
