
Paul SANTUS for AWS Community Builders


Deploy a simple data storage API with very little code using Amazon API Gateway and DynamoDB

The AWS serverless ecosystem includes multiple services that architects and developers can leverage to build microservices applications: storage (S3, DynamoDB), integration components (API Gateway, EventBridge, SQS…) and, of course, the ability to execute custom code in Lambda functions.

But in some cases, you might not even need a Lambda function to process queries! In this blog post, I’ll demonstrate how to use Amazon API Gateway’s direct integrations to cut costs (and reduce the amount of glue code you need to write) while still transforming data between its storage format and its presentation format. You’ll find a fully functional code sample at the end of this blog post.

A real-life use case

This blog post actually comes from a real-life use case. In France, as of this year, energy suppliers are legally required to offer customers in energy poverty a device that provides real-time access to the previous 24 hours of power measurements, sampled every 5 seconds.

To energy suppliers, this was quite a shock: power metering used to be the domain of Distribution System Operators, and IoT skills were scarce, or even non-existent. We were faced with the obligation to ingest a massive amount of data (one sample every 5 seconds is 17,280 samples per day, or about 6.3 million per year, per meter), most of which has essentially zero value (most of it will never be read by any customer). And at our own cost.

When faced with such requirements, FinOps becomes key to designing your system. I came up with a design that relies on API Gateway to receive the data and DynamoDB to store it (and flush it, thanks to DynamoDB’s Time to Live (TTL) feature).
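To give you the flavor of it: a request mapping template can write each incoming measurement with an expiry timestamp, so that DynamoDB deletes it automatically after 24 hours. The sketch below is illustrative only; the table name, attribute names and single-measurement payload shape are assumptions, not the actual proprietary design:

```
## Hypothetical Integration Request template for a PutItem call.
## Table and attribute names are examples; the real design differs.
#set($ttl = $context.requestTimeEpoch / 1000 + 86400) ## now + 24h, in epoch seconds
{
  "TableName": "Measurements",
  "Item": {
    "meterId":   { "S": "$input.params('meterId')" },
    "timestamp": { "N": "$input.path('$.timestamp')" },
    "watts":     { "N": "$input.path('$.watts')" },
    "expiresAt": { "N": "$ttl" } ## attribute declared as the table's TTL attribute
  }
}
```

(The integration itself is configured to call DynamoDB’s PutItem action; the template only shapes the request body.)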

I won’t go into the specifics of that design, since it is proprietary to my company. Let’s just say that it is both cost-effective and - a nice quality for such a system - infinitely scalable with linear costs.

Instead, you’ll find below a fully functional example of a “Pets API” (create pets, then retrieve them by name, gender, breed, etc.)

Direct integration, the VTL magic

Another quality of the system is that, even though we process the incoming data, there is no Lambda function involved. The additional cost of Lambda would have made the total cost of ownership twice as high.

So, how do you perform data transformation between API Gateway and DynamoDB with no Lambda function? The answer is the Apache Velocity Template Language (VTL).

API Gateway processing works in 5 steps:

  1. The Method Request handles authentication and input validation,

  2. then the query is passed on to the backend service through an Integration Request,

  3. then the backend service performs its own logic,

  4. then the service response is sent back through the Integration Response component,

  5. and eventually a response is rendered to the client via the Method Response component.

Explanation of how API Gateway works, from AWS’s blog post on machine learning: https://aws.amazon.com/fr/blogs/machine-learning/creating-a-machine-learning-powered-rest-api-with-amazon-api-gateway-mapping-templates-and-amazon-sagemaker/

Both the Integration Request and Integration Response components allow you to perform quite advanced data transformations using the Apache Velocity Template Language: you can set variables, store data structures, loop over the elements of a list, and so on.
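On the response side, for instance, a template can flatten the typed attributes of a DynamoDB Query response into the plain JSON clients expect. Here is a minimal sketch; the attribute names are assumptions matching the Pets example below:

```
## Hypothetical Integration Response template: reshape a DynamoDB
## Query response into plain JSON. Attribute names are assumptions.
#set($items = $input.path('$.Items'))
{
  "pets": [
    #foreach($item in $items)
    {
      "owner": "$item.owner.S",
      "name": "$item.name.S"
      #if($item.gender.S), "gender": "$item.gender.S"#end
      #if($item.age.N), "age": $item.age.N#end
    }#if($foreach.hasNext),#end
    #end
  ]
}
```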

Below, you’ll find the request-side counterpart: the template that transforms the body of the HTTP POST call that creates pets.
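This is a minimal sketch of such a template; the table name (`Pets`) and the handling of optional attributes are simplified assumptions:

```
## Hypothetical request mapping template for a BatchWriteItem call.
## Table name and attribute handling are simplified for this example.
#set($pets = $input.path('$')) ## the request body, parsed as a list
{
  "RequestItems": {
    "Pets": [
      #foreach($pet in $pets)
      {
        "PutRequest": {
          "Item": {
            "owner": { "S": "$pet.owner" },
            "name":  { "S": "$pet.name" }
            #if($pet.gender), "gender": { "S": "$pet.gender" }#end
            #if($pet.age), "age": { "N": "$pet.age" }#end
          }
        }
      }#if($foreach.hasNext),#end
      #end
    ]
  }
}
```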


This transforms an input like `[{"owner":"Paul", "name":"milou"},{"owner":"Peter", "name":"rex", "gender":"M", "age":13}]` into the body that the DynamoDB API’s BatchWriteItem operation expects.
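For that sample input, the rendered request body would look like this:

```json
{
  "RequestItems": {
    "Pets": [
      { "PutRequest": { "Item": {
        "owner": { "S": "Paul" }, "name": { "S": "milou" } } } },
      { "PutRequest": { "Item": {
        "owner": { "S": "Peter" }, "name": { "S": "rex" },
        "gender": { "S": "M" }, "age": { "N": "13" } } } }
    ]
  }
}
```

Note that DynamoDB expects numbers to be serialized as strings inside the `"N"` wrapper, which is exactly what the template produces.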

A fully functional example

In my GitHub repository, you will find the Terraform code that will enable you to deploy a fully functional “Pets API” in a matter of seconds. In particular, you might be interested in the VTL mapping templates and the way they are attached to the API Gateway methods.

The full AWS documentation on performing data transformations with VTL is available in the Amazon API Gateway mapping template reference.
