DEV Community

Pedram Hamidehkhan

Basic serverless CRUD Application with CDK for Terraform

In this post, we will build a basic serverless CRUD application on AWS using CDK for Terraform (CDKTF). In a previous post we created a simple hello world application; in this one, we persist the data in DynamoDB and add CRUD functionality to the application. Additionally, we add some security to our API.

First things first, let's initialize a CDKTF project. Similar to the AWS CDK for CloudFormation, you can run the following command to initialize the project:

cdktf init

It will then ask you for the language of your choice and a name and description for your project. I opted for TypeScript, but you can use whatever language you are comfortable with.
The "cdktf init" command creates the project structure for us. As you might know, Terraform has a very large provider ecosystem, so to continue with the project you need to specify which provider you want to use. To do that, open cdktf.json and change the "terraformProviders" section to:

  "terraformProviders": [
    "aws@>3.0"
  ]

Of course, you can use the version of your choice. After that, you need to run the following command to pull down the providers:

cdktf get 

OR:

 npm run get 

Now we are ready to begin. In the main.ts file, add the following imports:

import { Construct } from "constructs";
import { App, AssetType, TerraformAsset, TerraformStack, TerraformOutput } from "cdktf";
import { lambdafunction, s3, apigateway, iam, AwsProvider, dynamodb } from "@cdktf/provider-aws";
import path = require("path");

It is worth mentioning that, unlike the CloudFormation CDK, you will get a compile error if you have unused imports or variables in your code.
Now add the following code to specify the provider:

new AwsProvider(this, "aws", {
  region: "eu-west-1"
});

You also need to specify where your Lambda function's code lives. I created a folder called "src" and used TerraformAsset to import it as follows:

const asset = new TerraformAsset(this, "asset", {
  path: path.resolve(__dirname,'./src'),
  type:AssetType.ARCHIVE
});

Now we need to create the S3 bucket and object to upload our Lambda archive:

const assetBucket = new s3.S3Bucket(this, "assetBucket", {
  bucket:"a-unique-bucket-name"
});

const lambdaArchive = new s3.S3BucketObject(this, "lambdaArchive", {
  bucket: assetBucket.bucket,
  key: asset.fileName,
  source: asset.path,
  sourceHash: asset.assetHash // to inform cdktf of changes in file
});

Now we can create our database:

const db = new dynamodb.DynamodbTable(this, "db", {
  name: "my-table",
  billingMode: "PAY_PER_REQUEST",
  hashKey: "id",
  attribute: [
    {
      name: "id",
      type: "S"
    }
  ]
});

Then create the IAM Policy and Role for the lambda function:

const lambPolicy = {
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "lambda.amazonaws.com"
        ]
      },
      "Action": "sts:AssumeRole"
    }
  ]
};

const role = new iam.IamRole(this, "role", {
  assumeRolePolicy: JSON.stringify(lambPolicy),
  name: "my-lambda-role"
});


new iam.IamRolePolicyAttachment(this, "rolePolicy", {
  policyArn: "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
  role: role.name
});

//we can use a managed policy to access our db
new iam.IamRolePolicyAttachment(this, "rolePolicyDB", {
  policyArn: "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
  role: role.name
});

After that, we can specify the configuration of our lambda function:

const lambdaFunc = new lambdafunction.LambdaFunction(this, "lambdaFunc", {
  functionName: "my-lambda-function",
  runtime: "nodejs14.x",
  handler: "index.handler",
  role: role.arn,
  s3Bucket: assetBucket.bucket,
  s3Key: lambdaArchive.key,
  sourceCodeHash: lambdaArchive.sourceHash,
  environment: {
    variables: {
      "TABLE_NAME": db.name,
      "PRIMARY_KEY": 'itemId',
    }
  }
});

I added the table name and primary key as environment variables. Note that PRIMARY_KEY must match the table's hashKey ("id"); otherwise DynamoDB will reject writes for missing the key attribute.

Then, we create an API Gateway to receive HTTP Requests from the Internet. We can also add the resources to our API and define HTTP Methods:

const restApi = new apigateway.ApiGatewayRestApi(this, "restApi", {
  name: "my-rest-api",
  description: "my-rest-api"
});

const resourceApi = new apigateway.ApiGatewayResource(this, "resourceApi", {
  restApiId: restApi.id,
  parentId: restApi.rootResourceId,
  pathPart: "my-resource",
});

const postApi = new apigateway.ApiGatewayMethod(this, "postApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "POST",
  authorization: "NONE"
});

const getApi = new apigateway.ApiGatewayMethod(this, "getApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "GET",
  authorization: "NONE"
});

const putApi = new apigateway.ApiGatewayMethod(this, "putApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "PUT",
  authorization: "NONE"
});

const delApi = new apigateway.ApiGatewayMethod(this, "delApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "DELETE",
  authorization: "NONE"
});

We also need to add API integrations to our methods as follows:

new apigateway.ApiGatewayIntegration(this, "apiIntegration", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: postApi.httpMethod,      
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration2", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: getApi.httpMethod,      
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration3", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: putApi.httpMethod,
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration4", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: delApi.httpMethod,
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

Note that integrationHttpMethod is always set to POST, because API Gateway always invokes Lambda functions with a POST request, regardless of the client-facing HTTP method.

Finally, you need to create a LambdaPermission Policy to allow the API Gateway to invoke our Lambda Function:

new lambdafunction.LambdaPermission(this, "apig-lambda", {
  statementId: "AllowExecutionFromAPIGateway",
  action: "lambda:InvokeFunction",
  functionName: lambdaFunc.functionName,
  principal: "apigateway.amazonaws.com",
  sourceArn: `${restApi.executionArn}/*/*`      
});

Now we have our API and Lambda Function Integrations. We can go ahead and deploy our API into a Stage. However, we should secure our API, because at this point, it will be reachable from the Internet.

So we create a deployment for our API and add a stage:

const apiDepl = new apigateway.ApiGatewayDeployment(this, "deployment", {
  restApiId: restApi.id,
  dependsOn: [lambdaFunc]
});

const apiStage = new apigateway.ApiGatewayStage(this, "stage", {
  restApiId : restApi.id,
  stageName: "test",
  deploymentId: apiDepl.id,
  dependsOn: [apiDepl]
});

I also added the "dependsOn" property to make sure certain resources will be created before others.

Now we can create an API Key for our API:

const apiKey = new apigateway.ApiGatewayApiKey(this, "apiKey", {
  name: "my-api-key",
  description: "my-api-key",
  enabled: true
});

After that, we need to create a usage plan and attach the api key to it. The Usage plan gives you a lot of control over how your API is used.

const usagePlan = new apigateway.ApiGatewayUsagePlan(this, "usagePlan", {
  name: "my-usage-plan",
  description: "my-usage-plan",
  throttleSettings: {
    burstLimit: 10,
    rateLimit: 10
  },
  apiStages: [
    {
      apiId: restApi.id,
      stage: apiStage.stageName
    }
  ],
  dependsOn: [apiKey]
});

new apigateway.ApiGatewayUsagePlanKey(this, "usagePlanKey", {
  keyId: apiKey.id,
  keyType: "API_KEY",
  usagePlanId: usagePlan.id,
  dependsOn:[usagePlan]
});

Note that I limited the rate and burst limits to keep costs under control.
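One caveat worth knowing: a usage plan key is only enforced on methods that declare apiKeyRequired. The methods above use authorization: "NONE" without it, so clients can still call the API without a key. A minimal sketch of a key-protected method, reusing the restApi and resourceApi constructs from above (this is my addition, not part of the original setup):

```typescript
// Sketch: a method that actually enforces the API key (apiKeyRequired maps to
// api_key_required on aws_api_gateway_method). A new deployment of the API is
// needed for the change to take effect.
new apigateway.ApiGatewayMethod(this, "securedPostApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "POST",
  authorization: "NONE",
  apiKeyRequired: true // requests without a valid x-api-key header get a 403
});
```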

If you would like to see the endpoint of your API and test it with curl or Postman, you can add the following output:

new TerraformOutput(this, "apiUrl", {
  value: apiStage.invokeUrl,
  description: "API URL"
});
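Once deployed, you can call the API URL from the output above with an x-api-key header. As a sketch (the URL and key below are placeholders, and buildRequest is just a hypothetical helper), a POST request from Node 18+ could be prepared like this:

```typescript
// Hypothetical helper that prepares an authenticated JSON request for the API.
// The x-api-key header is what API Gateway checks when a method requires a key.
function buildRequest(apiKey: string, item: Record<string, unknown>) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey, // value of the API key created above
    },
    body: JSON.stringify(item),
  };
}

// Usage with the global fetch available in Node 18+ (URL is a placeholder):
// await fetch("https://<api-id>.execute-api.eu-west-1.amazonaws.com/test/my-resource",
//             buildRequest("<your-api-key>", { name: "first item" }));
```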

Don't forget to add your Lambda function code in the "src" folder. Create a file called index.ts and add the functionality that you want; note that the nodejs14.x runtime executes JavaScript, so the TypeScript needs to be compiled to index.js inside "src" for the index.handler handler to resolve. My handler covers all four CRUD operations:

import * as AWS from 'aws-sdk';
import { v4 as uuidv4 } from 'uuid';

const db = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10', region: 'eu-west-1' });
const TABLE_NAME = process.env.TABLE_NAME || '';
const PRIMARY_KEY = process.env.PRIMARY_KEY || '';

export const handler = async (event: any = {}): Promise<any> => {
  if (event.httpMethod === 'POST') {
    try {
      if (!event.body) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter body' };
      }
      const item = typeof event.body === 'object' ? event.body : JSON.parse(event.body);
      item[PRIMARY_KEY] = uuidv4();
      const params = {
        TableName: TABLE_NAME,
        Item: item
      };

      await db.put(params).promise();
      // AWS_PROXY integrations require a response with statusCode and a string body
      return { statusCode: 201, body: JSON.stringify(item) };
    } catch (error) {
      return { statusCode: 500, body: String(error) };
    }
  }
  else if (event.httpMethod === 'GET') {
    //get all items
    try {
      const params = {
        TableName: TABLE_NAME
      };
      const resp = await db.scan(params).promise()
      return { statusCode: 200, body: JSON.stringify(resp.Items) };
    } catch (error) {
      return { statusCode: 500, body: String(error) };
    }
  }

  else if (event.httpMethod === 'PUT') {
    try {
      if (!event.body) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter body' };
      }
      const item = typeof event.body === 'object' ? event.body : JSON.parse(event.body);
      const params = {
        TableName: TABLE_NAME,
        Item: item
      };
      await db.put(params).promise();
      return { statusCode: 200, body: JSON.stringify(item) };
    } catch (error) {
      return { statusCode: 500, body: String(error) };
    }
  }
  else if (event.httpMethod === 'DELETE') {
    try {
      if (!event.pathParameters) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter pathParameters' };
      }
      const params = {
        TableName: TABLE_NAME,
        Key: {
          [PRIMARY_KEY]: event.pathParameters[PRIMARY_KEY]
        }
      };
      await db.delete(params).promise();
      return { statusCode: 200, body: '' };
    } catch (error) {
      return { statusCode: 500, body: String(error) };
    }
  }
};

I will explain this Lambda function in more detail in a separate post. One caveat: the DELETE branch reads the key from event.pathParameters, which would require a child resource such as my-resource/{id} in API Gateway; with the single my-resource path defined above, that parameter will never be populated.
You could create this Lambda function without packaging it, but I like to package it before deploying to make sure I have all the dependencies I need. So I create a file called "package.json" and add the following:

{
    "name": "lambda-crud-dynamodb",
    "version": "1.0.0",
    "description": "Lambdas to do CRUD operations on DynamoDB",
    "private": true,
    "license": "MIT",
    "devDependencies": {
      "@types/node": "*",
      "@types/uuid": "*"
    },
    "dependencies": {
      "aws-sdk": "*",
      "uuid": "*"
    }
}

Deploying this code will feel familiar if you have used the CloudFormation CDK or Terraform HCL.
You can run the following commands to deploy the project:

npm run build
cdktf synth   # generate the Terraform configuration; cdktf diff is the equivalent of terraform plan
cdktf deploy  # the equivalent of terraform apply

The YouTube video: https://youtu.be/sf2EgmHRHiw

Github: https://github.com/pedramha/cdktf-aws-example
