
Introduction to AWS Step Functions Using Terraform as Infrastructure-as-Code Tool

Demo in Spanish


Introduction

Microservices and serverless architectures are booming in the current information technology landscape. However, coordinating and handling errors among these services can be a challenge. This is where AWS Step Functions come into play, as they provide a robust workflow tool to orchestrate microservices components and handle workflows in AWS.

In this article, we will introduce the concept of AWS Step Functions, explaining their usefulness and advantages in creating cloud-based applications. Then, as a practical case, we will show how to implement and manage AWS Step Functions using Terraform, a very popular infrastructure-as-code tool. The purpose of this article is to provide a clear understanding of AWS Step Functions and their implementation through Terraform, so you can make the most of this service and enhance the efficiency and robustness of your applications.

What are AWS Step Functions

AWS Step Functions is a fully managed workflow orchestration service that makes it easy to coordinate the components of distributed applications. The service lets developers visually design workflows, called state machines, which coordinate the components of their applications using patterns such as sequences, branching, and parallel execution.

Features of AWS Step Functions

State Management

AWS Step Functions keeps track of the state of each workflow execution, maintaining its status and data at every step.

Retries and Error Handling

Provides automatic error handling and retries.

Visualization

Offers a graphical interface to visualize and modify workflows.

Compatibility

AWS Step Functions can interact with almost all other AWS services.

Programming

Coordination and conditional logic are defined declaratively in the workflow itself, instead of being implemented in application code.

Advantages of Using AWS Step Functions

Resilience

AWS Step Functions has built-in error handling capabilities, making workflows resilient to failures.

Scalability

As a managed service, AWS Step Functions can scale as needed to execute workflows.

Reduced Coding

AWS Step Functions eliminates the need to write 'glue' code to coordinate microservices.

Easy to Monitor

AWS Step Functions is integrated with CloudWatch, allowing easy monitoring of workflows.

Disadvantages of Using AWS Step Functions

Cost

While AWS Step Functions can reduce the amount of code you need to write, it is not free. The cost can add up quickly for complicated or high-volume workflows.

Complexity

AWS Step Functions introduces a new level of complexity to the application, as the Step Functions service must now be managed and understood.

Time Limitations

Each execution of a step function has a maximum duration of one year. This may not be suitable for some long-term workflows.

Vendor Lock-in

By using AWS Step Functions, you are locking yourself into the AWS platform. If you ever need to migrate to another platform in the future, this could be a limiting factor.

What is Terraform

Terraform is an open-source Infrastructure as Code (IaC) tool created by HashiCorp. It allows developers to define and provide data center infrastructure using a declarative configuration language. This includes servers, storage, and networking across a variety of cloud service providers, including AWS.

Advantages of Using Terraform

Cloud-Agnostic Platform

Unlike provider-specific IaC tools like AWS CloudFormation, Terraform is cloud-agnostic, meaning it can work with any cloud service provider, including AWS, Google Cloud, Azure, and others.

Declarative Configuration Language

Terraform uses a declarative configuration language, meaning you specify what resources you want to create without having to detail the stages to create them.

State Management

Terraform maintains a state of your infrastructure and can determine what has changed since the last execution, allowing for efficient planning of changes.
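By default this state lives in a local terraform.tfstate file; teams usually move it to a remote backend so it can be shared and locked. A minimal sketch, assuming an existing S3 bucket (the bucket and key names here are purely illustrative):

terraform {
    backend "s3" {
        bucket = "my-terraform-state-bucket"            # illustrative; the bucket must already exist
        key    = "step-functions-demo/terraform.tfstate" # path of the state object inside the bucket
        region = "us-east-1"
    }
}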

Disadvantages of Using Terraform

Complexity

Although Terraform can be very powerful, it can also be complicated to configure and use correctly.

Documentation

Sometimes, the documentation for Terraform can be lacking or confusing, especially for more complex use cases.

Deployment Speed

Terraform may be slower to support new features of cloud services compared to provider-specific IaC tools.

Basic Example in Terraform "Hello World"

Here is a basic example of what a Terraform script would look like to create a single EC2 server in AWS. This is equivalent to a 'Hello World' in Terraform:

provider "aws" {
    region = "us-west-2"
}

resource "aws_instance" "example" {
    ami           = "ami-0c55b159cbfafe1f0"
    instance_type = "t2.micro"

    tags = {
        Name = "example-instance"
    }
}

This is a very simple script. Here's what's happening:

  • provider "aws": This specifies that we are going to use AWS as our provider for our resource. The region is specified within this block.
  • resource "aws_instance" "example": This defines a resource, in this case, an EC2 instance. "example" is an arbitrary name we give to this resource.
  • Inside the resource block, we specify the Amazon Machine Image (AMI) ID we want to use for our instance and the instance type. In this case, we are using an AMI for a basic Ubuntu instance and t2.micro for the instance type, which is the lowest cost option.
  • The tags block allows adding labels to the instance, in this case, simply giving the instance the name "example-instance".

To run this script, you must have Terraform installed and AWS credentials configured in your environment. Then you can initialize Terraform with terraform init, plan the execution with terraform plan, and finally apply the changes with terraform apply.
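As a small, optional extension of the same example, Terraform input variables let you parameterize values such as the instance type instead of hard-coding them. A minimal sketch (the variable name is illustrative):

variable "instance_type" {
    description = "EC2 instance type to launch"
    type        = string
    default     = "t2.micro"
}

resource "aws_instance" "example" {
    ami           = "ami-0c55b159cbfafe1f0"
    instance_type = var.instance_type

    tags = {
        Name = "example-instance"
    }
}

Running terraform apply -var="instance_type=t3.micro" would then override the default without touching the configuration.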

Architecture of the Example to Follow

In this section, we will look at a simple step flow that can be implemented with the AWS Step Functions service.

Figure: General architecture of the AWS Step Functions example

The flow starts with the execution of a lambda that creates a random number between 1 and 100. This number will be validated by the flow; if the number is even, the lambda "Even" will be executed, and if it is odd, the lambda "Odd" will be executed. After this, we end the flow.

To make the example functional, we must carry out the following steps:

  1. Create the Logic of the Lambda “Number Generator”: This lambda will simply generate a random number between 1 and 100 and then return that generated number as a response. This lambda will be developed in Node.js 18.x.
  2. Create the Logic of the Lambda “Even”: This lambda will receive the previously generated number as an input parameter and will print a message in the logs specifying that the number is even. Like the previous lambda, the logic will be developed in Node.js 18.x.
  3. Create the Logic of the Lambda “Odd”: This lambda will receive the previously generated number as an input parameter and will print a message in the logs specifying that the number is odd.
  4. Create the Base Code of the Terraform Project: We are going to create an initial Terraform project by importing the AWS provider.
  5. Generate a Packaging Mechanism for Each of the Lambdas: In order to upload each lambda's logic to AWS, it must be packaged as a .zip file. We will do this through Terraform.
  6. Create Infrastructure for the 3 Lambdas through Terraform: In the code base created in Terraform, we will add the infrastructure for the 3 lambdas mentioned above and associate the logic code of each of them.
  7. Create Infrastructure for the Step Function That We Will Use in the Terraform Project: In the code base created in Terraform, we will implement the necessary infrastructure resources to create the step function considering the flow outlined above.
  8. Create Infrastructure Components That Allow the Step Function to Execute the Lambdas: It is necessary to add a role to the step function so that it can execute the lambdas specified above.
  9. Test the Created Step Function!

Implementation of the Example

Now that we have explained the example that we are going to carry out with AWS Step Functions, let's implement each step.

Create the Logic of the Lambda "Number Generator"

For the lambda we're creating, we will make a folder where we will store its logic. In this folder, we will have the files package.json and main.js.

Figure: Initial folder structure for the Lambdas

The package.json file will contain the following:

{
    "name": "number-generator",
    "version": "1.0.0",
    "description": "Number generator lambda",
    "main": "main.js",
    "dependencies": {},
    "devDependencies": {
        "aws-sdk": "^2.1045.0"
    }
}

The main.js file will have the logic to generate a random number and return it, along with the validation of whether the number is even or not:

exports.handler = async event => {
    var number = Math.floor(Math.random() * 100) + 1;
    console.log(`Generated number is: ${number}`);
    return { number: number, is_even: number % 2 === 0 };
};

Create the Logic of the Lambda Even

For the logic of the lambda even, we will do the same as for the previous lambda, having a package.json and main.js file.

package.json:

{
    "name": "even number",
    "version": "1.0.0",
    "description": "Even number lambda",
    "main": "main.js",
    "dependencies": {},
    "devDependencies": {
        "aws-sdk": "^2.1045.0"
    }
}

main.js:

exports.handler = async event => {
    console.log(`My even number is: ${event.number}`);
    return { number: event.number };
};

Create the Logic of the Lambda Odd

Lastly, the files for the lambda Odd:

package.json:

{
    "name": "odd number",
    "version": "1.0.0",
    "description": "Odd number lambda",
    "main": "main.js",
    "dependencies": {},
    "devDependencies": {
        "aws-sdk": "^2.1045.0"
    }
}

main.js:

exports.handler = async event => {
    console.log(`My odd number is: ${event.number}`);
    return { number: event.number };
};

For creating the logic of the 3 lambdas, it's important to consider that:

  • The aws-sdk dependency was added; for this example, it won't be used but is included to illustrate how we can import AWS's own libraries to interact with the services.
  • It's crucial that in each lambda folder we run the command npm i, so that the dependencies we're adding to each of the lambdas can be installed.
  • The final folder structure should look similar to this:

Figure: Final folder structure for the Lambda logic

Creating the Base Code for the Terraform Project

Similar to the lambda logic, we need a folder to store our entire Terraform project that will help us create the infrastructure components necessary for the example to function. Initially, we'll have a folder named terraform and a file called main.tf:

Figure: Initial folder structure for the Terraform project

In the main.tf file, we define the providers we will use to create our infrastructure components. A provider in Terraform is a collection of resources we can leverage to configure and manage components:

  • aws: This provider supplies all the infrastructure components we can utilize from Amazon Web Services (AWS).
  • archive: This provider offers utilities for working with files. We will use it to package each of the Lambda functions we have been creating into a .zip file.

provider "aws" {
    version = "~> 3.0"
    region  = "us-east-1"
}

provider "archive" {}
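Note that declaring version inside a provider block, as above, still works but is deprecated in recent Terraform releases; provider requirements are normally declared in a terraform block instead. A minimal sketch of that optional variant:

terraform {
    required_version = ">= 1.0"

    required_providers {
        aws = {
            source  = "hashicorp/aws"
            version = "~> 3.0"
        }
        archive = {
            source = "hashicorp/archive"
        }
    }
}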

Now, we should initialize the project. For this, we execute the command terraform init:

Figure: Terraform project initialized

This way, we have our Terraform project's base ready for starting the creation of infrastructure components.

Generating a Packaging Mechanism for Each Lambda

The logic for each lambda must be packaged into a .zip file to upload it to AWS. For this, we'll use the archive provider.

For this, we'll create a new file in the Terraform project called lambda_resources.tf.

data "archive_file" "number_generator" {
    type        = "zip"
    source_dir  = "../lambdas/1_number_generator/"
    output_path = "../lambdas/dist/1_number_generator.zip"
}

data "archive_file" "even" {
    type        = "zip"
    source_dir  = "../lambdas/2_even/"
    output_path = "../lambdas/dist/2_even.zip"
}

data "archive_file" "odd" {
    type        = "zip"
    source_dir  = "../lambdas/3_odd/"
    output_path = "../lambdas/dist/3_odd.zip"
}

Each data block uses the archive_file data source (from the archive provider) to create a zip file from a source directory. Let's break down each line:

  • data "archive_file" "number_generator": This line defines a new archive_file data source called number_generator. The word data indicates that we are reading or generating data, rather than managing a piece of infrastructure.
  • type = "zip": This specifies the type of output file we want. In this case, we are creating a zip file.
  • source_dir = "../lambdas/1_number_generator/": This specifies the source directory we want to compress. All files and subdirectories within ../lambdas/1_number_generator/ will be included in the zip file, relative to that directory.
  • output_path = "../lambdas/dist/1_number_generator.zip": This specifies the path and name of the output file. The resulting zip file will be saved as ../lambdas/dist/1_number_generator.zip.

This type of operation is quite common in serverless applications like AWS Lambda, where you need to upload your function code in zip format to AWS. Therefore, this code block helps you prepare your Lambda function for deployment, packaging the function code into a zip file that can then be uploaded to AWS.
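As a side note, the archive_file data source also accepts an optional excludes argument if you want to keep files out of the package; a hedged sketch for the first lambda (the excluded file is just an example):

data "archive_file" "number_generator" {
    type        = "zip"
    source_dir  = "../lambdas/1_number_generator/"
    output_path = "../lambdas/dist/1_number_generator.zip"

    # Paths are relative to source_dir; excluding files the function does
    # not need at runtime keeps the deployment package smaller.
    excludes = ["package-lock.json"]
}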

To see the result of this, we run terraform apply. After this, we will see that the packaged lambdas have already been created:

Figure: Packaged Lambdas

Creating the Infrastructure for the 3 Lambdas Using Terraform

Now that the lambda logic is generated and ready to upload to AWS, we need to create the corresponding infrastructure in the Terraform project. For this, we need to:

  • Create an IAM role to be used by the lambdas, which is necessary to associate permissions with each of them.
  • Create the infrastructure resources for each lambda and link them to the previously created role.

IAM Role for the Lambdas

For this step, we will create a new file called iam.tf with the following content:

resource "aws_iam_role" "example_lambda_role" {
    name               = "example_lambda_role_for_numbers"
    assume_role_policy = <<EOF
    {
    "Version": "2012-10-17",
    "Statement": [
        {
        "Action": "sts:AssumeRole",
        "Principal": {
            "Service": "lambda.amazonaws.com"
        },
        "Effect": "Allow",
        "Sid": ""
        }
    ]
}
    EOF
}

In this code segment, we have:

  • resource aws_iam_role" "example_lambda_role: This line is declaring a Terraform resource of type aws_iam_role (an IAM role in AWS) with the local name example_lambda_role. This local name is what you would use within your Terraform configuration to refer to this resource.
  • name = example_lambda_role_for_numbers: This is the name the IAM role will have in AWS.
  • assume_role_policy = EOF: This is a policy document that defines which entities are allowed to assume the role. In this case, AWS's Lambda service.

The policy itself is a JSON document containing a list of statements, and each statement defines a rule. In this case, there's a single statement:

  • Action: sts:AssumeRole: This action allows entities to assume the role.
  • Principal: Service: lambda.amazonaws.com: This is the entity that is allowed to assume the role. In this case, it's the AWS Lambda service.
  • Effect: Allow: This is the decision of the policy. In this case, it's allowing (Allow) the action.
  • Sid: This is the Statement ID of the policy. In this case, it's empty, but you can use it to give a unique identifier to each statement.

This code is essentially creating an IAM role that allows AWS Lambda functions to assume this role. This is a common pattern in AWS when you want to allow your Lambda functions to interact with other AWS services.
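One detail worth noting: the role above only has a trust policy; it grants no permissions by itself. If you want the console.log output of the lambdas to reach CloudWatch Logs, you can additionally attach the AWS managed policy AWSLambdaBasicExecutionRole. A minimal sketch (the attachment's local name is illustrative):

# Lets the Lambda functions create log groups/streams and write log events.
resource "aws_iam_role_policy_attachment" "lambda_basic_execution" {
    role       = aws_iam_role.example_lambda_role.name
    policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}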

Lambda Infrastructure Resources

For this step, we will create a file called lambda.tf which will contain the following content:

resource "aws_lambda_function" "number_generator_lambda" {
    filename         = data.archive_file.number_generator.output_path
    source_code_hash = data.archive_file.number_generator.output_base64sha256
    function_name    = "poc_number_generator_lambda"
    role             = aws_iam_role.example_lambda_role.arn
    handler          = "1_number_generator/main.handler"
    runtime          = "nodejs18.x"
}

resource "aws_lambda_function" "even_lambda" {
    filename         = data.archive_file.even.output_path
    source_code_hash = data.archive_file.even.output_base64sha256
    function_name    = "poc_even_lambda"
    role             = aws_iam_role.example_lambda_role.arn
    handler          = "2_even/main.handler"
    runtime          = "nodejs18.x"
}

resource "aws_lambda_function" "odd_lambda" {
    filename         = data.archive_file.odd.output_path
    source_code_hash = data.archive_file.odd.output_base64sha256
    function_name    = "poc_odd_lambda"
    role             = aws_iam_role.example_lambda_role.arn
    handler          = "3_odd/main.handler"
    runtime          = "nodejs18.x"
}

This code snippet is an example of how to use Terraform to create AWS Lambda functions. AWS Lambda is a service that lets you run code without provisioning or managing servers: you simply upload your code (known as a Lambda function), and Lambda takes care of the rest.

Taking the odd_lambda resource as an example, let's analyze each line to better understand what it is doing:

  • resource "aws_lambda_function" "odd_lambda": This line declares a Terraform resource of type aws_lambda_function with the local name odd_lambda. Terraform uses this local name to refer to this resource in other parts of the configuration.
  • filename = data.archive_file.odd.output_path: This specifies the path of the zip file containing the code for the Lambda function. In this case, it refers to the output path of the archive_file data source defined earlier.
  • source_code_hash = data.archive_file.odd.output_base64sha256: This is a hash of the Lambda function's source code. Terraform uses this hash to determine whether the source code has changed and the Lambda function needs to be redeployed.
  • function_name = "poc_odd_lambda": This is the name that the Lambda function will have in AWS.
  • role = aws_iam_role.example_lambda_role.arn: This is the ARN (Amazon Resource Name) of the IAM role that the Lambda function will assume when executed. In this case, it refers to the IAM role created in the previous step.
  • handler = "main.handler": This is the handler of the Lambda function, that is, the function in your code that Lambda calls when the function is invoked. The format is file.function; since each zip contains main.js at its root, Lambda will call the handler function exported by main.js.
  • runtime = "nodejs18.x": This is the runtime environment in which the Lambda function will execute. In this case, the function will run in a Node.js 18.x environment.

In summary, this Terraform code is creating a Lambda function that will execute code located at the specified path in a Node.js 18.x environment, will assume a specific IAM role when executed, and will have a specific name in AWS.
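The resources above only set the required arguments. The aws_lambda_function resource also accepts optional settings such as timeout, memory_size, and environment variables; a hedged sketch for the number generator (the values and the environment variable are illustrative, and none of them are needed for this example):

resource "aws_lambda_function" "number_generator_lambda" {
    filename         = data.archive_file.number_generator.output_path
    source_code_hash = data.archive_file.number_generator.output_base64sha256
    function_name    = "poc_number_generator_lambda"
    role             = aws_iam_role.example_lambda_role.arn
    handler          = "main.handler"
    runtime          = "nodejs18.x"

    # Optional tuning knobs.
    timeout     = 10  # seconds
    memory_size = 128 # MB

    environment {
        variables = {
            STAGE = "poc"
        }
    }
}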

To test this, we will run the command terraform apply and then confirm with yes:

Figure: Resources created

Next, if we go to the AWS console, we will see the Lambdas already created:

Figure: Resources created in the AWS console

Creating Infrastructure Components to Enable Step Function to Execute Lambdas

To allow the Step Function to execute the previously created Lambdas, we need a role that the Step Function can assume with permission to invoke them. For this, we will create the following resources inside the iam.tf file:

  • resource aws_iam_role step_functions_role: Role that the Step Function will assume.
  • data aws_iam_policy_document lambda_access_policy: IAM Policy that will be associated with the Step Function's role.
  • resource aws_iam_policy step_functions_policy_lambda: IAM Policy resource to associate with the Step Function's role.
  • resource aws_iam_role_policy_attachment step_functions_to_lambda: Explicit association of the IAM Policy with the Role.

resource aws_iam_role step_functions_role

resource "aws_iam_role" "step_functions_role" {
    name = "step_functions_role_poc_sf"

    assume_role_policy = jsonencode({
        Version = "2012-10-17"
        Statement = [
        {
            Action = "sts:AssumeRole"
            Effect = "Allow"
            Principal = {
            Service = "states.amazonaws.com"
            }
        }
        ]
    })
}

data aws_iam_policy_document lambda_access_policy

data "aws_iam_policy_document" "lambda_access_policy" {
    statement {
        actions = [
        "lambda:*"
        ]
        resources = ["*"]
    }
}

resource aws_iam_policy step_functions_policy_lambda

resource "aws_iam_policy" "step_functions_policy_lambda" {
    name   = "step_functions_policy_lambda_policy_all_poc_sf"
    policy = data.aws_iam_policy_document.lambda_access_policy.json
}

resource aws_iam_role_policy_attachment step_functions_to_lambda

resource "aws_iam_role_policy_attachment" "step_functions_to_lambda" {
    role       = aws_iam_role.step_functions_role.name
    policy_arn = aws_iam_policy.step_functions_policy_lambda.arn
}
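The lambda_access_policy above grants lambda:* on every function, which keeps the example short but is broader than necessary. A tighter variant, assuming the three functions defined earlier, would allow the Step Function role to invoke only those functions:

data "aws_iam_policy_document" "lambda_access_policy" {
    statement {
        actions = ["lambda:InvokeFunction"]

        resources = [
            aws_lambda_function.number_generator_lambda.arn,
            aws_lambda_function.even_lambda.arn,
            aws_lambda_function.odd_lambda.arn,
        ]
    }
}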

Creating the Step Function Infrastructure for Our Terraform Project

In this section, we will create the infrastructure components that will allow us to set up a Step Function interacting with the previously created Lambdas.

For this, we will create a file named step_function.tf in our Terraform project with the following content:

# Step Functions State Machine
resource "aws_sfn_state_machine" "number_processor_sf" {
    name     = "NumberProcessorSF"
    role_arn = aws_iam_role.step_functions_role.arn

    definition = <<EOF
{
    "Comment": "execute lambdas",
    "StartAt": "NumberGenerator",
    "States": {
    "NumberGenerator": {
        "Type": "Task",
        "Resource": "${aws_lambda_function.number_generator_lambda.arn}",
        "Next": "IsNumberEven"
    },
    "IsNumberEven": {
        "Type": "Choice",
        "Choices": [
        {
            "Variable": "$.Payload.is_even",
            "BooleanEquals": true,
            "Next": "Even"
        }
        ],
        "Default": "Odd"
    },
    "Even": {
        "Type": "Task",
        "Resource": "${aws_lambda_function.even_lambda.arn}",
        "End": true
    },
    "Odd": {
        "Type": "Task",
        "Resource": "${aws_lambda_function.odd_lambda.arn}",
        "End": true
    }
    }
}
EOF
}

The IsNumberEven choice state inspects the is_even field returned by the NumberGenerator lambda and routes the execution to the Even or Odd task accordingly. To apply these changes to the cloud, just as in the previous steps, simply run terraform apply.
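Optionally, an output makes it easier to locate the state machine after the apply (the output name is illustrative):

output "state_machine_arn" {
    description = "ARN of the NumberProcessorSF state machine"
    value       = aws_sfn_state_machine.number_processor_sf.arn
}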

Testing the Created Step Function!

In the AWS console, navigate to the Step Functions service; you should see a state machine named NumberProcessorSF.

Figure: Detail of the created Step Function

In the definition section, we can see a diagram with the defined flow, which is similar to the architecture defined at the beginning of the exercise.

Figure: Definition of the created Step Function

Now, to test it, we go to the Executions tab and click the Start Execution button:

Figure: Executing the created Step Function

In this view, we can see the flow of our execution and each of the steps taken to complete the execution. Additionally, if we click on any of the steps, we can see logs, inputs, and outputs of what is happening, thus facilitating the traceability of each execution.

This is what a successful execution looks like:

Figure: Successful execution
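As an optional next step, Step Functions can also send execution history to CloudWatch Logs, which ties back to the monitoring advantage mentioned at the beginning. A hedged sketch of what that would look like (the log group name is illustrative, and the state machine role would additionally need CloudWatch Logs delivery permissions such as logs:CreateLogDelivery and logs:PutResourcePolicy):

resource "aws_cloudwatch_log_group" "sfn_logs" {
    name              = "/aws/states/NumberProcessorSF"
    retention_in_days = 7
}

# Then, inside the existing aws_sfn_state_machine "number_processor_sf" resource,
# add a logging_configuration block:
#
#   logging_configuration {
#       log_destination        = "${aws_cloudwatch_log_group.sfn_logs.arn}:*"
#       include_execution_data = true
#       level                  = "ALL"
#   }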

