Trigger AWS Step Functions from Amazon S3 Events Using SAM

In this post, I’ll show you how to trigger an AWS Step Functions workflow when a file is uploaded to an S3 bucket. We’ll use the AWS Serverless Application Model (SAM) to automate the deployment and make the process as straightforward as possible.

By default, the walkthrough creates a new S3 bucket and sets up the trigger on it. If you already have a bucket, I’ve also included instructions for wiring the existing one up to Step Functions.

Prerequisites

Before we start, make sure you have:

  • AWS CLI installed and configured.
  • AWS SAM CLI installed (a quick version check follows this list).
  • A basic understanding of AWS Lambda, S3, and Step Functions.
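
To confirm both CLIs are installed and on your PATH, a quick version check works (any reasonably recent version is fine):

aws --version
sam --version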

Architecture Overview


Workflow to trigger AWS Step Functions from Amazon S3 events using SAM and Amazon EventBridge.

Key Concepts

  1. Step Functions Workflow: A serverless workflow that orchestrates multiple AWS services.
  2. EventBridge: Captures S3 events and triggers workflows based on defined rules (a sample event is shown below).
  3. SAM Template: Defines the infrastructure in YAML format.
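
For reference, here is a trimmed-down example of the event S3 publishes to EventBridge when an object is created (the values are illustrative, and the real event carries extra fields such as the request ID and etag). The rule we define below matches on source, detail-type, the bucket name, and optionally the object key suffix:

{
  "version": "0",
  "source": "aws.s3",
  "detail-type": "Object Created",
  "region": "us-east-1",
  "detail": {
    "bucket": { "name": "my-bucket" },
    "object": { "key": "uploads/sample.zip", "size": 1024 },
    "reason": "PutObject"
  }
}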

Step-by-Step Implementation

1. Define the SAM Template

We’ll start by creating a SAM template that sets up a new S3 bucket and configures it to trigger a Step Functions workflow when a file is uploaded. Here’s the template:

AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31

Description: "Trigger Step Functions from S3 Events"

Resources:
  # Create a new S3 bucket
  MyS3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      NotificationConfiguration:
        EventBridgeConfiguration:
          EventBridgeEnabled: true

  # IAM Role assumed by Step Functions (to invoke Lambda) and by EventBridge (to start executions)
  StepFunctionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - states.amazonaws.com
                - events.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: StepFunctionPolicy
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - lambda:InvokeFunction
                  - states:StartExecution
                Resource: "*"

  # EventBridge Rule to trigger Step Functions
  S3UploadTriggerRule:
    Type: AWS::Events::Rule
    Properties:
      EventPattern:
        source:
          - aws.s3
        detail-type:
          - "Object Created"
        detail:
          bucket:
            name:
              - !Ref MyS3Bucket
          object:
            key:
              # Optional: Filter by file suffix (e.g., ".zip")
              # Remove or modify this section to trigger for all file types
              - suffix: ".zip"
      State: ENABLED
      Targets:
        - Id: StepFunctionTarget
          Arn: !Ref StepFunction
          RoleArn: !GetAtt StepFunctionRole.Arn

  # Step Functions State Machine
  StepFunction:
    Type: AWS::Serverless::StateMachine
    Properties:
      # AWS::Serverless::StateMachine takes an inline Definition (or DefinitionUri), not DefinitionString
      Definition:
        Comment: A simple state machine
        StartAt: Pass
        States:
          Pass:
            Type: Pass
            End: true
      Role: !GetAtt StepFunctionRole.Arn

Outputs:
  S3BucketName:
    Description: "Name of the newly created S3 bucket"
    Value: !Ref MyS3Bucket
  StepFunctionArn:
    Description: "ARN of the Step Function"
    Value: !Ref StepFunction
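Before deploying, it is worth sanity-checking the template; sam validate catches basic schema and transform errors (assuming you saved the file under SAM's default name, template.yaml, and run the command from that directory):

sam validate

Note that EventBridge passes the full matched event (like the sample shown earlier) to the state machine as its execution input, so your states can read the bucket name and object key directly.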

Optional: Using an Existing S3 Bucket

If you already have an S3 bucket and want to use it instead of creating a new one, follow these steps:

  1. Enable EventBridge Notifications:
  • Go to the AWS Management Console.
  • Navigate to the S3 service.
  • Select your existing bucket.
  • Under the Properties tab, find Event Notifications.
  • Edit the Amazon EventBridge section and turn on notifications.
  • Save the changes (or use the CLI alternative shown below).
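
If you prefer the command line, the same setting can be enabled with the s3api call below. Be aware that put-bucket-notification-configuration replaces the bucket's entire notification configuration, so include any existing SNS/SQS/Lambda notifications in the JSON if you have them (the bucket name is a placeholder):

aws s3api put-bucket-notification-configuration \
  --bucket <your-existing-bucket> \
  --notification-configuration '{"EventBridgeConfiguration": {}}'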

2. Modify the SAM Template:

Replace the MyS3Bucket resource with a parameter for the existing bucket name:

Parameters:
  ExistingBucketName:
    Type: String
    Description: "The name of the existing S3 bucket."

Resources:
  S3UploadTriggerRule:
    Type: AWS::Events::Rule
    Properties:
      EventPattern:
        source:
          - aws.s3
        detail-type:
          - "Object Created"
        detail:
          bucket:
            name:
              - !Ref ExistingBucketName
          object:
            key:
              # Optional: Filter by file suffix (e.g., ".zip")
              # Remove or modify this section to trigger for all file types
              - suffix: ".zip"
      State: ENABLED
      Targets:
        - Id: StepFunctionTarget
          Arn: !Ref StepFunction
          RoleArn: !GetAtt StepFunctionRole.Arn

2. Deploy the Template

Run the build and deploy commands. If you are using the parameterized template, provide your existing bucket name when prompted; otherwise SAM will create a new S3 bucket:

sam build
sam deploy --guided
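If you used the parameterized template and want to skip the prompt on subsequent deploys, you can pass the bucket name directly (the value here is a placeholder):

sam deploy --parameter-overrides ExistingBucketName=<your-existing-bucket>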

3. Test the Setup

  1. Upload a file to your S3 bucket (new or existing):
aws s3 cp sample.zip s3://<your-bucket-name>/

  2. Go to the AWS Step Functions console to verify that your workflow has been triggered (or check from the CLI as shown below).
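
If you would rather verify from the terminal, you can list recent executions with the AWS CLI (the stack name is a placeholder for whatever you chose during sam deploy --guided):

# Look up the state machine ARN from the stack outputs
aws cloudformation describe-stacks \
  --stack-name <your-stack-name> \
  --query "Stacks[0].Outputs[?OutputKey=='StepFunctionArn'].OutputValue" \
  --output text

# List the most recent executions for that state machine
aws stepfunctions list-executions \
  --state-machine-arn <step-function-arn> \
  --max-results 5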

Conclusion

In this post, we’ve seen how to configure an S3 event to trigger a Step Functions workflow using AWS SAM. By default, we created a new S3 bucket, but I’ve also included instructions for using an existing bucket as an optional step. Because all of the wiring lives in the SAM template, the setup is easy to version, redeploy, and extend, for example with additional event rules or workflow steps.
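
If you were just experimenting, you can tear down everything the stack created when you are done (the stack name is a placeholder; note that S3 requires the bucket to be empty before it can be deleted):

sam delete --stack-name <your-stack-name>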

If you have any questions or suggestions, drop a comment below!

