
AWS Lambda and S3 Just Got Smarter: How AWS Prevents Recursive Loops Automatically

We all love serverless, especially when it comes to AWS Lambda functions.

There are good reasons for that: event-driven design, decoupling, automation, and many other benefits Lambda has offered since its release.

But there are situations when our dear friend Lambda can turn into a foe and cost us money and downtime, and those situations are unintentional recursive loops.
In this blog we will talk about a simple yet powerful feature: recursive loop detection between AWS Lambda and an S3 bucket.

Motivation

What is an AWS Lambda recursive loop, and what is loop detection?

  • When setting up a Lambda function to interact with the same service or resource that triggers it, there's a risk of creating an endless recursive loop.
  • For instance, a Lambda function might send a message to an Amazon Simple Queue Service (SQS) queue, which then triggers the same function again.
  • This leads to a cycle of repeated invocations.
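To make that failure mode concrete, here is a minimal, hypothetical sketch of the SQS pattern described above. The queue URL and event shape are assumptions, and the handler accepts an optional injected client so the sketch can run without AWS credentials; a real handler would take only `(event, context)`:

```python
import json

# Assumed queue URL -- the same queue configured as this function's
# event source, which is exactly the bug.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/jobs"

def lambda_handler(event, context, sqs_client=None):
    """For every SQS record received, send a message back to the SAME
    queue -- so each invocation schedules the next one, indefinitely."""
    if sqs_client is None:  # real deployment path
        import boto3
        sqs_client = boto3.client("sqs")
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        body["hops"] = body.get("hops", 0) + 1
        # BUG: writing to the queue that triggers this very function
        sqs_client.send_message(QueueUrl=QUEUE_URL,
                                MessageBody=json.dumps(body))
    return "done"
```

With a real boto3 client this would re-invoke itself forever; loop detection is what breaks the chain.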

What does it mean for you?

  • Unintended recursive loops can lead to skyrocketing costs, as your AWS account gets hit with unexpected charges.
  • Even worse, these loops can cause your Lambda functions to rapidly scale out of control, consuming all of your account's available concurrency. This not only affects the current function but can also bring down other critical processes, leaving your entire system overwhelmed and unresponsive. It's a costly mistake you don't want to face!

THIS IS THE EXACT PROBLEM AWS LAMBDA LOOP DETECTION TRIES TO SOLVE.

What does this update mean for you?

  • Until now, Lambda loop detection was supported only for Amazon SQS and Amazon SNS; as of today, Amazon S3 joins the group.

How Lambda detects loops

  • Lambda detects recursive loops using AWS X-Ray tracing headers. When supported AWS services send events to Lambda, they include metadata that tracks how many times the event has invoked the function.
  • If the same event triggers the function about 16 times in a row, Lambda automatically stops further invocations and alerts you. Other triggers for the function aren’t affected by this limit.
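A toy simulation of that guard makes the arithmetic easy to see. The `depth` field here is a made-up stand-in for the counter Lambda carries in the X-Ray trace header (which you never manipulate directly):

```python
CAP = 16  # approximate chain length Lambda allows

def next_event(event):
    """Return the event for the next hop, or None if Lambda would drop it."""
    depth = event["depth"] + 1  # stand-in for the trace-header counter
    return None if depth > CAP else {"depth": depth}

def run_chain():
    """Count how many times the function actually runs before the guard fires."""
    event, invoked = {"depth": 1}, 0
    while event is not None:
        invoked += 1           # the function runs this hop
        event = next_event(event)
    return invoked

print(run_chain())  # -> 16: the 17th invocation in the chain is dropped
```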

You can learn more in the AWS Lambda documentation on recursive loop detection.

Let's see it in Action.

  • Create an S3 bucket, keeping everything default.
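If you'd rather create the bucket from code than the console, here is a boto3 sketch. The bucket name is an assumption; note the us-east-1 quirk: the API rejects a `LocationConstraint` for that region, while every other region requires one:

```python
def create_bucket_params(name, region="us-east-1"):
    """Build kwargs for s3.create_bucket. us-east-1 must omit
    CreateBucketConfiguration; all other regions must include it."""
    params = {"Bucket": name}
    if region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

def create_demo_bucket(name, region="us-east-1"):
    import boto3  # imported here so the helper above stays testable offline
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(**create_bucket_params(name, region))
```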

s3 bucket creation

  • Create a Lambda function that uploads a .txt file to the S3 bucket whenever the function is invoked.
  • Note: the Lambda default timeout is 3 seconds, and uploading a file can take longer than that, so increase the timeout to 10 seconds; otherwise you will see a timeout error.

import boto3
import random
import string

def lambda_handler(event, context):
    # Define the S3 bucket
    s3_bucket = 'your-s3-bucket-name'  # Replace with your S3 bucket name

    # Generate a random string to append to the file name
    random_string = ''.join(random.choices(string.ascii_lowercase + string.digits, k=6))

    # Define the file content and key (with random string appended)
    s3_key = f'your-file-{random_string}.txt'  # The key (path) of the file in S3
    file_content = "Hello, this is a sample text file created by Lambda with a random suffix!"

    # Create the file locally in the /tmp directory
    local_file_path = '/tmp/sample_file.txt'

    # Write content to the file
    with open(local_file_path, 'w') as file:
        file.write(file_content)

    # Initialize S3 client
    s3 = boto3.client('s3')

    # Upload the file to S3
    try:
        s3.upload_file(local_file_path, s3_bucket, s3_key)
        return {
            'statusCode': 200,
            'body': f"File successfully uploaded to {s3_bucket}/{s3_key}"
        }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': f"Error uploading file: {str(e)}"
        }
  • Add an S3 trigger to the Lambda function.
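Under the hood, the console trigger is an S3 bucket notification pointing at the function's ARN. A boto3 sketch of the same wiring (ARN and bucket name are placeholders; note the console also grants S3 the `lambda:InvokeFunction` permission for you, which you'd otherwise add with `lambda.add_permission`):

```python
def build_notification_config(function_arn, suffix=".txt"):
    """S3 -> Lambda notification for object-created events. The suffix
    filter is optional; prefix/suffix filters are also a common way to
    narrow a trigger and avoid accidental loops."""
    return {
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"Rules": [
                {"Name": "suffix", "Value": suffix},
            ]}},
        }]
    }

def attach_trigger(bucket, function_arn):
    import boto3  # imported here so the builder above stays testable offline
    boto3.client("s3").put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=build_notification_config(function_arn),
    )
```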

adding s3 trigger

  • Add S3 permissions to the Lambda function's IAM role. These permissions allow the Lambda function to upload files to the S3 bucket.

  • For brevity, I will attach the AmazonS3FullAccess managed policy.
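AmazonS3FullAccess is fine for a throwaway demo, but in practice you would scope the role down. A least-privilege sketch that allows only `PutObject` into the demo bucket (bucket, role, and policy names are assumptions):

```python
import json

def upload_only_policy(bucket_name):
    """Inline IAM policy allowing only object uploads to one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
        }],
    }

def attach_to_role(role_name, bucket_name):
    import boto3  # imported here so the policy builder stays testable offline
    boto3.client("iam").put_role_policy(
        RoleName=role_name,
        PolicyName="s3-upload-only",  # assumed policy name
        PolicyDocument=json.dumps(upload_only_policy(bucket_name)),
    )
```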

s3 permission for lambda

  • Upload any file to the S3 bucket. After a couple of seconds you will see files being uploaded to the bucket automatically.

  • You will see that Lambda is invoked exactly 16 times; in other words, Lambda will upload exactly 16 files, and then loop detection will kick in.

File Upload

lambda 16 invocations

Ways to receive notifications for loop detection

  • As per the docs, there are multiple ways to receive recursive loop notifications, such as AWS Health Dashboard notifications and email alerts; note that it can take up to 3 hours after Lambda stops a recursive invocation before you receive the email alert.

  • I will update my blog once I receive them. (Added: the email notification below.)

email notification

  • But the quickest and most reliable way is to check the CloudWatch metric RecursiveInvocationsDropped for the function.

The CloudWatch metric RecursiveInvocationsDropped records the number of function invocations that Lambda has stopped because your function has been invoked more than approximately 16 times in a single chain of requests.
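That metric can also be pulled programmatically instead of through the console. A sketch using `get_metric_statistics` (the function name and the one-hour window are assumptions):

```python
from datetime import datetime, timedelta, timezone

def dropped_invocations_query(function_name):
    """Build params for cloudwatch.get_metric_statistics: the sum of
    RecursiveInvocationsDropped for one function over the last hour."""
    now = datetime.now(timezone.utc)
    return {
        "Namespace": "AWS/Lambda",
        "MetricName": "RecursiveInvocationsDropped",
        "Dimensions": [{"Name": "FunctionName", "Value": function_name}],
        "StartTime": now - timedelta(hours=1),
        "EndTime": now,
        "Period": 300,           # 5-minute buckets
        "Statistics": ["Sum"],
    }

def fetch_dropped(function_name):
    import boto3  # imported here so the query builder stays testable offline
    cw = boto3.client("cloudwatch")
    resp = cw.get_metric_statistics(**dropped_invocations_query(function_name))
    return sum(dp["Sum"] for dp in resp["Datapoints"])
```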

Metric

From a Solutions Architect's POV

  • Knowing there is a recursive loop is important, but it is even more important to understand how to respond so the loop does not happen again:
    • Since a recursive loop causes Lambda to scale and consume all of your account's available concurrency, set the function's reserved concurrency to zero to stop further invocations.
    • The simplest fix is to stop the invocations; in other words, remove the trigger that invokes the Lambda function.
    • Fix the configuration or code that caused the recursive loop; in my code, it is writing to the same bucket that triggers the function.
  • There is also an option to allow a Lambda function to run in a recursive loop, if your architecture design genuinely demands it.
  • It is critical to understand that Lambda loop detection is supported for SQS, SNS, and now S3. If an AWS service such as Amazon DynamoDB forms part of the loop, Lambda can't currently detect and stop it.
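The first response step above, throttling the function to zero, can be scripted for incident runbooks. A sketch (the function name is an assumption):

```python
def concurrency_request(function_name, reserved=0):
    """Build params for lambda.put_function_concurrency; 0 means no new
    invocations of this function can start."""
    if reserved < 0:
        raise ValueError("reserved concurrency cannot be negative")
    return {
        "FunctionName": function_name,
        "ReservedConcurrentExecutions": reserved,
    }

def throttle_function(function_name):
    """Emergency stop: reserve 0 concurrency for the looping function.
    Call delete_function_concurrency() once the loop is fixed."""
    import boto3  # imported here so the builder above stays testable offline
    boto3.client("lambda").put_function_concurrency(
        **concurrency_request(function_name)
    )
```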

In this blog we saw how loop detection can save money and avoid downtime when a Lambda function triggered by S3 falls into a recursive loop.

Do you also agree this update just made Lambda even better?

You can always reach out to me on Linkedin, X
