Márcio Coelho

AWS S3 Triggering Lambda

We’ll create an event-driven Lambda that listens for new images in an S3 bucket (photos-raw), resizes them using Node.js, and saves the output to another bucket (photos-resized).

You’ll learn how to:

  • Configure S3 trigger events in AWS SAM
  • Resize images in Lambda using sharp

📝 template.yml

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 29
    Runtime: nodejs22.x
    MemorySize: 512

Resources:
  ResizeFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: src/resize.handler
      Policies:
        - S3ReadPolicy:
            BucketName: !Sub "photos-raw-${AWS::AccountId}"
        - S3WritePolicy:
            BucketName: !Sub "photos-resized-${AWS::AccountId}"
      Events:
        S3Upload:
          Type: S3
          Properties:
            Bucket: !Ref RawBucket
            Events: s3:ObjectCreated:Put
      Environment:
        Variables:
          RESIZED_BUCKET: !Sub "photos-resized-${AWS::AccountId}" 

  RawBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "photos-raw-${AWS::AccountId}"

  ResizedBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "photos-resized-${AWS::AccountId}"

Whenever someone uploads a new file to the photos-raw bucket, S3 automatically triggers the Lambda function with an event describing the new object.
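
The handler below pulls the bucket name and object key out of that event. As a rough sketch (the bucket name and key here are placeholders, not values from a real notification), the part of an S3 event the function actually reads looks like this:

// Trimmed-down S3 notification record: only the fields the handler uses.
// Object keys arrive URL-encoded (spaces become '+'), which is why the
// handler decodes them before calling GetObject.
const sampleEvent = {
  Records: [
    {
      s3: {
        bucket: { name: 'photos-raw-123456789012' }, // record.s3.bucket.name
        object: { key: 'holiday/my+photo.jpg' },     // record.s3.object.key
      },
    },
  ],
};

// Handy for a quick local smoke test:
// await handler(sampleEvent as unknown as S3Event);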

🧠 Function

import { S3Client, GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3';
import sharp from 'sharp';
import { S3Event } from 'aws-lambda';
import { Readable } from 'stream';

const s3 = new S3Client({});
const RESIZED_BUCKET = process.env.RESIZED_BUCKET; // set in the template's Environment block

// Collect a readable stream into a single Buffer so sharp can process it.
const streamToBuffer = async (stream: Readable): Promise<Buffer> => {
  const chunks: Uint8Array[] = [];
  for await (const chunk of stream) chunks.push(chunk);
  return Buffer.concat(chunks);
};

export const handler = async (event: S3Event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Object keys arrive URL-encoded (spaces become '+'), so decode before using them.
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    // Download the original image and keep its content type for the upload below.
    const { Body, ContentType } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));

    if (!Body || !(Body instanceof Readable)) {
      console.error(`Invalid body stream for ${bucket}/${key}`);
      continue; // skip this record, keep processing the rest
    }

    const imageBuffer = await streamToBuffer(Body);
    // Resize to 800px wide; sharp keeps the aspect ratio and the original format.
    const resized = await sharp(imageBuffer)
      .resize(800)
      .toBuffer();

    // Write the resized image to the output bucket under the same key,
    // reusing the content type of the original object.
    await s3.send(new PutObjectCommand({
      Bucket: RESIZED_BUCKET,
      Key: key,
      Body: resized,
      ContentType,
    }));
  }
};

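If you'd rather skip the hand-rolled stream handling, recent versions of the AWS SDK for JavaScript v3 expose a transformToByteArray() helper on the GetObject response body. Here is a minimal sketch of the same download step, assuming your SDK version ships that helper (the getImageBuffer name is just for illustration):

import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({});

// Sketch: fetch an object as a Buffer using the SDK's built-in helper
// instead of the manual streamToBuffer above.
const getImageBuffer = async (bucket: string, key: string): Promise<Buffer> => {
  const { Body } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
  if (!Body) throw new Error(`Empty body for s3://${bucket}/${key}`);
  return Buffer.from(await Body.transformToByteArray());
};

Either way, the rest of the handler stays the same.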

Conclusion

You've now built an event-driven image pipeline using:

  • 🪣 S3 Triggers
  • 🧠 AWS SAM for infrastructure as code
  • ⚙️ Sharp for image processing
