<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Shivam Saxena</title>
    <description>The latest articles on DEV Community by Shivam Saxena (@code_it_right).</description>
    <link>https://dev.to/code_it_right</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2743146%2F7433182e-7e31-4587-be0d-40e895a6668b.png</url>
      <title>DEV Community: Shivam Saxena</title>
      <link>https://dev.to/code_it_right</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/code_it_right"/>
    <language>en</language>
    <item>
      <title>Thumbnail Generation for Images and Videos using AWS Lambda</title>
      <dc:creator>Shivam Saxena</dc:creator>
      <pubDate>Sat, 22 Mar 2025 09:49:35 +0000</pubDate>
      <link>https://dev.to/code_it_right/thumbnail-generation-for-images-and-videos-using-aws-lambda-4f7h</link>
      <guid>https://dev.to/code_it_right/thumbnail-generation-for-images-and-videos-using-aws-lambda-4f7h</guid>
      <description>&lt;p&gt;In this post we will implement a lambda function to generate thumbnail for images and videos whenever a file is uploaded in a bucket.&lt;/p&gt;

&lt;h2&gt;
  
  
  Requirements
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://www.npmjs.com/package/sharp" rel="noopener noreferrer"&gt;Sharp&lt;/a&gt;: We will be using sharp plugin for resizing images to create thumbnails.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.ffmpeg.org/" rel="noopener noreferrer"&gt;ffmpeg&lt;/a&gt;: We will be using ffmpeg to generate thumbnail images from videos.&lt;/li&gt;
&lt;li&gt;An AWS admin account with console and service access.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;The following steps are explained in detail below.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Source and Destination Buckets&lt;/li&gt;
&lt;li&gt;Lambda Execution policy and role&lt;/li&gt;
&lt;li&gt;Function Code Package&lt;/li&gt;
&lt;li&gt;Lambda Trigger&lt;/li&gt;
&lt;li&gt;Testing&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Source and Destination Buckets
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Go to &lt;strong&gt;Amazon S3&lt;/strong&gt; &amp;gt; &lt;strong&gt;General Purpose Buckets&lt;/strong&gt; &amp;gt; &lt;strong&gt;Create Bucket&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select the bucket type as &lt;strong&gt;General Purpose&lt;/strong&gt; and enter &lt;code&gt;s3-demo-source&lt;/code&gt; as the name of the bucket. You can use any name; just make sure that the source and destination buckets have different names. Also refer to the &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html" rel="noopener noreferrer"&gt;S3 Bucket Naming Convention&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Keep the rest of the fields at their defaults and create the bucket.&lt;/li&gt;
&lt;li&gt;Repeat the steps above to create the destination bucket as well.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefjpey1d4j3c16p6s4ke.png" alt="Creating buckets" width="800" height="187"&gt;
&lt;/li&gt;
&lt;li&gt;Use this &lt;a href="https://sample-videos.com/download-sample-png-image.php" rel="noopener noreferrer"&gt;link&lt;/a&gt; to download a sample image, then open the source bucket and click on &lt;strong&gt;Upload&lt;/strong&gt; to upload it.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frrwh0v40osyevvs3cglv.png" alt="Uploading sample image" width="800" height="364"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;S3 bucket names live in a global namespace, so a bucket name must be unique across all AWS accounts. That is why the source and destination buckets here use distinctive names. If a name you choose is already taken, the AWS Console will show an error; change the name until you find one that is free.&lt;/p&gt;
&lt;/blockquote&gt;
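&lt;p&gt;As a quick sanity check, the common naming rules can be tested locally before trying the console. A minimal sketch (not exhaustive; the bucket names are the ones assumed in this post):&lt;/p&gt;

```javascript
// Rough validator for the common S3 bucket naming rules (not exhaustive).
const isValidBucketName = (name) => {
  // 3-63 chars: lowercase letters, digits, dots, hyphens; must start/end alphanumeric
  if (!/^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name)) return false;
  // Must not be formatted like an IP address
  if (/^\d+\.\d+\.\d+\.\d+$/.test(name)) return false;
  // No consecutive dots
  if (name.includes("..")) return false;
  return true;
};

console.log(isValidBucketName("s3-demo-source")); // true
console.log(isValidBucketName("S3_Demo"));        // false: uppercase and underscore
console.log(isValidBucketName("192.168.0.1"));    // false: looks like an IP address
```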

&lt;h3&gt;
  
  
  Lambda Execution policy and role
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Go to &lt;strong&gt;IAM&lt;/strong&gt; &amp;gt; &lt;strong&gt;Policies&lt;/strong&gt; &amp;gt; &lt;strong&gt;Create Policy&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Switch to the JSON tab and add the following policy. It allows the Lambda function to write logs to CloudWatch and to get and put objects in S3:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:CreateLogGroup",
                "logs:CreateLogStream"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
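&lt;p&gt;The &lt;code&gt;arn:aws:s3:::*/*&lt;/code&gt; wildcard is fine for a demo, but the same policy can be scoped to just the two buckets. A sketch that builds such a tightened policy in Node.js (the bucket names are assumptions from this post; adjust to yours):&lt;/p&gt;

```javascript
// Build a least-privilege variant of the policy above, scoped to the demo buckets.
const sourceBucket = "s3-demo-source";    // assumption: source bucket from this post
const destBucket = "s3-demo-destination"; // assumption: your destination bucket name

const policy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Action: ["logs:PutLogEvents", "logs:CreateLogGroup", "logs:CreateLogStream"],
      Resource: "arn:aws:logs:*:*:*",
    },
    {
      Effect: "Allow",
      Action: ["s3:GetObject"],
      Resource: `arn:aws:s3:::${sourceBucket}/*`, // read only from the source bucket
    },
    {
      Effect: "Allow",
      Action: ["s3:PutObject"],
      Resource: `arn:aws:s3:::${destBucket}/*`,   // write only to the destination bucket
    },
  ],
};

console.log(JSON.stringify(policy, null, 2));
```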



&lt;ul&gt;
&lt;li&gt;Click on &lt;strong&gt;Next&lt;/strong&gt; and click on &lt;strong&gt;Create Policy&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff67av1bf8xjzzporrcyu.png" alt="Create Policy" width="800" height="425"&gt;
&lt;em&gt;Notice the permissions in the Permissions Table&lt;/em&gt;.&lt;/li&gt;
&lt;li&gt;Go to &lt;strong&gt;IAM&lt;/strong&gt; &amp;gt; &lt;strong&gt;Roles&lt;/strong&gt; &amp;gt; &lt;strong&gt;Create Role&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;AWS Service&lt;/strong&gt; in Trusted Entity Type and &lt;strong&gt;Lambda&lt;/strong&gt; in Use Case. Click &lt;strong&gt;Next&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsh3qmk0ma8o5767zqyg.png" alt="Creating role" width="800" height="326"&gt;
&lt;/li&gt;
&lt;li&gt;In Permission Policies, search for the policy created earlier and tick the checkbox of the policy. Click on &lt;strong&gt;Next&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnihfezjuvveqsvbnu722.png" alt="Updating role in Policy" width="800" height="246"&gt;
&lt;/li&gt;
&lt;li&gt;Enter the Role name as &lt;strong&gt;LambdaS3Role&lt;/strong&gt; and click on &lt;strong&gt;Create Role&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Function Code Package
&lt;/h3&gt;

&lt;p&gt;Since we need to install a package (&lt;strong&gt;sharp&lt;/strong&gt;) and bundle a binary (&lt;strong&gt;ffmpeg&lt;/strong&gt;), we have to package the code as a &lt;strong&gt;.zip&lt;/strong&gt; and upload it as the Lambda function code. This step is not required if the Lambda function does not depend on external libraries, since the code can then be written in the online editor in the AWS Console.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a directory and move into it:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir thumbnail-lambda
cd thumbnail-lambda
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Create a new Node.js project and install &lt;code&gt;sharp@0.32.6&lt;/code&gt; (the last Lambda-compatible version). If you are building on a machine other than Linux x64, install it with &lt;code&gt;npm install --arch=x64 --platform=linux sharp@0.32.6&lt;/code&gt; so that the Linux binaries are bundled.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm init
npm install sharp@0.32.6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Create a new file in the root directory named &lt;code&gt;index.mjs&lt;/code&gt; and add the following code:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Imports
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";
import { Readable } from "stream";
import sharp from "sharp";
import * as path from "path";
import * as cp from "child_process";
import * as fs from "fs/promises";

// ffmpeg binary path
const ffmpegPath = "/opt/bin/ffmpeg";

// To validate image and video extensions. Can be updated as required
const CHECK_IMAGE_EXT = ["jpg", "jpeg", "png"];
const CHECK_VIDEO_EXT = ["mp4", "mov", "avi", "webm", "flv"];

// Note: Change region here according to your bucket region on AWS
const s3 = new S3Client({ region: "ap-south-1" });

export const handler = async (event) =&amp;gt; {
  console.log("Received S3 event:", JSON.stringify(event));

  try {
    const record = event.Records[0].s3;
    // Source bucket
    const sourceBucket = record.bucket.name;
    // File name (S3 event object keys are URL-encoded)
    const fileName = decodeURIComponent(record.object.key.replace(/\+/g, " "));
    // Destination bucket
    const destBucket = process.env.DESTINATION_BUCKET || sourceBucket;

    const fileExt = fileName.match(/\.([^.]*)$/);
    if (!fileExt) throw new Error(`Error while determining file type for file ${fileName}`);

    const fileType = fileExt[1].toLowerCase();
    const isImage = CHECK_IMAGE_EXT.includes(fileType);
    const isVideo = CHECK_VIDEO_EXT.includes(fileType);

    if (isVideo) {
      await generateVideoThumbnail(sourceBucket, fileName, destBucket);
    } else if (isImage) {
      await generateImageThumbnail(sourceBucket, fileName, destBucket);
    } else {
      console.log(`File type is not supported: ${fileType}`);
    }
  } catch (error) {
    console.error("Error processing event:", error);
    throw error;
  }
};

// Get file buffer from S3
const getFileData = async (bucket, fileKey) =&amp;gt; {
  console.log(`Fetching ${fileKey} from ${bucket}`);
  try {
    const response = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: fileKey }));
    if (!response.Body || !(response.Body instanceof Readable)) throw new Error("Invalid response");

    const chunks = [];
    for await (const chunk of response.Body) {
      chunks.push(chunk);
    }

    return Buffer.concat(chunks);
  } catch (error) {
    console.error(`Error fetching ${fileKey} from S3: `, error);
    throw error;
  }
};

// Upload file to S3
const uploadFileToS3 = async (bucket, name, data, contentType) =&amp;gt; {
  console.log(`Uploading ${name} to ${bucket}`);

  try {
    await s3.send(
      new PutObjectCommand({
        Bucket: bucket,
        Key: name,
        Body: data,
        ContentType: contentType,
      })
    );

    console.log(`Uploaded ${name} successfully`);
  } catch (error) {
    console.error(`Error uploading ${name} to S3:`, error);
    throw error;
  }
};

// Generate and upload image thumbnail
const generateImageThumbnail = async (sourceBucket, fileName, destBucket) =&amp;gt; {
  console.log(`Generating thumbnail for image: ${fileName}`);

  try {
    // Fetch file data in the form of buffer
    const fileData = await getFileData(sourceBucket, fileName);

    // Feed the buffer to sharp with the desired thumbnail width (change as required),
    // and convert to PNG so the output matches the uploaded content type
    const resizedFileData = await sharp(fileData).resize({ width: 200 }).png().toBuffer();

    // Build the thumbnail key and upload it to the destination bucket
    const thumbnailKey = `${path.basename(fileName, path.extname(fileName))}.png`;
    await uploadFileToS3(destBucket, thumbnailKey, resizedFileData, "image/png");
  } catch (error) {
    console.error(`Error while generating thumbnail for ${fileName}: `, error);
    throw error;
  }
};

// Generate and upload video thumbnail
const generateVideoThumbnail = async (sourceBucket, fileKey, destBucket) =&amp;gt; {
  console.log(`Generating thumbnail for video: ${fileKey}`);

  const extension = path.extname(fileKey);
  const fileName = path.basename(fileKey, extension);
  // Temp path for storing source and thumbnail file
  const tempSourceFile = `/tmp/${fileName}${extension}`;
  const tempThumbnailFile = `/tmp/${fileName}.png`;

  try {
    const fileData = await getFileData(sourceBucket, fileKey);
    await fs.writeFile(tempSourceFile, fileData);

    console.log(`Running ffmpeg for ${fileKey}`);
    await new Promise((resolve, reject) =&amp;gt; {
      const ffmpeg = cp.spawn(ffmpegPath, [
        "-i",
        tempSourceFile,
        "-ss",
        "00:00:01.000",
        "-vframes",
        "1",
        tempThumbnailFile,
      ]);

      ffmpeg.on("close", (code) =&amp;gt;
        code === 0 ? resolve() : reject(new Error(`ffmpeg failed with code ${code}`))
      );
    });

    const thumbnailData = await fs.readFile(tempThumbnailFile);
    const thumbnailFileName = `${fileName}.png`;

    await uploadFileToS3(destBucket, thumbnailFileName, thumbnailData, "image/png");
  } catch (error) {
    console.error(`Error while generating thumbnail for ${fileKey}:`, error);
    throw error;
  } finally {
    console.log(`Removing temporary files for ${fileKey}`);
    await Promise.allSettled([fs.unlink(tempSourceFile), fs.unlink(tempThumbnailFile)]);
  }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Notice that the above code falls back to the source bucket as the destination if one is not provided. This can cause unwanted executions: each generated thumbnail uploaded to the bucket triggers the Lambda again, which generates another thumbnail, and so on in a loop. To prevent this, configure a prefix/suffix filter on the trigger so the Lambda only runs when the object key matches.&lt;/p&gt;
&lt;/blockquote&gt;
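&lt;p&gt;Before deploying, the classification logic in the handler can be exercised locally with a hand-built event. A minimal sketch of the record shape the handler reads (bucket name and key are hypothetical):&lt;/p&gt;

```javascript
// Minimal shape of an S3 put event, mirroring what the handler above reads.
const event = {
  Records: [
    { s3: { bucket: { name: "s3-demo-source" }, object: { key: "videos/sample.mp4" } } },
  ],
};

const CHECK_IMAGE_EXT = ["jpg", "jpeg", "png"];
const CHECK_VIDEO_EXT = ["mp4", "mov", "avi", "webm", "flv"];

const record = event.Records[0].s3;
// S3 event keys are URL-encoded, so decode before use
const fileName = decodeURIComponent(record.object.key.replace(/\+/g, " "));
const fileExt = fileName.match(/\.([^.]*)$/);
const fileType = fileExt ? fileExt[1].toLowerCase() : null;

let kind = "unsupported";
if (CHECK_IMAGE_EXT.includes(fileType)) kind = "image";
if (CHECK_VIDEO_EXT.includes(fileType)) kind = "video";
console.log(kind); // "video"
```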

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Zip the function code with the following command, ensuring that all files, including &lt;code&gt;package.json&lt;/code&gt;, &lt;code&gt;package-lock.json&lt;/code&gt; and &lt;code&gt;node_modules&lt;/code&gt;, are included in the zip file.&lt;br&gt;
&lt;code&gt;zip -r function.zip .&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On AWS Console, go to &lt;strong&gt;Lambda&lt;/strong&gt; &amp;gt; &lt;strong&gt;Create a Function&lt;/strong&gt;. Select &lt;strong&gt;Author from scratch&lt;/strong&gt;. For &lt;strong&gt;Runtime&lt;/strong&gt;, select &lt;code&gt;Node.js 22.x&lt;/code&gt;, for &lt;strong&gt;Architecture&lt;/strong&gt; select &lt;code&gt;x86_64&lt;/code&gt;. For &lt;strong&gt;Execution Role&lt;/strong&gt;, select &lt;strong&gt;Use an existing Role&lt;/strong&gt; and select the role created earlier from the dropdown, &lt;strong&gt;LambdaS3Role&lt;/strong&gt; in our case, and hit &lt;strong&gt;Create Function&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx9b23xe7y0ri6tcgw2ic.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx9b23xe7y0ri6tcgw2ic.png" alt="Create Lambda Function" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For generating thumbnails from videos, we need the ffmpeg binary as a Lambda layer. Download the static release build of ffmpeg from this &lt;a href="https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz" rel="noopener noreferrer"&gt;link&lt;/a&gt; and extract the &lt;code&gt;ffmpeg&lt;/code&gt; binary into a new folder named &lt;code&gt;bin&lt;/code&gt;, so that the layer exposes it at &lt;code&gt;/opt/bin/ffmpeg&lt;/code&gt;, the path used in the function code. Make sure the binary file has execute permission. Zip the folder using the command below.&lt;br&gt;
&lt;code&gt;zip -r layers.zip bin&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the console, go to &lt;strong&gt;Lambda&lt;/strong&gt; &amp;gt; &lt;strong&gt;Layers&lt;/strong&gt; &amp;gt; &lt;strong&gt;Create Layer&lt;/strong&gt;. Enter ffmpeg-layer as the name, upload the zip file created in the step above, select &lt;code&gt;x86_64&lt;/code&gt; under &lt;strong&gt;Compatible Architectures&lt;/strong&gt;, and click Create.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
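&lt;p&gt;The layer layout matters: AWS extracts a layer zip into &lt;code&gt;/opt&lt;/code&gt;, so the archive must contain a &lt;code&gt;bin&lt;/code&gt; folder for the binary to appear at &lt;code&gt;/opt/bin/ffmpeg&lt;/code&gt;. A sketch of the packaging (the &lt;code&gt;cp&lt;/code&gt; source path assumes the extracted static build; the &lt;code&gt;touch&lt;/code&gt; fallback only lets the sketch run standalone):&lt;/p&gt;

```shell
# The zip root must contain bin/ so the layer mounts the binary at /opt/bin/ffmpeg.
mkdir -p bin
# Copy the real binary from the extracted static build (hypothetical folder name);
# fall back to an empty placeholder so this sketch runs standalone.
cp ffmpeg-*-amd64-static/ffmpeg bin/ffmpeg 2>/dev/null || touch bin/ffmpeg
chmod +x bin/ffmpeg
zip -r layers.zip bin
unzip -l layers.zip
```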

&lt;blockquote&gt;
&lt;p&gt;If the zip file is larger than 10 MB, upload it to the source S3 bucket and provide the file's &lt;strong&gt;Object URL&lt;/strong&gt; by choosing the &lt;strong&gt;Upload a file from Amazon S3&lt;/strong&gt; option.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fspoyv7s6b5i8esc9vnho.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fspoyv7s6b5i8esc9vnho.png" alt="Creating Layer" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the console, go to the newly created Lambda function and click on &lt;strong&gt;Layers&lt;/strong&gt; &amp;gt; &lt;strong&gt;Add Layer&lt;/strong&gt;. Copy the &lt;strong&gt;Version ARN&lt;/strong&gt; of the ffmpeg-layer created earlier. In the Add Layer form, select the &lt;strong&gt;Specify an ARN&lt;/strong&gt; option, paste the copied ARN, and click &lt;strong&gt;Add&lt;/strong&gt;. The function page should now show a layer count of 1.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwc7dr2cd27mlcm1samr7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwc7dr2cd27mlcm1samr7.png" alt="Adding Layer" width="800" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;To add the function code, go to the function page and click on the &lt;strong&gt;Code&lt;/strong&gt; tab. On the right side, click on &lt;strong&gt;Upload from&lt;/strong&gt; and select &lt;strong&gt;.zip File&lt;/strong&gt;. Select the &lt;code&gt;function.zip&lt;/code&gt; file created earlier and the function code will be imported.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Lambda Trigger
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;On the function page, click on &lt;strong&gt;Add Trigger&lt;/strong&gt; and select S3 as the source. In the Bucket option, search for the source bucket (&lt;strong&gt;s3-demo-source&lt;/strong&gt; in our case) and select it. Acknowledge the Recursive Invocation checkbox and click &lt;strong&gt;Add&lt;/strong&gt;. The function is now complete and ready to be tested.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3j23fsq040ifj4mnklgr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3j23fsq040ifj4mnklgr.png" alt="Adding Trigger" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Testing
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to the source bucket and upload a sample image to test the Lambda function with, and copy its file key.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;On the function page, click on the &lt;strong&gt;Test&lt;/strong&gt; tab. Select &lt;strong&gt;Create New Event&lt;/strong&gt; and enter an &lt;strong&gt;Event Name&lt;/strong&gt;. In the &lt;strong&gt;Template&lt;/strong&gt; field, search for the &lt;strong&gt;S3 Put&lt;/strong&gt; event and select it. Then make the following changes in the Event JSON:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Replace &lt;code&gt;awsRegion&lt;/code&gt; with the region you are using (ap-south-1 in this case).&lt;/li&gt;
&lt;li&gt;Replace the bucket &lt;code&gt;name&lt;/code&gt;, &lt;code&gt;example-bucket&lt;/code&gt;, with the actual source bucket name (&lt;code&gt;s3-demo-source&lt;/code&gt; in this case).&lt;/li&gt;
&lt;li&gt;Replace the &lt;code&gt;key&lt;/code&gt; with the key copied from the sample image uploaded to the source bucket, and click on &lt;strong&gt;Save&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
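&lt;p&gt;The three edits above amount to the following trimmed event (bucket name and object key are the hypothetical values from this post):&lt;/p&gt;

```javascript
// Trimmed S3 Put test event with the three fields substituted.
const testEvent = {
  Records: [
    {
      eventSource: "aws:s3",
      awsRegion: "ap-south-1",               // your bucket's region
      eventName: "ObjectCreated:Put",
      s3: {
        bucket: { name: "s3-demo-source" },  // your source bucket name
        object: { key: "sample-image.png" }, // key of the uploaded sample (hypothetical)
      },
    },
  ],
};

console.log(JSON.stringify(testEvent, null, 2));
```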

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sex8ef2jlwnhi7sh9st.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sex8ef2jlwnhi7sh9st.png" alt="Test Event" width="800" height="462"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on &lt;strong&gt;Configuration&lt;/strong&gt; &amp;gt; &lt;strong&gt;Environment Variables&lt;/strong&gt; and add a &lt;code&gt;DESTINATION_BUCKET&lt;/code&gt; variable with the destination bucket name.&lt;/li&gt;
&lt;/ul&gt;
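&lt;p&gt;The handler resolves this variable with a fallback to the source bucket, which is what makes the recursion warning earlier necessary when the variable is unset. A tiny sketch of the lookup (values are hypothetical):&lt;/p&gt;

```javascript
// How the handler resolves the destination bucket (values are hypothetical).
process.env.DESTINATION_BUCKET = "s3-demo-destination";

const sourceBucket = "s3-demo-source";
// Falls back to the source bucket when DESTINATION_BUCKET is not set
const destBucket = process.env.DESTINATION_BUCKET || sourceBucket;

console.log(destBucket); // "s3-demo-destination"
```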

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fznl1qhokaiy5qiyhiidg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fznl1qhokaiy5qiyhiidg.png" alt="Adding Environment Variables" width="800" height="159"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the &lt;strong&gt;Test&lt;/strong&gt; tab and click on &lt;strong&gt;Test&lt;/strong&gt; to run the test. You can also view the logs printed by the code in CloudWatch by clicking the &lt;strong&gt;CloudWatch Logs Live Trail&lt;/strong&gt; button.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuy6e8wdk1j3b72nghcfl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuy6e8wdk1j3b72nghcfl.png" alt="Test tab" width="800" height="146"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Now go to the destination bucket; the thumbnail generated for the sample image should be there.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For end-to-end testing, go to the source bucket and upload a sample image or video file. Observe the Lambda logs in CloudWatch; once the invocation completes, the thumbnail file will be present in the destination bucket.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Final Thoughts
&lt;/h3&gt;

&lt;p&gt;A basic Lambda function to generate thumbnails is now ready. It can be further optimized and customized to generate thumbnails of specific sizes, or to capture video thumbnails at a specific timestamp using the ffprobe command. Also, if you want to use the same bucket as both source and destination, use the Suffix or Prefix option while setting the Lambda trigger to prevent recursive execution of the function. Let me know down below if you face any issues while implementing this.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Thank you for reading this long article, I hope it helps✌️.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Happy Coding!&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>node</category>
      <category>javascript</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
