I recently completed a small project in which I used AWS resources to automate transcoding of 4K video files. With a combination of S3, Lambda, an Elastic Transcoder pipeline, and SNS, the whole task runs hands-free: once a file is uploaded to an S3 source bucket, it is converted and saved to a transcoded-output bucket, and SNS sends notifications when the process begins and when it completes. Below are the steps I took, along with my findings once finished.
- Before beginning, I created the initial resources needed to get things started. First, I created three separate S3 buckets and attached bucket policies to allow public read access: one bucket for the uploaded source content, one for the completed transcoded files, and one for saved thumbnail images.
- The next step was to create and subscribe to an SNS topic. For the subscription protocols I chose email and SMS. Once the topic was created, I added the subscriptions and verified my email address. You can also publish a test message through SNS to verify that every endpoint is receiving notifications.
- With the initial resources set up, it was time to create my Elastic Transcoder pipeline. I named it, set the input bucket to my S3 source bucket, selected my transcoding-completed bucket for transcoded files, and my thumbnails S3 bucket for thumbnails. With the pipeline created, I made note of its pipeline ID for later use.
- Now on to the Lambda function. I created the function from scratch with Python 3.7 (using the Boto3 SDK) as the runtime, attached a custom execution role and policy, and created the function.
- At this point I configured an S3 trigger for the function, using my source bucket as the source and selecting "All object create events". Essentially, this tells Lambda that a file has been added and the function has a task to complete. I then added the function code into the editor, saved it, and added an environment variable with the key "PIPELINE_ID" and the actual pipeline ID as its value.
- Now it's finally time to test. I downloaded a few short 4K clips and uploaded the first to my S3 source bucket. I was notified via text and email that a transcoding job had begun, and again when it completed. I then checked my transcoding-completed bucket and my thumbnails bucket and verified the output files were there. With everything working properly, my simple transcoding project was complete. As a last check, I reviewed my CloudWatch logs and confirmed the function's execution there as well.
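To make the bucket-policy step concrete, here is a minimal sketch of a public-read policy built in Python. The bucket name and the `public_read_policy` helper are my own illustrative inventions, not part of the original project:

```python
import json

def public_read_policy(bucket_name):
    """Build an S3 bucket policy granting public read access to all objects."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                # The /* suffix scopes the statement to objects in the bucket
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

# Attaching it would look like this (requires AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_policy(Bucket="my-transcoder-source",
#                      Policy=json.dumps(public_read_policy("my-transcoder-source")))
```

The same policy can of course be pasted directly into the bucket's Permissions tab in the console, which is how I did it.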
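The SNS test message mentioned above can also be published with Boto3 rather than through the console. A small sketch, where the topic ARN is a placeholder and the `notification` helper is my own illustration:

```python
def notification(topic_arn, job_state, object_key):
    """Build the Publish arguments for a transcoding status message."""
    return {
        "TopicArn": topic_arn,
        "Subject": f"Transcoding {job_state}",  # Subject is used by email endpoints
        "Message": f"Elastic Transcoder job for '{object_key}' is {job_state}.",
    }

# Sending it requires credentials and a real topic ARN:
# import boto3
# boto3.client("sns").publish(**notification(
#     "arn:aws:sns:us-east-1:123456789012:transcode-updates",
#     "COMPLETED", "clips/sample-4k.mp4"))
```

SMS subscribers receive only the `Message` body; the `Subject` is ignored for that protocol.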
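The function code itself boils down to reading the uploaded object's key from the S3 event and submitting an Elastic Transcoder job. Below is a minimal sketch of such a handler; the output naming scheme, the `build_job` helper, and the preset ID (which I believe is the "Generic 1080p" system preset, but verify against your own account) are my assumptions, not the author's exact code:

```python
import os
from urllib.parse import unquote_plus

def build_job(pipeline_id, source_key):
    """Build the create_job arguments for a newly uploaded source file."""
    base, _, _ = source_key.rpartition(".")
    stem = base or source_key  # fall back to the full key if there's no extension
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": source_key},
        "Outputs": [{
            "Key": stem + "-web.mp4",
            # Assumed system preset ID ("Generic 1080p"); substitute your own
            "PresetId": "1351620000001-000001",
            # Elastic Transcoder requires {count} in the thumbnail pattern
            "ThumbnailPattern": stem + "-{count}",
        }],
    }

def lambda_handler(event, context):
    import boto3  # available in the Lambda Python runtime
    # S3 object keys arrive URL-encoded in the event record
    key = unquote_plus(event["Records"][0]["s3"]["object"]["key"])
    transcoder = boto3.client("elastictranscoder")
    job = transcoder.create_job(**build_job(os.environ["PIPELINE_ID"], key))
    return job["Job"]["Id"]
```

The output and thumbnail buckets don't appear here because the pipeline itself already knows them; the job only needs keys relative to those buckets.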
This was a fun little project, and the same pattern could be just as helpful for image conversion. Lambda is a great fit for tasks like this, and I plan to dive deeper with more complicated workflows. Thanks for reading!
Jerry C. Mullis