Second Part: link. That part contains the code and explanation.
During my internship with YellowClass, I got the chance to develop an interesting project. Little did I know that this project would end up feeling like a hackathon. In this post, I'll be sharing the journey I went through while developing it. From the initial architecture and the thought process behind it, to the challenges I faced and how I overcame them, I'll be detailing my experiences step by step. So, buckle up and join me as I take you through my journey of developing this exciting project.
How it started
It was a Friday evening around 7 PM when our team manager brought up an interesting idea: the possibility of creating an Instagram-style auto-play section for the reels on our app's home page. Instead of displaying a static image, each reel would showcase a short snippet of its content, enticing the user to click and watch the full reel. This sparked a conversation between the app team and the product managers as they brainstormed and debated the best approach for implementing the new feature. I was sitting there working on the Jira tickets assigned to me while listening in on the whole conversation (not very focused on work that day 😅).
As everyone discussed the possibility of an auto-play section for the reels on our app's home page, the team began brainstorming different ways to display short snippets of content. Someone suggested using GIFs, since that would require no app change, but there was a catch: they had tried GIFs before and found that the converted files were too large for our needs. I began exploring different Python libraries to see if I could find a way to create smaller, more compressed GIFs. After some experimentation, I discovered a library that let me tweak the right parameters and achieve the desired result: moviepy. My initial code was a simple snippet that took a video path and created a GIF from the first 5 seconds of the video. I shared this with my manager and the engineering director, and to our delight, the snippet was able to generate GIFs under 1 MB for many of our videos. While we still needed to make some tweaks to get under the 500 KB limit, we finally had a viable solution. Best of all, since it was a script, we could simply feed it the links to all of our reels and upload the resulting GIFs to our AWS S3 bucket without involving the content team, saving them a week of work.
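The exact code is in part 2, but here is a minimal sketch of what that first snippet might have looked like. The function names, the 10 fps default, and the output-path convention are my illustrative choices, and it assumes the moviepy 1.x API (`VideoFileClip.subclip` / `write_gif`):

```python
import os


def gif_path_for(video_path):
    """Derive the output GIF name from the video file name."""
    return os.path.splitext(video_path)[0] + ".gif"


def make_gif(video_path, start=0, duration=5, fps=10):
    """Write a GIF of the first `duration` seconds of the video."""
    # Assumed dependency: pip install moviepy (1.x API)
    from moviepy.editor import VideoFileClip

    clip = VideoFileClip(video_path).subclip(start, start + duration)
    clip.write_gif(gif_path_for(video_path), fps=fps)
    clip.close()
```

Lowering `fps` (and trimming `duration`) is the main lever for shrinking the output, which becomes important later.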
With the decision made to use GIFs as our solution, it was time to create a roadmap for the project. It was already 8 PM, and most of the office had emptied out for the day. We knew we needed to act quickly to make progress. The project was divided into two distinct steps. The first was to replace the existing static images for the reels with the newly generated GIFs. The second was to automate the process of generating and storing a GIF for any new video uploaded to the app, which would be a more complex undertaking.
With step one of the project in motion, our manager and engineering director began writing a query to export the reels and their upload paths into a proper CSV, and worked on changing the backend to accommodate the new GIF format. Meanwhile, I was tasked with writing a Python script that would read the CSV, generate a GIF for each video, and upload it to our S3 bucket. To ensure the GIFs were compressed to the desired size, we added a step that retried the conversion 3-4 times whenever a particular GIF came out larger than 500 KB, reducing the frame rate on each attempt until the file was within the desired size range. We also knew we would need to make the script multithreaded in order to generate GIFs for the approximately 900 reels in a reasonable amount of time. It was a complex task, but we were determined to see it through to the end.
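The pipeline described above (CSV in, size-checked GIFs out, threaded uploads) can be sketched roughly like this. Everything here is an assumption on my part: the bucket name, the CSV column name, the 8-worker pool, and the frame-rate ladder are all placeholders, and it presumes moviepy 1.x plus boto3 with AWS credentials already configured:

```python
import csv
import os
from concurrent.futures import ThreadPoolExecutor

MAX_GIF_BYTES = 500_000  # the 500 KB target from the post
BUCKET = "my-reels-bucket"  # placeholder bucket name


def fps_ladder(initial_fps=10, retries=4, step=2):
    """Frame rates to try, highest first; each retry drops the fps."""
    return [max(initial_fps - i * step, 1) for i in range(retries)]


def convert_with_retries(video_path, gif_path):
    """Regenerate the GIF at decreasing frame rates until it fits."""
    from moviepy.editor import VideoFileClip  # assumed: moviepy 1.x

    clip = VideoFileClip(video_path).subclip(0, 5)
    try:
        for fps in fps_ladder():
            clip.write_gif(gif_path, fps=fps)
            if os.path.getsize(gif_path) <= MAX_GIF_BYTES:
                return True
    finally:
        clip.close()
    return False  # still too large after every retry


def process_row(row):
    """One CSV row -> one GIF -> one S3 upload."""
    import boto3  # assumed: credentials configured in the environment

    gif_path = os.path.splitext(os.path.basename(row["video_path"]))[0] + ".gif"
    if convert_with_retries(row["video_path"], gif_path):
        boto3.client("s3").upload_file(gif_path, BUCKET, gif_path)


def run(csv_path, workers=8):
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # ~900 reels: a thread pool keeps the wall-clock time manageable,
    # since most of the time is spent in I/O (reading video, uploading).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pool.map(process_row, rows)
```

The key idea is the retry ladder: rather than guessing one frame rate that works for every video, each GIF is rewritten at progressively lower fps until it drops under the size limit.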
After an hour of hard work, our script, CSV, and backend were finally ready to be put to the test. With a deep breath and a sense of excitement, we ran the script. Two hours was the estimated time to complete the task, and we eagerly watched the number of generated GIFs climb higher and higher on the large screen in front of us. While we waited, we took a break to order and enjoy a much-needed dinner, our eyes glued to the screen the entire time. The anticipation was almost too much to bear. But finally, the script finished, and we could see the results of our hard work displayed in front of us.
With eager anticipation, I opened the app and navigated to the home page to see the results of our hard work. But to my surprise and shock, none of the videos were displaying GIFs. My heart sank, and I was overcome with disappointment. Then my manager opened his phone, and to my relief, every video was displaying a GIF. I quickly realized that we were in the midst of an A/B test, and that our hard work had indeed paid off. We tested everything thoroughly and found it all working perfectly. Finally, as the clock approached midnight, we wrapped up our hackathon. Now we could monitor the click rates on reels over the weekend and make informed decisions based on the data we collected.
This post is already running longer than I expected 😅. In part 2, I will share the code snippets as well as the working process.
Top comments (2)
@devtonic it's written with the help of ChatGPT. I gave it what I wanted to convey and it returned a well-constructed paragraph. For example, on typing "convert this string to convey a story 'string'", it returned a paragraph written much better than I could write even with Grammarly.
This post really doesn't look like it was written by ChatGPT. Right, guys? Right...?