In my previous post, I talked about starting an Open-Source GenAI-based terminal application. This week, the task was to contribute a new feature to another user's project. Since we had to collaborate with someone new, I teamed up with Lily, who developed an app with code improvement features similar to mine—except hers has a rat persona!
Feel free to check out her project rat-assistant when you have some time.
Her code is written in TypeScript, and to be honest, I'm not that experienced with it. I was a bit anxious about adding new features without accidentally breaking something. It's funny: TypeScript feels much harder to me than other OOP languages like Java or C++. But I figured this was a good learning opportunity, so I decided to dive in.
Our goal was to add a new option (-t) to display token usage for both the response and the prompt. So, I started by opening an issue on her repo to outline the feature, then forked the project to work locally.
The app uses the Groq API for its LLM features, and fortunately there was an easy way to get token counts: the "usage" field in the API response.
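The original snippet isn't shown here, but the idea looks roughly like this in TypeScript. Groq's chat completion responses are OpenAI-compatible, so the usage object carries `prompt_tokens`, `completion_tokens`, and `total_tokens`; the `formatTokenUsage` helper is a hypothetical name I'm using for illustration:

```typescript
// Shape of the "usage" field on a Groq (OpenAI-compatible)
// chat completion response.
interface TokenUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Hypothetical helper: format the usage stats for display
// after the AI response.
function formatTokenUsage(usage: TokenUsage): string {
  return [
    `Prompt tokens:     ${usage.prompt_tokens}`,
    `Completion tokens: ${usage.completion_tokens}`,
    `Total tokens:      ${usage.total_tokens}`,
  ].join("\n");
}

// In the real app the usage object comes from the SDK call, e.g.:
//   const completion = await groq.chat.completions.create({ model, messages });
//   console.log(formatTokenUsage(completion.usage));
console.log(
  formatTokenUsage({ prompt_tokens: 42, completion_tokens: 128, total_tokens: 170 })
);
```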
Since the app used yargs for command-line arguments, adding another option was fairly simple. I made it so that, if the user specified -t or --token-usage, the app would display token information at the end of the output along with the AI response. I tested it a few times to make sure it didn't break existing features, and once that was confirmed, I pushed the code to my fork and opened a pull request.
It had been a while since I made a pull request, so I quickly googled the commands and discovered there's an easy way to do it through VS Code (seriously, where would I be without it?).
I added a brief explanation of the new feature and submitted the pull request.
A couple of hours later, I saw the notification for Lily's pull request on my repo. I quickly checked the code and tested it locally, and it worked great! I also checked whether it caused any issues with the other options, and it didn't. My app uses the Gemini API, which is different from the OpenAI Chat Completions API commonly used by others, but she still managed to make it work.
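The difference she had to bridge is mostly naming: Gemini reports token counts under `usageMetadata` with different field names than the OpenAI-style `usage` object. A small adapter (hypothetical, for illustration) is enough to map one shape onto the other:

```typescript
// Gemini's generateContent responses report usage under "usageMetadata".
interface GeminiUsageMetadata {
  promptTokenCount: number;
  candidatesTokenCount: number;
  totalTokenCount: number;
}

// OpenAI/Groq-style usage shape, as used elsewhere in this post.
interface TokenUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Hypothetical adapter so both APIs feed the same display code.
function geminiUsageToTokenUsage(u: GeminiUsageMetadata): TokenUsage {
  return {
    prompt_tokens: u.promptTokenCount,
    completion_tokens: u.candidatesTokenCount,
    total_tokens: u.totalTokenCount,
  };
}

const mapped = geminiUsageToTokenUsage({
  promptTokenCount: 42,
  candidatesTokenCount: 128,
  totalTokenCount: 170,
});
console.log(mapped);
```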
I didn't find any issues or improvements to suggest, so I accepted her pull request and merged it into the main branch. It was pretty fun (and nerve-wracking) having someone contribute to my code, because you never know what to expect or whether they'll struggle with your runic code.
But everything went smoothly in the end, and it made me appreciate how large open-source projects collaborate and improve asynchronously through pull requests.