Introduction
The Linux Foundation Mentorship Program offers a dynamic 12-week internship where participants engage in hands-on projects while receiving a stipend. When I first applied for the LFX mentorship program, I was a novice in the field and had no idea how this experience would impact my professional growth. Looking back now, I can confidently say that my journey within the LFX program has been transformative, teaching me the power of perseverance and learning.
One of the main reasons I was drawn to the LFX mentorship program was that the projects mentees would work on were already defined before the application process began. This allowed me to choose a project that aligned with my interests and skill set.
Early Challenges and Overcoming Them
My initial applications to the LFX program back in 2022 were unsuccessful, which served as a stark reminder of the competitive nature of this program. However, I refused to let the setbacks deter me and immersed myself in various open-source communities. This period was crucial for me as I honed my skills, expanded my understanding, and became more comfortable diving into any codebase quickly. This groundwork enabled me to craft a compelling proposal for the LFX program in Fall 2023, leading to my acceptance into my first-choice project – a perfect match for an area I was looking forward to improving.
The Mentorship Experience
My mentors, Sayan Mondal and Saranya Jena, were instrumental in my journey, providing guidance and support throughout the program. Our weekly meetings became a cornerstone of my learning process, where I shared progress, discussed challenges, and received tasks for the upcoming week. These tasks focused primarily on adding unit tests for both the authentication server and the frontend, as well as incorporating new documentation for the ChaosCenter API.
After a quick adaptation period, I received my first task in the second week: refactoring the environment service package to an interface-based model and writing test cases for it, which I successfully accomplished. The satisfaction of having my first contributions merged was unparalleled.
Implementation
- Test Cases for Auth server
1. Modifying the code architecture to an interface model:
The first step in implementing test cases for the authentication server was to refactor the code architecture to an interface model. Defining the services behind interfaces made the code modular and easy to test, since each piece of code could be tested individually and verified to work as expected.
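As a minimal sketch of that refactor (the names below are illustrative, not the actual ChaosCenter code), a service is exposed through an interface so callers depend on the abstraction rather than the concrete implementation:

```go
package environment

import "context"

// Environment is an illustrative domain type.
type Environment struct {
	ID   string
	Name string
}

// Service abstracts the environment operations so handlers can depend
// on this interface instead of a concrete implementation.
type Service interface {
	GetEnvironment(ctx context.Context, id string) (*Environment, error)
	CreateEnvironment(ctx context.Context, env *Environment) error
}

// Store is an illustrative persistence interface, e.g. a database wrapper.
type Store interface {
	FindByID(ctx context.Context, id string) (*Environment, error)
	Insert(ctx context.Context, env *Environment) error
}

// service is the concrete implementation backed by a data store.
type service struct {
	store Store
}

// NewService wires the concrete service to its store.
func NewService(store Store) Service {
	return &service{store: store}
}

func (s *service) GetEnvironment(ctx context.Context, id string) (*Environment, error) {
	return s.store.FindByID(ctx, id)
}

func (s *service) CreateEnvironment(ctx context.Context, env *Environment) error {
	return s.store.Insert(ctx, env)
}
```

Because handlers only see the `Service` interface, a test can swap in a fake implementation without touching the real data layer.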
2. Creating a mock client of the services used by both the gRPC and REST APIs:
The second step was to create a mock client of the services used by both the gRPC and REST APIs, so that the test cases stayed isolated from the actual services. By creating a mock application service, I could simulate the behavior of the services without actually calling them.
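As a rough sketch of this idea, assuming the testify/mock package and an illustrative `GetUser` method (not the actual ChaosCenter service surface), the mock simply records calls and returns whatever the test configures:

```go
package mocks

import (
	"context"

	"github.com/stretchr/testify/mock"
)

// MockApplicationService simulates the application service so handler
// tests never touch the real database or gRPC server.
type MockApplicationService struct {
	mock.Mock
}

// GetUser is an illustrative method; the real service exposes the
// operations used by the REST and gRPC handlers.
func (m *MockApplicationService) GetUser(ctx context.Context, userID string) (string, error) {
	args := m.Called(ctx, userID)
	return args.String(0), args.Error(1)
}
```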
3. Writing test cases to improve test coverage:
The final step was to write test cases to improve coverage. For each test, I created a new instance of the MockApplicationService and passed in test data for different scenarios. One major issue I faced was that the service instance created from the mock was affected by the first scenario that ran, so at first I was forced to use one scenario per test case. Later, I figured out that each scenario should have its own instance of the MockApplicationService; from there, I was able to create more test cases and improve the coverage, as sketched below. Here is that particular PR
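Here is a minimal sketch of that fix, building on the illustrative mock above: each table-driven scenario constructs its own MockApplicationService, so expectations set in one case never leak into the next.

```go
package mocks

import (
	"context"
	"errors"
	"testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/mock"
)

func TestGetUser(t *testing.T) {
	testcases := []struct {
		name    string
		userID  string
		want    string
		wantErr error
	}{
		{name: "existing user", userID: "uid-1", want: "admin", wantErr: nil},
		{name: "unknown user", userID: "uid-2", want: "", wantErr: errors.New("user not found")},
	}

	for _, tc := range testcases {
		tc := tc
		t.Run(tc.name, func(t *testing.T) {
			// A fresh mock per scenario keeps state and expectations
			// from one case from affecting the next.
			svc := new(MockApplicationService)
			svc.On("GetUser", mock.Anything, tc.userID).Return(tc.want, tc.wantErr)

			got, err := svc.GetUser(context.Background(), tc.userID)

			assert.Equal(t, tc.want, got)
			assert.Equal(t, tc.wantErr, err)
			svc.AssertExpectations(t)
		})
	}
}
```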
- Test Cases for Frontend View Components
LitmusChaos uses a TestWrapper to encapsulate most of the dependencies that need to be mocked. For the first tests I wrote, I had to mock two main dependencies, the React Query provider and the ApolloClient, but to ensure consistency and simplicity across the test cases, I added them to the TestWrapper and used it to wrap each component under test.
- Incorporating API documentation for ChaosCenter API
For this, I decided to use an OpenAPI spec to automatically generate API documentation for each handler. I discovered a library called swaggo, which I used to define annotations covering everything from the handler description to its responses. I created a doc file that lists all responses and errors for each endpoint, and finally, I used the go-swagger library to generate the API documentation from the OpenAPI spec file produced by swaggo. Click here to learn more about the new API documentation of Litmus ChaosCenter
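As an illustrative example (not the actual ChaosCenter handlers), swaggo annotations live in a comment block above each handler, and the tooling then turns them into the OpenAPI spec and the generated docs:

```go
package handlers

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

// ErrorResponse is an illustrative error payload listed in the doc file.
type ErrorResponse struct {
	Error string `json:"error"`
}

// Environment is an illustrative response type.
type Environment struct {
	ID   string `json:"id"`
	Name string `json:"name"`
}

// GetEnvironment godoc
// @Summary      Get an environment
// @Description  Returns a single environment by its ID.
// @Tags         environments
// @Produce      json
// @Param        environment_id  path      string  true  "Environment ID"
// @Success      200  {object}  Environment
// @Failure      400  {object}  ErrorResponse
// @Router       /environments/{environment_id} [get]
func GetEnvironment(c *gin.Context) {
	id := c.Param("environment_id")
	if id == "" {
		c.JSON(http.StatusBadRequest, ErrorResponse{Error: "environment_id is required"})
		return
	}
	c.JSON(http.StatusOK, Environment{ID: id, Name: "staging"})
}
```

Once a new endpoint is annotated this way, regenerating the spec updates the documentation page without any manual editing.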
Achievements and Realizations
Throughout the program, I raised nine PRs, with eight successfully merged and one under review. My work increased the backend coverage from 0% to 25.48% and the frontend coverage from 0% to 14.56%. One of my key suggestions was implementing a new workflow to track test coverage in the ChaosCenter auth server. My contributions laid a foundation for future enhancements by other contributors.
Moreover, I was able to register all 34 REST endpoints, defining both their responses and return errors, which enables contributors to easily update the documentation page when a new endpoint is created simply by modifying the annotations. This simplified the documentation process and made it more accessible to newcomers and existing contributors.
Challenges Faced
The journey was not without its challenges. One of the major issues I encountered was writing positive test cases. I struggled to write positive tests for the gRPC and environment handlers. I tried different approaches to mocking most of the services used by the handlers, but after unsuccessful research on the internet, I asked the community for help. I was referred to several resources on how to mock handlers using Gin. All of these resources were super helpful and helped me write positive test cases and close this PR. That taught me that even though you have to figure things out on your own, it is OKAY to ask for help. It was my first time writing unit tests at this scale for a large project, but I learned, researched, and asked my mentors for help.
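For illustration, a minimal Gin handler test in the spirit of those resources might look like the sketch below, reusing the illustrative GetEnvironment handler from earlier; the real Litmus tests differ in their details:

```go
package handlers

import (
	"net/http"
	"net/http/httptest"
	"testing"

	"github.com/gin-gonic/gin"
	"github.com/stretchr/testify/assert"
)

func TestGetEnvironment(t *testing.T) {
	gin.SetMode(gin.TestMode)

	// Register the handler on a throwaway router.
	router := gin.New()
	router.GET("/environments/:environment_id", GetEnvironment)

	// Record the response without starting a real HTTP server.
	w := httptest.NewRecorder()
	req := httptest.NewRequest(http.MethodGet, "/environments/env-123", nil)
	router.ServeHTTP(w, req)

	assert.Equal(t, http.StatusOK, w.Code)
	assert.Contains(t, w.Body.String(), "env-123")
}
```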
What's Next?
After spending the whole fall working on this project, I plan to continue contributing to the testing strategy of Litmus ChaosCenter by helping new contributors who want to get involved in the project. I currently have one PR pending and testing-strategy documentation for the ChaosCenter frontend to add. After that, I will maintain existing test cases and work on new ones.
Conclusion
The LFX mentorship program has been the most rewarding experience of my year. It has transformed me from a beginner in open source to a confident contributor capable of making meaningful improvements. This experience enhanced my technical understanding of writing test cases, improved my Golang skills, and gave me deeper insight into the code I write. I have learned that anything is achievable with the right amount of energy and time. Don't hesitate to submit your proposal for next quarter.
Thanks for reading.
By Magnim Thibaut F. Batale