This week, I focused on improving dev-mate-cli by adding robust testing with Jest, a popular JavaScript testing framework. The goal was to enhance reliability and encourage contributions by ensuring new changes align with the project’s standards. Here’s a breakdown of the tools I used, the setup process, lessons learned, and how testing improved the code.
mayank-Pareek / dev-mate-cli
A command line tool to quickly document your code
dev-mate-cli
A command-line tool that leverages OpenAI's Chat Completion API to document code with the assistance of AI models.
Watch this Demo video to view features.
Features
- Source Code Documentation: Automatically generate comments and documentation for your source code.
- Multiple File Processing: Handle one or multiple files in a single command.
- Model Selection: Choose which AI model to use with the `--model` flag.
- Custom Output: Output the results to a file with the `--output` flag, or display them in the console.
- Stream Output: Stream the LLM response to the command line with the `--stream` flag.
Usage
Basic Usage
To run the tool, specify one or more source files or folders as input:
npm start ./examples/file.js
For processing multiple files:
npm start ./examples/file.js ./examples/file.cpp
For processing folders:
npm start ./examples
Command-line Options
Note: Use npm start -- -option to pass the option flag to the program, as npm…
Why Jest?
I chose Jest for its popularity, TypeScript compatibility, powerful mocking capabilities, and large community. Jest is widely adopted in the JavaScript/TypeScript ecosystem and provides built-in support for many essential features like snapshot testing, coverage reports, and a simple setup. Since I already had some experience with Jest in JavaScript, I was excited to deepen my understanding by integrating it with my project that uses TypeScript and ESLint.
Setting Up Jest with TypeScript and ESLint
Configuring Jest to work smoothly with TypeScript and ESLint was more complex than I anticipated. I first installed Jest as a dev dependency and added three scripts to package.json:
"test": "jest -c jest.config.js --runInBand --",
"test:watch": "jest -c jest.config.js --runInBand --watch --",
"coverage": "jest -c jest.config.js --runInBand --coverage"
For the initial test case, I picked a utility function, `checkIterable`, that checks whether an AI response can be streamed. Writing tests for it revealed a bug: it didn't handle null inputs properly, returning null instead of false. I corrected this by ensuring it always returns a boolean. This first success confirmed that tests can effectively catch errors in code logic, saving time later in development.
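To make the fix concrete, here is a minimal sketch of the helper and its first test. The real `checkIterable` in dev-mate-cli may be implemented differently; this only illustrates the null-handling fix described above.

```typescript
// checkIterable.ts: sketch of the fixed helper (the project's real code may differ).
// Wrapping the check in Boolean() guarantees a boolean return, even for
// null/undefined inputs, instead of leaking a null value to the caller.
export function checkIterable(value: unknown): boolean {
  return Boolean(
    value != null &&
      typeof (value as AsyncIterable<unknown>)[Symbol.asyncIterator] === 'function',
  );
}

// checkIterable.test.ts
import { checkIterable } from './checkIterable';

describe('checkIterable', () => {
  it('returns false (not null) for null input', () => {
    expect(checkIterable(null)).toBe(false);
  });

  it('returns true for an async iterable such as a streamed response', () => {
    async function* fakeStream() {
      yield 'chunk';
    }
    expect(checkIterable(fakeStream())).toBe(true);
  });
});
```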
Mocking AI Responses with Jest
The biggest challenge was simulating AI responses from OpenAI's API. Initially, I tried using nock to intercept HTTP requests, but while researching I found that OpenAI had mocking capabilities for its constructor, which let me use captured JSON responses to simulate interactions with the model.
I structured the main `ai.test.ts` to cover three essential cases: a basic AI response, API errors, and missing environment variables. With `jest.spyOn`, I could capture console logs and errors, confirming that messages were displayed as expected for both successful and failed API calls. This approach allowed me to identify gaps in error handling.
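To show roughly what this looks like, here is a condensed sketch of the pattern: the OpenAI client is replaced by a hand-built fake (rather than reproducing the constructor-level mocking exactly), and `jest.spyOn` captures console output. `generateDocs`, the fake responses, and the environment-variable default are illustrative stand-ins, not the project's actual code.

```typescript
// ai.test.ts (sketch): the real suite in dev-mate-cli differs in detail.
import OpenAI from 'openai';

// Stand-in for the function under test: asks the model for docs and logs the result.
async function generateDocs(client: OpenAI, source: string): Promise<void> {
  try {
    const completion = await client.chat.completions.create({
      model: process.env.MODEL ?? 'gpt-4o-mini', // illustrative default model
      messages: [{ role: 'user', content: `Document this code:\n${source}` }],
    });
    console.log(completion.choices[0].message.content);
  } catch (err) {
    console.error('API error:', err);
  }
}

describe('generateDocs', () => {
  // Capture console output so the tests can assert on user-facing messages.
  const logSpy = jest.spyOn(console, 'log').mockImplementation(() => {});
  const errorSpy = jest.spyOn(console, 'error').mockImplementation(() => {});

  afterEach(() => jest.clearAllMocks());

  it('logs the model response for a basic AI response', async () => {
    const fakeClient = {
      chat: {
        completions: {
          create: jest.fn().mockResolvedValue({
            choices: [{ message: { content: '/** documented */' } }],
          }),
        },
      },
    } as unknown as OpenAI;

    await generateDocs(fakeClient, 'const x = 1;');
    expect(logSpy).toHaveBeenCalledWith('/** documented */');
  });

  it('logs an error when the API call fails', async () => {
    const fakeClient = {
      chat: {
        completions: { create: jest.fn().mockRejectedValue(new Error('boom')) },
      },
    } as unknown as OpenAI;

    await generateDocs(fakeClient, 'const x = 1;');
    expect(errorSpy).toHaveBeenCalled();
  });
});
```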
Lessons Learned
While I had limited experience with testing, setting up Jest for TypeScript and experimenting with AI response mocking deepened my understanding of automated testing. This process was a great reminder of how thorough testing improves code quality and promotes a smoother contribution experience. Going forward, I’m committed to making testing a core part of my development workflow and look forward to expanding the test suite as `dev-mate-cli` evolves.