
Adding Automated Testing to My CLI Tool

How I Set Up the Testing Environment

I chose googletest for my project. Here's my commit. During Hacktoberfest, I explored a lot of repos that already had test cases in order to reduce the miserable side of software: crashing. I had previous experience with Jest, a JavaScript testing framework, but this was my first time using a testing framework in C++.

Setup Process

First, I created a tests folder and configured the root CMakeLists.txt. The official user guide provided most of the information I needed for setup and implementation. Next, I configured tests/CMakeLists.txt to avoid writing repetitive CMake code for each test file.

Understanding Different Test Types

This section in the user guide gave me a great overview of the different test types Google Test has.

1. TEST - Basic Test

This is for simple tests that don't need any setup or shared state. The syntax is TEST(TestSuiteName, TestName). For the getLanguageExtension() function I chose TEST because it's a pure function that doesn't create files or modify global state. Here's one of the samples:

TEST(UtilsLanguageTest, DifferentLanguages) {
    EXPECT_NE(getLanguageExtension(".unknown"), "cpp");
    EXPECT_EQ(getLanguageExtension(""), "text"); 
    // the remaining cases follow the same structure ...
}
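To make it clear why no fixture is needed here: the function's output depends only on its input. Here's a minimal sketch of what such a mapping could look like; the signature and the exact extension table are my assumptions for illustration, not the real implementation in my repo:

#include <map>
#include <string>

// Sketch only: map a file extension to a language name,
// falling back to "text" for anything unknown or empty.
std::string getLanguageExtension(const std::string& ext) {
    static const std::map<std::string, std::string> languages = {
        {".cpp", "cpp"}, {".h", "cpp"}, {".py", "python"}, {".js", "javascript"}};
    auto it = languages.find(ext);
    return it != languages.end() ? it->second : "text";
}

Because there are no files, no globals, and no setup to share, plain TEST is enough.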

2. TEST_P - Parameterized Test

The function I tested:

bool FilterManager::isMatchingFilters(const std::filesystem::path& filepath) const

I chose TEST_P because this function is complex and needed to be tested with many different input combinations. To avoid repetitive code in filter_test, I created a test data structure:

struct TestCase {
    std::string fileName;
    std::string includePattern;
    std::string excludePattern;
    bool expectedResult;
};

Also, in this module I used SetUp() to run before each test:

void SetUp() override {
    std::ofstream("test.txt") << "content";
}

I used TearDown(), which cleans up the files after each test runs:

void TearDown() override {
    std::filesystem::remove("test.txt");
}

These functions are really useful because they keep me from repeating the same setup and cleanup in every test case. Finally, I used INSTANTIATE_TEST_SUITE_P() to provide the actual test data.
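Putting those pieces together, here's a minimal sketch of how the parameterized suite could look. It reuses the TestCase struct above; the fixture name FilterParamTest, the FilterManager constructor arguments, and the sample values are my assumptions for illustration, not the exact code from the repo:

#include <gtest/gtest.h>
#include <filesystem>
#include <fstream>

// Parameterized fixture: SetUp/TearDown run around every parameter value.
class FilterParamTest : public ::testing::TestWithParam<TestCase> {
protected:
    void SetUp() override {
        std::ofstream("test.txt") << "content";
    }
    void TearDown() override {
        std::filesystem::remove("test.txt");
    }
};

TEST_P(FilterParamTest, MatchesExpectedResult) {
    const TestCase& tc = GetParam();
    // Assumed constructor taking include and exclude pattern lists.
    FilterManager filters({tc.includePattern}, {tc.excludePattern});
    EXPECT_EQ(filters.isMatchingFilters(tc.fileName), tc.expectedResult);
}

// One call supplies all the data; googletest generates a test per TestCase.
INSTANTIATE_TEST_SUITE_P(
    FilterCases, FilterParamTest,
    ::testing::Values(
        TestCase{"test.txt", "*.txt", "", true},
        TestCase{"test.txt", "*.cpp", "", false},
        TestCase{"test.txt", "", "*.txt", false}));

Adding a new scenario is then just one more line in Values(), with no new test body to write.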

3. TEST_F - Test Fixture

The function I tested:

void writeFileStatistics(std::ostream& o, const std::filesystem::path& path, const cli::Options& opt)

This function requires real files and directories because it checks file existence and counts lines in actual files. I chose TEST_F because every test needed a test directory, test files with specific line counts, and guaranteed cleanup even on test failure. Here's one part of the file_stats test file:

TEST_F(RendererTest, CompleteValidation) {
    createTestFile(testDir / "test1.txt", 15);
    createTestFile(testDir / "test2.txt", 20);
    cli::Options options;
    std::ostringstream output;

    output::writeFileStatistics(output, testDir, options);
    std::string result = output.str();

    EXPECT_FALSE(result.empty());
    EXPECT_TRUE(result.length() > 30);
    EXPECT_LE(result.length(), 50);
}

The helper function createTestFile() makes creating test files simple and readable. The critical advantage of TEST_F is that TearDown() always runs, even when a test assertion fails, so no temporary files are left behind.
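For reference, here's a minimal sketch of what a fixture like RendererTest could look like. The temp-directory location and the createTestFile() body are assumptions based on how the test above uses them, not the exact code from the repo:

#include <gtest/gtest.h>
#include <filesystem>
#include <fstream>

class RendererTest : public ::testing::Test {
protected:
    std::filesystem::path testDir;

    void SetUp() override {
        // Fresh directory for every test.
        testDir = std::filesystem::temp_directory_path() / "renderer_test";
        std::filesystem::create_directories(testDir);
    }

    void TearDown() override {
        // Runs even when the test body fails, so nothing is left behind.
        std::filesystem::remove_all(testDir);
    }

    // Write a file with the requested number of lines.
    void createTestFile(const std::filesystem::path& path, int lineCount) {
        std::ofstream file(path);
        for (int i = 0; i < lineCount; ++i) {
            file << "line " << i << '\n';
        }
    }
};

Every TEST_F(RendererTest, ...) then starts with a clean directory and the helper ready to use.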

Conclusion

After implementing all my tests in the testing branch, I used git rebase -i to squash my commits before merging. I also updated the README with information about running the tests. By the end, I had successfully added comprehensive automated testing to my project. I'm planning to make testing a habit when adding new features.
