Two stories from the development annals of MousePaw Media that go right along with this:
(1) We needed to implement tests for C++, but I had two unusual requirements:
- Embed tests in production binaries, so they can be run on any machine without special frameworks or tools.
- An embedded benchmarker, to catch performance regressions.
I knew that the major testing frameworks didn't offer the first option; they required test harnesses and the like. I also knew that other, less ubiquitous libraries might have both features, but I didn't want to waste dozens or hundreds of hours researching all the obscure testing frameworks, only to take my chances learning something that probably lacked sufficient documentation. (Learning is non-transferable, so I'd have had to train the rest of my staff, mostly interns, on it too!)
Since I already knew what I wanted, I simply wrote my own testing framework for C++. Two years later, it's an investment I'm glad I made! I've been able to add features as we need them, and the benchmarker has proven itself incredibly useful in writing efficient code.
I've been mocked on occasion for writing a new testing framework, instead of just using something someone else wrote, but the payoff has been clear for me. Goldilocks is well-documented, easy to introduce my interns to, and well-suited to expansion as our testing needs evolve.
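To make the "embedded tests" idea concrete, here's a minimal sketch of the pattern. This is not Goldilocks's actual API; the registry and the `--run-test`/`--bench` flags are hypothetical names for illustration. The point is that the tests compile into the shipped binary and can be invoked by name:

```cpp
#include <chrono>
#include <cstring>
#include <functional>
#include <iostream>
#include <map>
#include <string>

// Hypothetical mini-registry: tests live in the production binary
// itself, so any machine with the binary can run them.
std::map<std::string, std::function<bool()>>& registry() {
    static std::map<std::string, std::function<bool()>> r;
    return r;
}

struct Register {
    Register(const std::string& name, std::function<bool()> fn) {
        registry()[name] = std::move(fn);
    }
};

// A trivial embedded benchmarker: time N repetitions of a test.
// (.at() throws on an unknown name; fine for a sketch.)
void benchmark(const std::string& name, int reps = 1000) {
    auto& fn = registry().at(name);
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < reps; ++i) fn();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(
        std::chrono::steady_clock::now() - start).count();
    std::cout << name << ": " << us / reps << " us/iter\n";
}

// Example test, registered at static-initialization time.
static Register t1("math.add", [] { return 2 + 2 == 4; });

int main(int argc, char** argv) {
    if (argc == 3 && std::strcmp(argv[1], "--run-test") == 0) {
        bool ok = registry().at(argv[2])();
        std::cout << argv[2] << (ok ? " PASSED\n" : " FAILED\n");
        return ok ? 0 : 1;
    }
    if (argc == 3 && std::strcmp(argv[1], "--bench") == 0) {
        benchmark(argv[2]);
        return 0;
    }
    std::cout << "normal production behavior here\n";
}
```

The production binary then doubles as its own test runner: `./app --run-test math.add` works on any machine, with no framework installed.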
(2) On the flip side, we've also had cases where we started coding something, only to discover that we didn't need it after all!
We needed an XML parser for C++, so I assigned one of the interns to find one and get a proof-of-concept put together. (For anyone unfamiliar with the idea, this sort of proof-of-concept is a simple, bare-bones project that uses the target technology or library, and few or no other dependencies. It allows you to discover and solve basic implementation problems before you hit them in production, which is a huge time saver!)
The intern researched the topic, found that Xerces is almost ubiquitously recommended, and tried to work with it. It was obtuse, overpowered, and poorly documented. He checked for alternatives, as did another staff member and I, and Xerces seemed to be it, so we decided to create a wrapper for that library that would simplify its use.
About a year into it, we discovered pugixml, a library that had utterly escaped our research and could do everything we needed with a sane, well-documented API. We dropped Xerces and ended the wrapper project at that point.
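For contrast, this is roughly what a pugixml proof-of-concept looks like. The file name and element names here are made up, but `load_file`, `child`, `children`, and `attribute` are the real API:

```cpp
#include <iostream>
#include "pugixml.hpp"

int main() {
    pugi::xml_document doc;
    pugi::xml_parse_result result = doc.load_file("config.xml");
    if (!result) {
        std::cerr << "Parse error: " << result.description() << '\n';
        return 1;
    }
    // Missing nodes return empty handles instead of throwing,
    // so chained lookups are safe.
    for (pugi::xml_node tool : doc.child("profile").children("tool")) {
        std::cout << tool.attribute("name").as_string()
                  << " timeout=" << tool.attribute("timeout").as_int(30) << '\n';
    }
    return 0;
}
```

A handful of lines and no setup ceremony; that's the whole proof-of-concept.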
I spent a lot of effort getting the Boost test framework to function the way I wanted in C++. I don't think it was worth it in the end. In another project I wrote my own, and it went quicker and more smoothly.
These are absolutely the types of situations I'm talking about. They are more common than one might think.
But I curse you for bringing up the name Xerces again. I had successfully erased memory of that project, and now it's back!