DEV Community

Discussion on: The myth of "never going back to fix it later"

Matthieu Cneude

Totally agree on that. If you have a team who cares about what they're doing, they will refactor, unless the company creates an environment where it's difficult: constant death marches, short deadlines for the "good stress", and so on.

If the team itself doesn't want to refactor anything, somebody should look at the hiring process (which is, I think, less than perfect in our industry).

Jason C. McDonald

Right on! :) I regularly remind my team that refactoring and bugs are a natural (and necessary) part of the development process.

Although there are exceptions, "you always throw the first one away" tends to be largely valid. The first attempt at solving a problem is often an effective proof-of-concept, during the building of which the team comes to better understand the hidden problems and snarls that will need to be solved. There's always a canyon of difference between "working code" and "good code", but it's hard to write "good code" for a problem that isn't nailed down beyond its original specification.

But once the first attempt is reasonably functional, or even if it hits an impenetrable roadblock, the team is in a better headspace to then go back and refactor it. Workarounds are replaced with better matching design decisions. Bugs are fixed as a side effect of improved architecture. Because the problem is better defined, so is the solution.

Even once a "good" version is shipped, the code is never truly "done". I really try to emphasize making design decisions that leave room for later improvement: DRY techniques, SOLID classes, separation of concerns, pure functions, loose coupling to implementation.
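As a rough sketch of what "leaving room for later improvement" can look like in practice (the `MessageStore` example and all names here are invented for illustration, not from the discussion): callers depend on an abstraction rather than a concrete implementation, so the first, simplest implementation can be swapped out later without touching the rest of the code.

```python
from abc import ABC, abstractmethod

class MessageStore(ABC):
    """Abstraction the rest of the code depends on (loose coupling)."""
    @abstractmethod
    def save(self, message: str) -> None: ...
    @abstractmethod
    def all(self) -> list[str]: ...

class InMemoryStore(MessageStore):
    """First, simplest implementation; easy to replace with a
    database-backed store later without changing any callers."""
    def __init__(self) -> None:
        self._messages: list[str] = []
    def save(self, message: str) -> None:
        self._messages.append(message)
    def all(self) -> list[str]:
        return list(self._messages)

def shout_all(store: MessageStore) -> list[str]:
    # Depends only on the MessageStore interface, and has no side
    # effects of its own, so it survives a storage refactor unchanged.
    return [m.upper() for m in store.all()]

store = InMemoryStore()
store.save("hello")
store.save("world")
print(shout_all(store))  # ['HELLO', 'WORLD']
```

The point isn't the specific classes; it's that the seam between "what we store" and "how we store it" is already in place when the inevitable refactor arrives.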

(And yes, it definitely helps to have a team that's willing to engage with this process. We've gotten better at finding those people in the last few years.)

Matthieu Cneude

Totally agree ;)

As you said, the knowledge you have of a problem will grow over time. I think that's a major argument for coming back to your past misconceptions and trying to improve on them. Delaying important decisions as long as you can is, I think, a good strategy too.

Jason C. McDonald

While it's true that "premature optimization is the root of all evil," there's certainly a lot to be said for leaving room for optimization.

In other words, while delaying important decisions, one should always be certain they aren't obstructing said future decisions! It's a tricky balance, of course, but what isn't?

Matthieu Cneude

"premature optimization is the root of all evil" is more about performance.

Jason C. McDonald

I know, but the idea ports to other forms of optimization, such as scalability to a load many times greater than any realistic scenario, adding support for extensions to a single-purpose tool, and so forth. I see the Java habit of "always use double," even when the data will never need to store more than a single decimal place, as an example of this too.