I want to agree with such an Oath, but some of the points just bother me.
2 & 3) The code I produce is constrained by the resources provided to me in the project. Blindly seeking perfection is a sure way to sink a project. Everything is a compromise, and that includes code quality. We shouldn't write crap code with no tests, but neither should we attempt to write ideal code with perfect tests.
8) I kind of have this feeling that if a business depends on accurate programming estimates they're already off to a bad start. I agree we should try to make reasonable estimates, but we should also have a process that can deal with inaccuracies.
1) Is a moral statement that would obviously conflict with the work of anybody writing software in the defense industry.
Thanks for your input! You're actually one of the people who inspired me to write this, after I read 'I'm proud to be a programmer'.
I think I agree on the 'harmful code' point. This is where making a summary of the video actually harms the point :D. I apologise for that.
I guess it comes down to each person's judgement. One could focus on not producing harmful code for your customers, or your fellow developers, or your organisation. Your example about the defense industry is a good one. We could also add programmers working for the government who are asked to write code enforcing a law that could be considered 'harmful' to certain citizens. I'm sure we could find many examples like yours where the 'harmful' part is only a matter of perception.
Bob Martin draws a distinction between code defective in behavior and code defective in structure. I honestly don't fully understand the difference between the two. I'm sure someone more experienced than me could explain it.
As you said, it really depends on the environment in which you work. And I believe we could create many different statements and oaths to make ourselves feel better. But in the end, if the entire community of programmers (or a big chunk of it) doesn't believe in such an oath, management won't believe in it either, and nothing will be done.
Maybe we can agree that "harmful" means something counter to the interests of the primary stakeholders. I think that's the primary intent. Of course it doesn't prevent virus writers, since their primary stakeholders are the scum who hire them. It's a hard point to make. It has the same problems as "do no harm" in the Hippocratic oath for doctors: well meant, but fuzzy.
Defective in behaviour likely means the code doesn't fulfill its requirements. Defective in structure probably means it fulfills its requirements but has problems at the code level, possibly even failing certain non-functional requirements.
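To make that distinction concrete, here's a toy sketch (my own hypothetical functions, not anything from the talk). The first is defective in behaviour: it returns the wrong answer. The second is defective in structure: it returns the right answer, but with duplicated logic that makes it painful to change.

```python
# Defective in behaviour: the code does not do what it claims to do.
def average_broken(values):
    return sum(values) / (len(values) + 1)  # off-by-one bug: wrong result

# Defective in structure: the result is correct, but the code duplicates
# the summing logic, so any change must be made in two places.
def report(values):
    total = 0
    for v in values:
        total = total + v
    mean = total / len(values)
    total2 = 0
    for v in values:          # duplicated loop, same work again
        total2 = total2 + v
    return "mean=" + str(mean) + " sum=" + str(total2)

print(average_broken([2, 4]))  # 2.0, though the true mean is 3.0
print(report([2, 4]))          # correct output, ugly internals
```

The point being: tests catch the first kind of defect, but only review (or painful maintenance later) catches the second.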
Note, I agree with the other points. I wish everybody cared a lot about the work they do.
In programming, there is always a strong back-and-forth between what people strive for and what they end up with because of reasonable tradeoffs. I find folks like Bob Martin usually concede that this is clearly the case, but they don't always do a good job of acknowledging that reality in the initial statement.
The open/closed principle is an example of a concept I find to be a nice ideal, but none of my work has really come close to complying with it, because predicting the future is hard. So I've found it a bit unhelpful, in that it is rarely described as a good idea that may not pan out in practice.
Perhaps some of these concepts would be best understood if the room for error was more baked into the initial wording rather than something tacked on when discussing the matter in practical terms.