I think the issue isn't that developers use really powerful computers, it's that they don't test on cheap, low-end systems anywhere near as much as they should.
This is true, and the reason is that the whole process of refusing to optimize further is completely unconscious. Yes, a programmer can be forced to test on a slow machine. But even then, he will accept as "acceptable" results that he would never accept on his own work computer.
And once the speed has been deemed "acceptable", nothing will make him keep working on it.
Actually, the goal of my post is not to make everyone downgrade their computer (that would be great, but I'm realistic), but to highlight exactly how unconscious this process is, and in this way help people recognize the effect and perhaps counteract it intentionally.
I encourage you to try building webkit-gtk locally
Well, I've never compiled WebKit, but I recently worked on a patch for WINE, where a full build took something like 10 hours. So I understand what you mean. Nevertheless, the patch was committed successfully, and now there is one less bug.
It's not as purely unconscious as you seem to think, at least not for real-world development tasks.
The thing is, as a general rule, optimization is hard (yes, there are some trivial cases, but they are in the minority). It's also very difficult to determine reliably whether any given piece of code could really be more efficient than it already is. Together, this means that once you meet the minimum performance requirements for realistic usability, it's often a better use of your time to work on things that are guaranteed to improve the software than to chase an extra few milliseconds of load time, especially since the latter is far more likely to introduce new bugs in existing functionality.
Now, where that minimum required level of performance sits is not easy to figure out. It varies from project to project, sometimes wildly, and often over time too (for example, in a saturated market, the minimum performance you need is "roughly as good as your most popular competitors"). But it's always there, whether you like it or not.
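One way to make that minimum level of performance explicit rather than a gut feeling is to encode it as a performance budget in a test, so "acceptable" is a fixed number instead of whatever feels fine on the developer's machine. A minimal sketch in Python, where the workload function and the budget value are hypothetical stand-ins:

```python
import time

def render_page(items):
    # Hypothetical workload standing in for the real code under test.
    return "".join(f"<li>{i}</li>" for i in range(items))

# Performance budget in seconds -- agreed per project (ideally measured
# on the slowest hardware you intend to support), not per developer machine.
BUDGET_S = 0.5

def check_budget():
    start = time.perf_counter()
    render_page(100_000)
    elapsed = time.perf_counter() - start
    # Fails loudly instead of letting "seems fast enough" slide by.
    assert elapsed < BUDGET_S, f"too slow: {elapsed:.3f}s over {BUDGET_S}s budget"
    return elapsed
```

The point isn't the specific numbers; it's that once the threshold is written down, a fast workstation can no longer quietly move the goalposts.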