I can relate very much to the feeling that somewhere along the way something fundamentally changed in the field of computing.
I suspect the reason for this, and my inability to really pin it down, has something to do with a fundamental problem of grasping exponential growth. That outlook might also be a generational experience. I started out in 1995 with an already outdated 80286 (for which my uncle had no use anymore). For the next several years, each successor machine I owned more than doubled its immediate predecessor in clock speed, memory, and so on. But Moore's law reached saturation a few years ago; since then neither the number of cores nor the clock rate has increased significantly, and frankly I stopped bothering.
I have more than once wondered what would have happened if I had entered the field after that plateau was already reached. Would I ever have gotten into programming if it had not been a necessity for doing anything really fun with my first computer?
Commercialization, as you rightly pointed out, is a double-edged sword. Computers are now a commodity, or even a utility, more akin to electricity and water than to most physical products. Commercialization managed to lower the barrier to entry and raise it at the same time, because the industry found it more profitable to lure users into a permanently locked-in, vendor-dependent, albeit comfortable, position. It transformed what was a maker culture into a consumer culture.
Everybody in this community is firmly on the maker side, which also places us firmly, and probably permanently, in an absolute minority position. The majority is not to blame for lacking perspective, because we, as a field and an industry, have worked hard to create a silo. We are still, in the words of Bob Barton, the high priests of a low cult. Call me naive, but I won't give up on the thought that better ways of thinking about and doing things are still to be discovered.