This is because we use high-level languages where the CPU and RAM are completely abstracted away.
It's a bit more subtle than that, but it's certainly part of the problem. From another angle, abstracting the hardware resources is how you get truly portable code, so there's a tradeoff even before you consider developer productivity.
Just to clarify, I'd argue it's the behavior encouraged by high-level languages that's more dangerous than the languages' performance characteristics. As a very small example, some garbage collectors have negligible overhead most of the time, but kill you when you hit bad edge cases. If you didn't know that, you'd never think of it; and a garbage-collected language does exactly that: it encourages you to forget the garbage collector exists.
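To make the "forgotten garbage collector" point concrete, here's a minimal sketch in CPython (an assumption on my part; the comment above doesn't name a specific runtime). CPython's cyclic collector kicks in after every ~700 container allocations by default, whether or not your objects actually form cycles, so a loop that just builds plain dicts still pays for repeated GC passes behind your back:

```python
import gc

# Count how many generation-0 collection passes happen while we
# allocate 200k small dicts. None of these dicts contain cycles,
# but CPython's cyclic GC scans container objects anyway, triggered
# by its default generation-0 allocation threshold (~700).
gc.collect()  # start from a settled state
runs_before = gc.get_stats()[0]["collections"]

data = [{"i": i} for i in range(200_000)]

runs_after = gc.get_stats()[0]["collections"]
print(runs_after - runs_before)  # hundreds of GC passes you never asked for
```

A common workaround in hot allocation loops is to wrap them in `gc.disable()` / `gc.enable()`, which is exactly the kind of thing you only reach for once you remember the collector is there.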
When C came out it was considered a high-level language too. ;-) That aside, Haskell is very high-level and gets close to C performance (GHC compiles to machine code, has full type erasure like C or C++, etc.). I think it has more to do with the fact that people got complacent and assumed Moore's Law would work in their favor forever.
But by that logic, we should technically all be programming in assembly. Find your target machine's instruction set and let's program using that.
When C came out it was considered a high-level language. You would use C, but never for anything where speed mattered. In time that turned around. Same with C++ when it came out.
There are PLENTY of systems you use every day that have great performance but are written in high-level languages such as Python, C#, and others. The idea that a high-level language can't produce optimized code is outdated and, in my mind, should be thrown away.
That's the thing with abstractions: they have their costs, but at the same time they reduce the effort of solving lower-level problems, which frees us to address higher-level ones.