Dog Dad, Coffee drinker, Lover of Justice and Equity.
I'm a DevOps Engineer working in AWS with PowerShell, Python, IaC, and more.
My side-projects deep dive into Linux and Docker.
@sleibrock This is valuable insight: you really show how complicated and ugly things get, and fast. And it's true, our world of software and technology is a mess, and you bring real clarity to a messy topic.
But to talk to @mariamarsh real quick, there are a couple of things I want to point out:
I assume you wrote this in reference to Windows computers. Windows is my background, and it's honestly a frustrating and messy OS, for many reasons I won't get into. It's not Linux, and it's definitely not the prime pick for running containers. But it is an extremely complicated software platform, one where I've touched components I didn't even know existed and watched them affect things I didn't know they could. However, since Windows 7, Microsoft has much improved its act, and Windows is a much more reliable platform.
Where are you getting your 99.9% figure from? How do you calculate resource waste? Is this in regard to software that runs on Windows, or are you talking about the OS itself? Or both?
I don't think file size is necessarily the end-all, be-all indicator of resource usage or waste. For example, compare the file size of the exact same data stored as .csv and as .xlsx (Excel): which one is smaller? File compression and other factors play into this.
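If you want to test this yourself, here's a minimal Python sketch (assuming the third-party openpyxl package is installed; file names are arbitrary examples) that writes the same table as .csv and as .xlsx and compares the sizes on disk. An .xlsx is XML inside a ZIP container, so deflate compression can make the "heavier" format the smaller file:

```python
import csv
import os
from openpyxl import Workbook  # pip install openpyxl

rows = [["id", "value"]] + [[i, i * 3.14] for i in range(10_000)]

# Plain-text CSV: no compression, one byte per character.
with open("sample.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# XLSX: XML records inside a ZIP container, deflate-compressed.
wb = Workbook()
ws = wb.active
for row in rows:
    ws.append(row)
wb.save("sample.xlsx")

print("csv :", os.path.getsize("sample.csv"), "bytes")
print("xlsx:", os.path.getsize("sample.xlsx"), "bytes")
```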
Libraries. Yes, there are built-in libraries in the OS, but that's not the end of the story. What about different versions of the same library or piece of software? I think it was Windows XP where you had to download different versions of .NET and .NET components for every piece of software. And the same thing applies to every language and dependency version.
Version mismatch and dependency hell are very real things that cause issues, from personal to corporate environments, to this day. All software is built on the software before it, and once you dive into what dependencies everything is built on, you'll never reach the bottom. Do you remember how one developer removed left-pad from npm and broke the internet? That's the situation we're in with all of our software, on any operating system. This is a joke tweet, but also so very true:
"the most consequential figures in the tech world are half guys like steve jobs and bill gates and half some guy named ronald who maintains a unix tool called 'runk' which stands for Ronald's Universal Number Kounter and handles all math for every machine on earth" - twitter.com/6thgrade4ever/status/1...
The last thing is: if you're upset with Windows, try Linux if you can (some software can't be ported easily or doesn't have an equivalent counterpart). There's never been a better time to run Linux on the desktop, headless, or on ARM devices. It definitely outperforms Windows in regard to resource management.
I'm a 25 y.o. expert in Web/App Design & Development with 7+ years of experience.
Love my muffin and banana ice cream. Practice running & yoga in my spare time. Support me: https://ko-fi.com/mariamarsh
This is a big problem with any modern operating system, whether it's Linux or Windows.
XUbuntu eats up 500-800 MB just after startup, and it needs 1.5-2 GB for any significant work. Win2K started up and ran in 128 MB, WinXP in 256 MB. Well, that's not entirely fair, because those were 32-bit systems: to address the full memory space today we need 64-bit addresses, so pointers are twice as wide and everything that stores them grows. Even so, XUbuntu differs from WinXP by a factor of 8. In fairness, a Linux workstation still doesn't eat up more than the conventional 1-2 GB after startup, while Win10/11 can easily eat 2-3 GB.
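If you want to check your own numbers right after startup, here's a tiny Linux-only Python sketch that reads /proc/meminfo (the field names are the standard kernel ones):

```python
# Linux-only: /proc/meminfo reports memory in kB.
def meminfo_kb(field: str) -> int:
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1])
    raise KeyError(field)

total = meminfo_kb("MemTotal")
available = meminfo_kb("MemAvailable")
print(f"In use: {(total - available) / 1024:.0f} MiB of {total / 1024:.0f} MiB")
```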
On Linux, all the same problems manifest in exactly the same way: just try to build any open source project, and it will immediately pull in billions of the same open source libraries. And many of them are needed only for the sake of one or two functions.
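You can watch this fan-out for yourself with nothing but the Python standard library: walk the declared dependencies of any installed package ("requests" here is just an example, substitute anything you have installed):

```python
import re
from importlib.metadata import requires, PackageNotFoundError

def walk(pkg: str, depth: int = 0, seen: set | None = None) -> None:
    # Print the dependency tree of an installed package, depth-first.
    seen = set() if seen is None else seen
    if pkg.lower() in seen:
        return
    seen.add(pkg.lower())
    print("  " * depth + pkg)
    try:
        deps = requires(pkg) or []
    except PackageNotFoundError:
        return  # declared but not installed; stop here
    for dep in deps:
        if "extra ==" in dep:  # skip optional extras for brevity
            continue
        name = re.match(r"[A-Za-z0-9._-]+", dep)
        if name:
            walk(name.group(0), depth + 1, seen)

walk("requests")
```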
If it were not for SSDs, plentiful RAM, and the hardware instructions in processors and their multithreading, the operation of computers running any modern OS would be a sad sight.
The main resources are eaten not by the bare machine but by the applications on it. Websites are almost entirely crap code, and one page "weighs" a hundred megabytes; this needs to be optimized on the server side, otherwise nothing will change. IDEA, VSCode, and a bunch of other applications eat about the same (a lot), almost regardless of the OS. Another example for you: JetBrains Toolbox, a little application for downloading and updating IDEs, eats up 200-500 MB of RAM. What? How? Why?
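Don't take my word for the numbers; here's a rough Linux-only Python sketch that sums the resident memory of every process whose name contains a fragment (the "code" fragment is just an example, substitute "toolbox" or whatever you run):

```python
import os

def rss_mb(name_fragment: str) -> float:
    # Sum VmRSS across all /proc entries whose command name matches.
    total_kb = 0
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/comm") as f:
                if name_fragment not in f.read():
                    continue
            with open(f"/proc/{pid}/status") as f:
                for line in f:
                    if line.startswith("VmRSS:"):
                        total_kb += int(line.split()[1])
        except (FileNotFoundError, PermissionError):
            continue  # process exited or is off-limits; skip it
    return total_kb / 1024

print(f"matching processes: {rss_mb('code'):.0f} MiB resident")
```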
Dependency hell also exists on Linux: I would not install two different versions of openssl or libjpeg without "dancing with a tambourine" (i.e., elaborate workarounds). Look at the npm and Composer dependencies of any site. jQuery used to be all the JS you needed, but what about now? A node_modules folder can easily reach several gigabytes, and then the bundler chokes on too many files and falls over. Great!
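If you doubt the "several gigabytes" claim, point this little Python sketch at any project's node_modules and see for yourself (the path is a placeholder):

```python
import os

def dir_stats(root: str):
    # Walk a directory tree, totalling file sizes and counting files.
    total_bytes, file_count = 0, 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                total_bytes += os.path.getsize(os.path.join(dirpath, name))
                file_count += 1
            except OSError:
                continue  # broken symlink etc.
    return total_bytes, file_count

size, count = dir_stats("node_modules")
print(f"{count} files, {size / 1024**3:.2f} GiB")
```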
As for the 99.9%: maybe I'm exaggerating, but the absolutely irrational loss of resources applies to both software and the OS.
Thank you for your questions!
I also advise you to read the comments of other users; there are a lot of interesting thoughts and opinions there.
"If it were not for the SSD, the available RAM, and the hardware instructions in the processors and their multi-threading, the operation of computers running any modern OS would be a sad sight." Well, yes, things are written for the current hardware standards of the day. People(developers, commuters, pedestrians) will "fill the space" of where they are. People naturally use the tools at their fingertips.
Nothing you've described is particularly new to me, but it feels like you're just describing the state of software in 2022. So, since I'm not sure what you're comparing everything to, I have to ask:
What do you think the state of software should look like?
What does good resource management look like to you, both from an OS perspective and a software perspective?
What do you think are reasonable specs for computers (CPU, RAM, disk space, etc.)?
P.S. - Linux dependency hell is particularly frustrating because if you try to update your packages, and one of those packages was installed by pip or depends on something installed by pip, the package manager can fail to update anything at all.
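A minimal sketch of the usual way around that, assuming you keep pip entirely out of the system Python: give each project its own virtual environment with the stdlib venv module (the directory name here is arbitrary):

```python
import venv

# Creates an isolated environment with its own pip.
venv.create("project-env", with_pip=True)
# Then: source project-env/bin/activate && pip install <packages>
# System packages stay under apt/dnf control; pip packages stay in the venv.
```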
It feels like you are a very curious young man!
$500 and we'll face you in a 1v1 Discord battle to see who wins, the Dark Side or the Light Side. You will be in the role of Darth Vader.
But I have a condition: I will take my father Chewbacca with me!
Some of these conversations can be tagged under the "static linking versus dynamic linking" category, and others probably file under "software bloat". What is your approach to application development with respect to static/dynamic linking? Ship with deps, or ship targeting deps on a host environment?
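To make the distinction concrete, here's a small Python sketch of the "target deps on the host" end of the spectrum: resolving a shared library from the host at runtime with ctypes (Linux-only; libm is a safe example). The static-linking end of the spectrum would be compiling or bundling that same code into the artifact you ship:

```python
import ctypes
import ctypes.util

# Ask the host system where its math library lives (e.g. "libm.so.6" on Linux),
# then load it at runtime. We ship nothing; we depend on the host providing it.
libm_path = ctypes.util.find_library("m")
libm = ctypes.CDLL(libm_path)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(2.0))  # computed by the host's libm, not code we bundled
```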