
Why don't operating systems and browsers limit shared resources more strictly?

Andy Zhao (he/him) ・1 min read

Apps can consume RAM at high rates and the operating system doesn't always intervene. It can become a tragedy of the commons. The same is true for browser tabs and their use of memory and storage. Operating systems and browsers all throttle this to some extent, but runaway consumption still ends up being the end user's problem to deal with.

Just wondering why this is and any thoughts on where things might go in the future.

Discussion (7)
 

As of right now, Chrome tabs will crash if they use more than 1 GB of RAM. To me, that limit is actually a bug. Firefox will kill tabs that are blocked on the main thread for too long. As for RAM, I feel that tabs should be able to use as much as the system is willing to give them. A lot of work has been done to make it feasible for bigger and bigger applications to come to the web, and some applications do need more than 20k to get their job done. If we ever want to see a day where apps like Photoshop are web apps, then browsers need to let them do their thing. Browsers could maybe do better at warning users when a tab goes above a resource threshold, but the user should be the only final gatekeeper of their machine and how its resources are used.
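It's worth noting that on Unix-like systems the OS already has a mechanism for hard per-process memory caps; it just doesn't apply one by default, leaving the user (or the browser) as the gatekeeper. A minimal sketch using Python's stdlib `resource` module (Unix-only; the 1 GiB cap here is an arbitrary example value):

```python
import resource

GIB = 1024 ** 3

# Cap this process's address space at 1 GiB (soft limit);
# keep the existing hard limit unchanged.
_, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (1 * GIB, hard))

blocked = False
try:
    buf = bytearray(2 * GIB)  # this allocation exceeds the cap
except MemoryError:
    blocked = True  # the kernel refused the allocation
```

So the "should the OS limit RAM?" question is really about defaults and UX, not capability: the enforcement primitive exists, but nothing opts processes into it.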

 

Like the way browsers now make it clearer which sites use HTTPS and which do not, I think a resource-hog indicator could be useful.

 

How do you think this might be implemented? As a notification, the way mobile does it? Or maybe some always-on-screen, task-manager-esque indicator?

I'd think something like this could be sufficient, with more info on hover and the ability to ignore the warning.

You know how tabs have a little loudspeaker icon if they're playing sound? Maybe something like that if they go over a threshold of (say) > 50% resources, where "resources" means any of RAM, disk, IO, etc.
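The core of that indicator logic is tiny. A hypothetical sketch (the function name, threshold, and `usage` shape are all made up for illustration): given each resource as a fraction of the system total, flag whatever crosses the 50% line suggested above.

```python
THRESHOLD = 0.5  # flag any resource a tab uses above 50% of the system total

def flag_hogs(usage):
    """Return the names of resources this tab is hogging.

    `usage` maps a resource name ("ram", "disk", "io", ...) to the
    fraction of the system total the tab is consuming (0.0 to 1.0).
    """
    return [name for name, frac in usage.items() if frac > THRESHOLD]

# A tab hogging RAM and IO, but barely touching disk:
tab = {"ram": 0.62, "disk": 0.03, "io": 0.55}
hogs = flag_hogs(tab)  # -> ["ram", "io"]
```

The hard part isn't this check; it's attributing shared resources (GPU memory, disk cache, child processes) to a single tab in the first place.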

 

Throttling sounds like a great idea for Electron apps and browser tabs. Less so for video games and code compilers. I think the incentives here are for the OS to under-regulate rather than develop a reputation for being slow.

Moore's Law seems to be dead in the water, while software development is becoming more and more abstract. Something's got to give, right? I could imagine a UAC-like system where programs request the user's approval to use more than, say, 5% of their CPU or RAM.
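That UAC-like gate could be sketched as a simple policy function (everything here is hypothetical: the function name, the 5% cap, and the `ask_user` prompt callback are illustration only):

```python
APPROVAL_CAP = 0.05  # below 5% of CPU/RAM, no prompt needed

def request_resources(app_name, requested_frac, ask_user):
    """Decide whether `app_name` may use `requested_frac` of a resource.

    Small requests are granted silently; anything above the cap is
    escalated to the user via the `ask_user(message) -> bool` callback,
    much like a UAC elevation prompt.
    """
    if requested_frac <= APPROVAL_CAP:
        return True
    return ask_user(
        f"{app_name} wants {requested_frac:.0%} of your CPU/RAM. Allow?"
    )

# A background sync stays under the cap; a compiler needs the user's OK.
request_resources("bg-sync", 0.03, ask_user=lambda msg: False)   # granted silently
request_resources("compiler", 0.80, ask_user=lambda msg: True)   # user approved
```

The usual objection is prompt fatigue: as with UAC itself, if every compile or game launch triggers a dialog, users learn to click "Allow" reflexively.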

 

Mobile did this from the get-go—at least for memory, not so much storage. I'm very curious about why the desktop hasn't gone in this direction much.

Chrome now throttles RAM more than it used to, but things can still be kind of brutal, especially now with apps that each act as their own browser instance.
