Node has been around for more than a decade. Developing on it has been fun, though one of its main drawbacks has always been its bulky package management.
In 2020, create-react-app would create over 30k files, close to 200 MB in total. Even without a boilerplate, the local node_modules folder can get heavy quickly, depending on the selected dependencies.
So, what if you have hundreds of projects?
Do you keep the node_modules folder for archived (completed) projects, or do you delete it to save space and run npm install if you need to revisit a particular project later?
Or have you already switched to Ryan Dahl's newest offering, Deno, which imports and uses dependencies directly from URLs?
Top comments (14)
I use pnpm. Like Deno, it uses a central store where it keeps all the packages and never downloads the same package version twice. The node_modules folder still exists in your local project, but it is filled with symlinks and other magical stuff that point to the central store on your disk.
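In practice that looks roughly like this (a sketch of common pnpm commands, assuming pnpm is installed):

```shell
# Install dependencies; pnpm links packages from a shared
# content-addressable store instead of copying them per project.
pnpm install

# Show where the shared store lives on this disk.
pnpm store path

# Remove packages from the store that no project references anymore.
pnpm store prune
```

Because every project links into the same store, a hundred projects depending on the same lodash version only cost you one copy on disk.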
thanks, that's a nice approach 😉
8k stars and somehow I missed it, lol 😁
There's a package for that:
npkill - The solution to deleting node_modules easily, with style (Carlos Caballero, Aug 8 '19)
Hahah! The irony! 😂😂
That cover image is hilarious 😅
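The npkill tool linked above can be run without a permanent install (the `-d` directory flag is how its docs describe scanning a specific folder):

```shell
# Interactively scan for node_modules folders and delete the ones you pick.
npx npkill

# Scan a specific directory instead of the current one.
npx npkill -d ~/projects
```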
I only realize how big the project's node_modules has become when I am backing up my work folder. At that point I usually delete it, especially if the project has a package-lock.json file, which at this point most do.
For me it has been a single shell command that deletes them all in one go.
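The commenter's exact command isn't shown, but a commonly used one-liner for this job looks like the following (an assumption, not the original):

```shell
# Recursively delete every node_modules folder under the current directory.
# -prune stops find from descending into a node_modules folder before
# deleting it, so nested node_modules inside dependencies aren't re-visited.
find . -name node_modules -type d -prune -exec rm -rf '{}' +
```

Run it from the root of your projects folder, and double-check the path first: rm -rf is unforgiving.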
node_modules are the bane of everything. They bring in huge amounts of executable code which we know nothing about. They are full of dependencies; simple upgrades can cause lots of issues, losing weeks of time. Error messages are ridiculous. Etc., etc.
The way I deal with this is to really know the npm commands, in particular npm info packageName, and npm ci (when things are bad). npm search helps too. And knowing when to ignore peer dependency errors.
Also, there are times to delete the package-lock file, as it has a memory which will override your intent.
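The commands mentioned above, roughly as they would be used (the package names here are just placeholders):

```shell
# Inspect a package's metadata (versions, dependencies, dist-tags)
# before adding it to a project.
npm info lodash

# Search the registry from the command line.
npm search http-client

# Clean install: wipe node_modules and install exactly what
# package-lock.json specifies. Useful in CI or when things are bad.
npm ci
```

npm ci is the one worth memorizing: unlike npm install, it never mutates the lockfile, so you get a reproducible tree every time.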
I currently actually bite the bullet and keep multiple copies of typescript, even in a monorepo. (I don't use Lerna or Yarn workspaces, since I want to avoid shared dependencies, especially between frontend (run by the browser) and backend (run by Node.js).)
I'm trying to ask the question here.
I would also consider pnpm, but I would have to be absolutely sure it doesn't break in CI or Docker.
I keep my codebase on GitHub. I have no need to keep my code on the local drive all the time. Whenever I need to make some changes, I clone my repo, install dependencies with npm, make the changes, and push them back to the repo. Done. If I am not working on a particular project, I don't keep it on my disk.
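That workflow sketched out (the repo URL and branch name are placeholders):

```shell
# Clone, restore dependencies, work, push, and clean up.
git clone https://github.com/you/some-project.git
cd some-project
npm install                      # rebuild node_modules from the lockfile

# ...make changes...
git commit -am "Fix the thing"
git push origin main

cd .. && rm -rf some-project     # nothing left on disk afterwards
```

The lockfile committed to the repo is what makes this safe: npm install (or better, npm ci) reproduces the same dependency tree you had before deleting it.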
I recently started using pnpm, but there are a few utilities out there to clean up node_modules directories. You can even do it with a shell script; just be careful not to delete the node_modules of actual third-party Node programs.
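A cautious version of such a script lists each node_modules folder with its size first, so you can review before deleting anything (a sketch using standard find/du/sort):

```shell
# List every node_modules folder under the current directory with its
# size, largest first. Review the list before running any rm -rf.
find . -name node_modules -type d -prune -print0 | xargs -0 du -sh | sort -rh
```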
I use autarky, and I rarely even think about node_modules anymore, perhaps only when I want to tweak a dependency to make a PR or to learn a use case. With TypeScript, the latter has reduced considerably.
I deal with it by not using anything from react ecosystem. Seems to work just fine.
Also, you could take a look at yarnpkg.com/getting-started; since version 2, Yarn has started to address how to handle dependencies as a whole.