re: The node_modules problem


Why copy it at all? The power of npm is that you can recreate the exact environment described in your package.json file with one command. Copying the node_modules directory is pointless.

 

I ended up deleting all the node_modules folders before copying, because copying would have been impossible otherwise. But my first thought was convenience: I see my projects folder with dozens of project folders, and the natural instinct is to select everything, zip it, and copy it to my backup location.
My main point in the article is that other languages handle this better: they provide what you described, recreating the environment, without adding a dependency folder with thousands of files and folders to every project.

 

If your code is under source control, what is the need to copy it for backup?

I think that's where your problem lies; your article is really about one of the symptoms of not using source control.

node_modules is not designed for moving around. With all due respect, I think you're just using it in a way it was never intended.

I never intended to move them around; copying and pasting every folder in my projects directory was just the natural action. Once I remembered, I canceled the copy, deleted all the node_modules folders, and started the copy again. The problem still remains, though: node_modules has too many files, and it is heavy to download and install. npm tends to take a long time to finish here. I think my point stands.

I understand that you come from a different environment (Java/C#). Those were also my first languages, before I learned JavaScript through jQuery/React/Node.js. I had to learn that there are different philosophies in play here.

Most of the modules are made by hobbyists in their free time. There are good ways to keep libraries small, e.g. avoid duplication by using peerDependencies in package.json, or strip away unnecessary files during publish. But you will have to get used to there being a lot of small modules (Unix style: doing one thing and doing it well).
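Both of those mechanisms live in package.json. A minimal sketch (the package name and version ranges here are made up for illustration): `peerDependencies` tells npm that the consumer is expected to provide react, rather than each library bundling its own copy, and the `files` whitelist limits what `npm publish` ships.

```json
{
  "name": "my-widget-lib",
  "version": "1.0.0",
  "peerDependencies": {
    "react": ">=16"
  },
  "files": [
    "dist",
    "README.md"
  ]
}
```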

Instead of working against the framework/community/tools, please try to understand them first, and then try to come up with improvements in the form of PRs, issues, and encouraging posts about how to do it right. I don't think your current attitude will help you succeed in becoming a better Node.js developer, and the community needs more good developers. Thank you for understanding.

I put a bunch of my projects in Dropbox, and it spends TONS of time syncing node_modules which is completely pointless. If anyone knows if there's a way to tell Dropbox to ignore node_modules folders I'm all ears...

I do use git in certain cases but I don't always need a git repo for every JavaScript project.

I understand. But I would not recommend using Dropbox as source control, because you run into exactly the problems you just mentioned.

When using Azure DevOps, like in my case, adding a git repo can be as fast as creating a project and saving it to Dropbox.

PS: Dropbox fills up fast when you're not paying for it!
PPS: Use a .gitignore file, but I guess you knew that.
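For completeness, ignoring node_modules in git really is a one-line .gitignore entry. A small sketch in a throwaway directory (the demo paths and file names are made up):

```shell
# Demo: ignore node_modules in a fresh repo (throwaway directory).
set -e
cd "$(mktemp -d)"
git init -q .
echo "node_modules/" >> .gitignore   # one line is all it takes
mkdir -p node_modules/lodash && touch node_modules/lodash/index.js
touch app.js
git status --porcelain               # shows .gitignore and app.js, not node_modules/
```

After this, git treats everything under node_modules/ as invisible, so it never gets committed or pushed.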

@mroggy85 :

"If your code is under source control, what is the need to copy it for backup?"

"I think that's where your problem lies; your article is really about one of the symptoms of not using source control."

Not every project is meant to be on GitHub and a local repository won't fix the problem.

The problem is that you have to manually delete the node_modules folder in a potentially unlimited number of project folders, and then reinstall them all after moving.

Maybe you can do that with a little CLI magic, but that's not really user friendly and will take its time...
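For what it's worth, that CLI magic can be a single `find` command. A sketch, run here against a throwaway demo directory standing in for a real projects folder:

```shell
# Delete every node_modules folder under a projects directory in one go.
# PROJECTS is a throwaway demo directory standing in for a real ~/projects.
set -e
PROJECTS="$(mktemp -d)"
mkdir -p "$PROJECTS/app-a/node_modules/lodash" \
         "$PROJECTS/app-b/node_modules" \
         "$PROJECTS/app-b/src"
touch "$PROJECTS/app-b/src/index.js"

# -prune keeps find from descending into each node_modules before -exec deletes it
find "$PROJECTS" -type d -name node_modules -prune -exec rm -rf {} +
```

Source files outside node_modules are left untouched, so the folder is then safe to zip and copy.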

I understand that it can be a little inconvenient the one time you move your project. But I would not call moving project files around a common use case. I would suggest reviewing your development process instead.

There are lots of steps between GitHub and a local repo. For example, for small one-off projects I'll start with a local repo and then clone it on my other dev machine (each repo is a remote for the other, so I can easily sync changes back and forth). As a project grows, it can be pushed to the gitolite install I have on my media PC (which was just a random machine I had lying around that was always on; a Raspberry Pi would do just as well). Only when I want other people to look at it do I push to GitHub.

Even if you're just using local repos, git still helps you out: you only need to back up the .git directory. To restore, you put it back in place and run git checkout -- . (or git reset --hard). No backing up of node_modules necessary!
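The back-up-only-.git workflow described above can be sketched end to end in a throwaway directory (all paths and the demo file are made up; a real backup would copy .git somewhere durable):

```shell
# Sketch: back up only the .git directory, then restore the working tree from it.
set -e
BASE="$(mktemp -d)"
mkdir -p "$BASE/myapp" && cd "$BASE/myapp"
git init -q .
echo "console.log('hi');" > index.js
git add index.js
git -c user.email=demo@example.com -c user.name=demo commit -qm "initial"

# "Backup": keep only the repository data
cp -r .git "$BASE/backup.git"

# Simulate a fresh machine: an empty project folder plus the backed-up .git
cd "$BASE" && rm -rf myapp && mkdir myapp && cd myapp
cp -r "$BASE/backup.git" .git
git checkout -- .     # restores index.js from the repository
```

The restored working tree matches the last commit; anything reproducible (like node_modules) is simply reinstalled instead of backed up.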

 

Backing up your computer or replacing your hard drive is not working against the framework/community/tools. Those are common tasks, and if a particular tool makes them awful for users because of some design decisions, then the users are not to blame and are allowed to complain.

 

I do it when I archive my projects, because there is no guarantee that npm will still be around, say, 20 years from now.
