A Senior Developer working mostly with PHP and JavaScript, with a bit of Python thrown in for good measure, all on Linux. My tooling is simple: GitLab and JetBrains where possible.
Whilst it would still initially be slow, if the large files rarely change you could put them in their own repository and include them as a git submodule. That way you get the performance you would expect from Git in your main repo, whilst still being able to version the large files.
Good point, haven't tried this before. It might work for smaller datasets, but it will start breaking down at multi-GB sizes because of Git servers' limits on overall repository size.
Submodules are an interesting beast of their own. But yes, multi-gigabyte repos will still cause issues.