DEV Community

jmau111

Posted on • Originally published at blog.julien-maury.dev

Please remove that .git folder

Have you ever tried to browse http://yoursite.com/.git/?

If you get a 403 error, that's expected: it means directory listing is disabled, which is basic security hygiene. However, many files inside the .git/ folder may still be individually accessible, putting you at risk.

.git/ folder leakage: easy exploit

N.B.: use the tool below at your own risk

WangYihang / GitHacker (GitHub)

πŸ•·οΈ A Git source leak exploit tool that restores the entire Git repository, including data from stash, for white-box auditing and analysis of developers' mind

GitHacker

Description

This is a multi-threaded tool to detect whether a site has a .git folder leakage vulnerability. It is able to download the target .git folder almost completely, and it works even when the DirectoryListings feature is disabled. It is worth mentioning that this tool downloads almost all files of the target Git repository and then rebuilds them locally, which makes it state of the art in this area. For example, tools like [githack] simply restore the latest version. With GitHacker's help, you can view the developers' commit history, which gives a better understanding of their character and psychology, so as to lay the foundation for further code auditing.

PROCLAMATION (IMPORTANT)

Several VULNERABILITIES have been reported recently. If you are using GitHacker <= 1.1.0, please update your tool as soon as possible.

The remote .git folder may be malicious, so to prevent you from…

Anyone can use automated scripts such as the repository above to download your source code and view the entire Git history. Git is also a filesystem that follows conventions, so directories and files are easy to guess.

Most projects use master or main as the default branch, so it's easy to guess "hidden" paths in the /.git/ folder. Note that the tool can even brute-force branches and tags if necessary.

If the scan succeeds, you get a result folder on your local machine (you can customize the folder name with the --output-dir option).

A typical result is the equivalent of git checkout master for free!

Don't deploy the .git/ folder or, at least, forbid access

The .git/ folder can contain lots of information: the source code itself, but also names, email addresses, and, in the worst-case scenario, hard-coded credentials (e.g. database passwords, tokens, keys).

For a hacker, it's like Christmas!

You should completely disable public access to this folder. Modern CI/CD and deployment solutions are relatively easy to configure and make it straightforward to exclude directories that have nothing to do with the production environment.

Note that some web hosting providers disable access to the folder for security purposes, but that's not always the case and it's not the default configuration, so check before deploying anything.

I recommend applying every hardening measure available. While it might seem a bit overkill, it's often a good idea to account for any misconfiguration that could occur in the future, or a miscalculated migration, so:

  1. Disable public access to the .git/ folder by default on your server
  2. Add a rule to forbid access to the folder in your source code, for example in the .htaccess file for Apache configurations
  3. Don't deploy the folder to public directories at all, if possible

If you don't want to touch sensitive files such as the main .htaccess, you can add a smaller .htaccess at the root of the .git/ folder on your server with just the following line inside:

Deny from all
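Note that Deny from all is the old Apache 2.2 access-control syntax. If your server runs Apache 2.4 or later, the equivalent (and preferred) directive is:

```apache
# Apache 2.4+ access control syntax
Require all denied
```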

However, it's even better to return a 404 for the .git/ folder in your server config or main .htaccess, so hackers won't be able to guess anything:

RedirectMatch 404 /\.git
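If the site is served by Nginx rather than Apache, a similar rule can go in the server block. This is a sketch; adapt the location pattern to your setup:

```nginx
# Return 404 for the .git folder and anything inside it
location ~ /\.git {
    return 404;
}
```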

Again, I recommend adding both rules if you can, as two layers of security. If someone modifies the main .htaccess and accidentally deletes the rule, there's still a fallback in the .git/ folder.

When migrating from one server to another, misconfigurations and oversights do happen.

Discussion (36)

Corentin Bettiol

Here's my config on my website:

<VirtualHost *:443>

        ServerName ***
        ServerAlias ***

        ServerAdmin ***
        DocumentRoot /var/***/***/***/***

        Options FollowSymLinks MultiViews

        ErrorDocument 404 /***/404
        RedirectMatch 404 /\.git  # <-- THIS LINE HERE

        [...]

And it works well :)

It works!

Corentin Bettiol • Edited on

(I use it in my sites-enabled/ folder because .htaccess files slow down Apache, and my server is not very powerful :P)

You should avoid using .htaccess files completely if you have access to httpd main server config file. Using .htaccess files slows down your Apache http server. Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance.
httpd.apache.org/docs/2.4/howto/ht...
The httpd.conf is parsed one time. If you use .htaccess it'll get hit every time something is called. That'll cause a fairly large performance hit that will just get worse with increasing requests.
stackoverflow.com/a/25064109/6813732

jmau111 Author

Hello, the good news is you can manage it wherever you want, but do you have any figures or test results for the performance impact? I'd be curious to know why .htaccess would slow down the entire architecture, especially given your number of hits.

Corentin Bettiol • Edited on

I haven't done any tests on my server, but I will quote another SO answer:

Also, when [.htaccess is] enabled the server will take a potential performance hit. The reason is because, every server request, if .htaccess support is enabled, when Apache goes to fetch the requested file for the client, it has to look for a .htaccess file in every single directory leading up to wherever the file is stored.
stackoverflow.com/a/29114826/6813732

jmau111 Author

Nice!

Dendi Handian

This is the kind of thing I'd fear if I built code from scratch by myself without understanding web security. For security, I always rely on frameworks.

jmau111 Author

Hi, sorry, I made a wrong move and hid your comment by accident 🤦. It's now available again.

You're right, and I've seen such security misconfigurations a lot with beginners, but it can happen to anyone who relies on only one layer of protection.

Petrus-Nauyoma

I totally get you. Imagine leaking API keys in the env folder. Frameworks really help you get on your feet a bit quicker. Otherwise, all these little but disastrous security details are hard to come by, especially without a patient senior mentor/dev.

Matthijs Wensveen

My advice would be to never have the website root at the project root. Always use a www, html, public, or even src subdirectory. This way, the .git directory and files like README.md are not exposed to the internet.

If you deploy by using git pull on the server and your hosting provider only provides a webroot, this article has good advice. πŸ‘

jmau111 Author

Totally on point: no public access! I wrote this post for those who don't have that in mind. In fact, even if you are on a budget, you don't have to deploy such a folder. It's just more convenient for many people, but not mandatory. There are other ways to sync your code.

The problem with cybersecurity is that you don't always have the best conditions, so many people will tell you "don't put anything sensitive in git, etc.," which is true in a perfect world, but sometimes more difficult to achieve in reality.

I prefer having several layers, and if I can remove that .git folder from the public folder, I'll do it :)

Matthijs Wensveen

Absolutely! Sometimes hosting providers don't give a lot of options on how or where to deploy. Or maybe someone isn't even aware that this is problematic.

Maybe a web.config equivalent of this article would be useful as well, for those who use IIS.
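For reference, and as a sketch only (untested here), the IIS equivalent could use request filtering's hidden segments in web.config:

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <hiddenSegments>
          <!-- IIS returns 404 for any URL containing this path segment -->
          <add segment=".git" />
        </hiddenSegments>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```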

jmau111 Author • Edited on

Nice. I rarely use IIS in my projects, but that's a great idea!

Phil Ashby

Good layered defence approach! I would also advocate for automating deployment to always use a safe export mechanism (eg: git archive) that omits the internals of the source control.
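A deployment step based on git archive might look like the sketch below; the paths and names are illustrative, and it assumes git and tar are available. The key point is that only tracked files, and none of the .git/ internals, land in the webroot:

```shell
set -e

# A throwaway repository standing in for your project
rm -rf srcrepo deploy
git init -q srcrepo
cd srcrepo
echo "<h1>Hello</h1>" > index.html
git add index.html
git -c user.email=dev@example.com -c user.name=dev commit -qm "initial"

# Export a clean snapshot of HEAD into the deploy directory:
# git archive emits a tar of tracked files only, with no .git/ folder
mkdir ../deploy
git archive HEAD | tar -x -C ../deploy
cd ..

ls deploy   # index.html is there, .git/ is not
```

Compared with a plain git pull in the webroot, there is simply nothing for an attacker to download.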

For cloud-hosted deployments, your provider of choice will likely have an arsenal of tools to help deploy stuff (typically workflow pipelines, secret management, versioning, auto-rollbacks, ...).

jmau111 Author • Edited on

Indeed. Nice suggestion.

It's true that providers offer an extensive range of configurations and tools to automate deployments, but it's kinda the same problem if you misuse them or misconfigure security settings.

Besides, if hackers manage to steal access to those interfaces, which happens a lot, it's game over too.

In a nutshell, I agree with you on automation but admins should be careful with their credentials and ensure they understand how the cloud-based system works.

slidenerd • Edited on

Some white-hat hacker reading this should run a brute-force bot to traverse every website out there, make a list of the ones with an exposed .git folder, extract their admin emails, and send them an automated message, or at the very least publish a list similar to haveibeenpwned.com, I think.

jmau111 Author • Edited on

Yeah, people always forget that. In Git data, hackers don't look only for hard-coded keys or database credentials; there is so much other valuable information you can get...

Unfortunately, while your suggestion could make sense, it's hard to determine whether you have a white-hat or a grey-hat approach here (I would say grey). If the website does not include a security.txt, or does not explicitly invite white-hat hackers to test the website (e.g. via a bug bounty) and contact them, it might be considered illegal in many countries.

Sean Cassiere

It's funny, because I've been using cloud providers for deploying my frontends, with the backends requiring me to allow folders to be publicly accessible, and therefore haven't dealt with or even thought about this aspect of security before.

I could definitely see this affecting beginners and students who deploy plain static HTML or PHP websites.

jmau111 Author

Yes, it's easy to git pull your project and forget that :(

Vincent Milum Jr

A better strategy: don't put sensitive credentials in your git repos to begin with.

The .htaccess recommendation is limited because not all servers even support that mechanism. There are methods to do it with other servers, but this becomes a separate security nightmare of making sure the infrastructure never changes over time (which is itself a different bad practice).

Security credentials should be stored in a dedicated secrets manager outside of the repo. This is especially true because dev/staging/test/prod/etc. should all have separate credentials anyway, so mish-mashing them together in the repo with conditions isn't best practice to begin with.

jmau111 Author • Edited on

That's why I said "in the worst-case scenario" when I mentioned "hard-coded credentials." It's not a better strategy; it's another layer of security.

mikcat

I usually add this to the main Apache config:

<DirectoryMatch "/\.git">
   Require all denied
</DirectoryMatch>

jmau111 Author • Edited on

EDIT: your example is better than mine.

Kelvin Clark

Wow this is very informative.

Alex Winder

Rather than just blocking the .git folder, I would suggest structuring your projects so that your application is only accessible from a subfolder, such as public. Then in Apache, Nginx, or another web server, you simply set the DocumentRoot to that public directory. That folder then won't contain .git or any other folders with potentially compromising information.
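With Apache, for example, that could look like this sketch (the domain and paths are illustrative):

```apache
<VirtualHost *:80>
    ServerName example.com
    # Only public/ is exposed; .git/ and other project files stay one level up
    DocumentRoot /var/www/myproject/public
</VirtualHost>
```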

jmau111 Author

Hello,
The post recommends you don't expose the .git folder publicly if you can. In case you can't for some reason, there is a fallback, as an additional layer of protection, to ensure you are not in danger.

Not everybody gets access to Apache configurations, unfortunately, so it's best to provide security for these unsafe situations too.

Alex Winder

Yes, I agree; however, I wouldn't say that this removes all danger. The post doesn't go into other things which may be vulnerable; a few examples include a .env file, a vendor directory, or log files being served from another directory in your project. Preventing access to .git alone is just security by obscurity. A much better solution would be to only grant access to the files/directories which are required for the application to be accessible to users.

I appreciate that not everyone will have access to the web server configuration, but if that is the case, you should perhaps consider finding a host which allows such configuration, or host your own. Not all developers want to learn hosting, but if you are working on your own personal projects, understanding how it works will put you in good stead.

jmau111 Author

It's absolutely not security through obscurity; I think the term might be confusing here. Besides, this is not a full security guide on config files.

It focuses on one aspect that is often neglected and very easy to exploit. It's a practical lab with a concrete example of exploit.

The "you should get better hosting, do it like that" advice is fine, and I'm not saying you're wrong, but it's not very helpful at the end of the day. Yes, we all want better providers and better approaches.

However, in reality, you have to deal with people with various backgrounds, that's why defending in layers is efficient. You use additional layers to avoid unwanted disclosures in case of misconfiguration.

I don't trust other layers blindly.

Alex Winder

This example is exactly security through obscurity. The files are still there; you are just blocking access. What happens if your host decides to disallow/override .htaccess files? All of a sudden, your .git directory is fully accessible to the world again. Or what if there is a bug in the web server? Or perhaps you mistype your .htaccess? Or perhaps your host introduces a bad configuration on the global web server? Or perhaps you upload some other vulnerable code where the user gets access to the DocumentRoot, and now they have access to everything?

If you have lots of different files and directories which are compromisable (as explained in my last comment), then you need to keep adjusting your .htaccess; more configuration changes mean more chances of error, which means a higher chance that your valuable files get leaked.

Rejecting access through your web server doesn't mean that you are completely covered. Vulnerabilities and RCEs/CVEs are discovered daily. The simplest method would be to not include the vulnerable files at all.

Security is improved either by reducing the attack surface or by increasing the number of layers required to reach the valuable information. Bringing it back to the original point, this post gives a good example of how you can mitigate the issue of including a .git directory, but it is far from an ideal solution.

Fongoh Martin T.

Thank you

Ricardo Chan

Great tip!

Randy Knight

Thanks for sharing this excellent git security information and hardening tips.

Pandademic

Lovely article!

Christian Kozalla

I suppose you'd only expose your .git directory if there is no build step before deployment, or if you put the .git directory into the public directory?

Christian Kozalla

I have to admit: I cannot intentionally deploy my .git directory, even if I try! So could you please explain to me, how I would get into such a situation?

jmau111 Author • Edited on

When you deploy your website on the server with a basic git pull and the .git/ folder is publicly accessible. It happens a lot.

Christian Kozalla

Ah, I see!
