Only code goes into a repository, right?


I was wondering: how do you deal with API keys, config files, precompiled binaries, test data, and other environment-specific or oversized data? And what about the Word documents, PDFs, and emails that should be kept too?

If it is a small project, the easiest thing is to commit almost everything to the repository and keep the remaining files somewhere else, like Dropbox or Google Drive, or simply on the local hard drive. But this does not scale to larger projects, or to projects where more people are involved.

There has to be a better way. Is there? How do you manage your files?

 

That's a great question.

For API keys and all sensitive configuration files we use KeePass. And we have a script to put those files back in the project once we've checked it out of source control. That one I think we got right.
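A minimal sketch of such a restore step, assuming the secret files live in a private directory outside the repo (populated from KeePass); the directory layout and file names here are hypothetical:

```shell
#!/bin/sh
# Sketch: copy secret files kept outside the repo back into a fresh
# checkout. restore_secrets SECRETS_DIR REPO_DIR
restore_secrets() {
    secrets_dir=$1
    repo_dir=$2
    # Hypothetical file names; list whatever your project gitignores.
    for f in app.config api_keys.json; do
        if [ -f "$secrets_dir/$f" ]; then
            mkdir -p "$repo_dir/config"
            cp "$secrets_dir/$f" "$repo_dir/config/$f"
            echo "restored config/$f"
        else
            echo "missing $secrets_dir/$f" >&2
            return 1
        fi
    done
}
```

Failing loudly when a secret file is missing is the useful part: a fresh checkout tells you immediately what it needs instead of breaking at runtime.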

Other than that we have:

  • A project folder on a shared drive. Specs and design documents go there.
  • Files ending up attached to a ticket in a ticket system (Jira / Trello / ...)
  • Email attachments
  • Files on people's drives

There is definitely room for improvement. If you want to find a certain document, ticket systems and private folders are not going to help.

For myself, I try to copy as many files as possible to my docs/{project}/{issue} folders. So yes, I'm one of the "files on people's drives" guys. I'd better share that resource.

I wonder if any company has this fully figured out, so I would be happy to learn how others do it as well :)

 
 

Here's a discussion on how orgs typically keep their secrets:

dev.to/ben/how-does-your-organizat...

And while it's an atypical approach, you could encrypt and check your secrets in:

dev.to/davidk01/encrypt-and-check-...

As @courier10pt alluded to, git is a pretty generic tool and you could keep almost anything in it. Deciding what should go there is a matter of figuring it out over time between you and your collaborators.
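In practice, that "figuring it out" usually ends up encoded in a .gitignore file. A hedged example covering the kinds of files the question mentions (the specific patterns and paths are placeholders, not a recommendation):

```gitignore
# Secrets and environment-specific config
*.key
secrets.yml

# Precompiled binaries
*.exe
*.dll

# Large test data kept elsewhere
/testdata/large/

# Documents that live on the shared drive instead
*.docx
*.pdf
```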

 

That's too broad a question; you should be more specific. There are many factors to consider for each resource:

  • how often it changes
  • who needs it

For example: a minified JS file is bad if it changes every 10 minutes; a PDF document for developers' eyes only is OK, but not if it's also needed by an outsourced QA team; a 200 MB binary file that never changes is fine by me; but if you have 30 such files, maybe Dropbox is better.

My rule is common sense.
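"Common sense" can also be made mechanical. A sketch of a pre-commit-style size check (the 5 MB limit is an arbitrary assumption; pick what fits your project):

```shell
# Sketch: flag files larger than a limit before they get committed,
# so the size rule is enforced rather than remembered.
max_bytes=$((5 * 1024 * 1024))   # 5 MB; adjust to taste

check_size() {
    # wc -c counts bytes; tr strips padding some platforms add
    size=$(wc -c < "$1" | tr -d ' ')
    if [ "$size" -gt "$max_bytes" ]; then
        echo "$1 is $size bytes; consider Dropbox or Git LFS instead" >&2
        return 1
    fi
}
```

Wired into a git pre-commit hook, this turns "who needs it / how often it changes" judgment calls into a visible failure instead of a bloated history.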

 

Look at the Symfony way of managing this.

Keep in mind there are two files:

A local one, parameters.yml, where you store the real API keys your code uses.
A committed one, parameters.dist.yml, where you store default placeholder values like 'toOverride'.
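A minimal sketch of the two files (the parameter names and values here are placeholders):

```yaml
# parameters.dist.yml — committed, holds safe defaults to override
parameters:
    api_key: toOverride
    database_password: toOverride

# parameters.yml — local only, gitignored, holds the real values
parameters:
    api_key: sk_live_example123
    database_password: s3cret
```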

When you pull a new version of parameters.dist.yml, you have to update your local parameters.yml, either manually or by running "composer install" on the CLI.

This solution scales to any project: new staff just copy your parameters.yml (via chat or in person) and modify it for their own use.

Any file with personal information should not be pushed to the repo, even if the repo is private!

 

I love this question; it's one I keep asking myself every time I start a new project and want to put everything into a git repository.

As a rule of thumb, I try to keep only code-related files in my repo: HTML, CSS, JS, scripts, config, and whatever language I'm using for my backend.

But API keys and config contents can and probably will change between deployments, so I tend to store those values elsewhere and substitute them in whenever I want, be it at build, launch, or deploy time. I have been doing C# lately, and we use Octopus for this.
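The substitute-at-deploy idea can be sketched without any particular tool. Here is a minimal version using sed to fill Octopus-style #{Name} tokens from environment variables; the token and variable names are purely illustrative:

```shell
# Sketch: a config template holds #{Name} tokens, and the deploy step
# renders it by filling the tokens in from the environment.
render_config() {
    sed -e "s|#{ApiKey}|${API_KEY}|g" \
        -e "s|#{DbPassword}|${DB_PASSWORD}|g" "$1"
}
```

The template is what goes in the repo; the real values only ever exist in the deployment environment.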

For binary files, my opinion is to never put them in your repo, mainly because I like my repos to be light and easy to clone. Using a CDN to keep your images and large files somewhere you don't have to worry about at deploy time can be a plus, and it's the solution I would prefer for web development.

For team-related documents and documentation I have a stronger opinion: DO NOT use your repo for these. The main reason is that non-technical people will want to access and use them. Google Drive or Dropbox seem like a good fit for those types of files.

Hope this helps a bit :)

 

If your team members are not familiar with git or hg, use Dropbox. Dropbox has version history; use it as a fallback in emergencies.

 

Dropbox has access to the keys that encrypt the files you put on it. So you have to encrypt your files yourself before storing them on Dropbox. But then, how do you pass around the secrets required to decrypt?

I prefer GPG-based solutions for sending sensitive data. The only problem is that people are scared off by a lot of the tools. Keybase is a decent one so far, though it still has some warts on the UI side. (It's much easier if you understand some of its design decisions regarding GPG.)

Our solution for long-lived secrets is PwSafe: we rotate its access key frequently (I think we're up to monthly) and disseminate that key to the team via GPG.
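The disseminate-via-GPG step, sketched as a self-contained demo. It generates a throwaway recipient key in an isolated keyring so it won't touch your real one; in real use you would encrypt to a teammate's existing public key, and all names here are made up:

```shell
#!/bin/sh
# Demo: encrypt a secret so only one recipient's private key can read it.
set -eu
tmp=$(mktemp -d)
export GNUPGHOME="$tmp/gnupg"       # isolated keyring for the demo
mkdir -p "$GNUPGHOME" && chmod 700 "$GNUPGHOME"

# Throwaway recipient key; normally you'd import the teammate's key.
gpg --batch --passphrase '' \
    --quick-generate-key 'alice@example.com' default default never

printf 'pwsafe-access-key\n' > "$tmp/secret.txt"
gpg --batch --armor --encrypt --trust-model always \
    -r alice@example.com -o "$tmp/secret.asc" "$tmp/secret.txt"

# Only the recipient's private key can decrypt this.
gpg --batch --decrypt "$tmp/secret.asc"
```

The armored .asc file is safe to send over email or chat; the risk moves from the channel to the recipient's private key.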

For deployment, we mainly use KMS keys and super-restrictive roles. For our long-term CD strategy, we're working on choosing a more scalable solution (like Vault).
