
Automatic Deployment via good ol' FTP

Andreas (devmount) ・ 4 min read

Since their release, GitHub Actions have been on my long-term todo list for increasing the automation of my workflows. Thanks to DEV's GitHub Actions Hackathon, I'm finally tackling this topic.

I'm not really sure if it's something to be ashamed of today, but I still push the build files of most of my personal open source projects manually via good ol' FTP to my server. Maybe I just didn't want to give up too much control over the files I push to production. Or, after doing web development for more than 15 years now, I was simply too lazy to change anything 😅

However, I found an awesome GitHub action that publishes files automatically via FTP to my server on every build.

My Workflow

It's the FTP-Deploy-Action by Sam Kirkland, which utilizes git-ftp. I mostly create Vue.js applications with the Vue CLI, so my normal workflow always looked like this:

  1. ➕ Make code changes (e.g. fixing an important security issue)
  2. 🔨 Test code changes
  3. ✅ Commit these changes to the repository
  4. 🔁 Create new build files optimized for production using vue-cli-service build
  5. ❌ Delete old build files from production server
  6. ⏫ Upload new build files to production server

Especially the last two steps always bothered me: most of the time I was pushing smaller changes that only affected a few files, yet I still deleted and re-uploaded the whole application. And this is where git-ftp really pays off: it uploads only the files that changed since the last upload! This is extremely useful, especially for projects with a lot of files. A few of my PHP projects, for example, use Git submodules, and uploading the whole project on each build would take an incredible amount of time. So my new workflow now looks like this:

  1. ➕ Make code changes (e.g. fixing an important security issue)
  2. 🔨 Test code changes
  3. ✅ Commit these changes to the repository
  4. 🔁 Create new build files optimized for production using vue-cli-service build
  5. Lean back and let the GitHub FTP-Deploy-Action do the rest
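The core idea that makes the last step possible — transferring only what changed — can be sketched roughly as follows. This is a simplified illustration, not git-ftp's actual mechanism: git-ftp records the SHA of the last deployed commit in a log file on the server and diffs against it, rather than hashing individual files as done here.

```python
import hashlib

def changed_files(last_deploy: dict[str, str], workdir: dict[str, bytes]) -> list[str]:
    """Return the paths whose content differs from the last deployed state.

    last_deploy maps path -> content hash from the previous upload;
    workdir maps path -> current file content.
    """
    changed = []
    for path, content in workdir.items():
        digest = hashlib.sha1(content).hexdigest()
        if last_deploy.get(path) != digest:
            changed.append(path)
    return changed

# Hypothetical example: index.html is unchanged, app.js is new
deployed = {"index.html": hashlib.sha1(b"<h1>v1</h1>").hexdigest()}
current = {"index.html": b"<h1>v1</h1>", "app.js": b"console.log(1)"}
to_upload = changed_files(deployed, current)  # only app.js needs uploading
```

A small change therefore triggers a small upload, regardless of how large the project is — which is exactly why the delta approach pays off for submodule-heavy repositories.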

Submission Category

✅ DIY Deployments


So, how can you set up this FTP-Deploy-Action? You simply have to create a workflow file called ftp-deploy.yaml under your-repo/.github/workflows/. This is what my configuration looks like:

```yaml
on:
  push:
    paths:
      - 'dist/*'
name: FTP Deploy
jobs:
  FTP-Deploy-Action:
    name: FTP-Deploy-Action
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2.1.0
        with:
          fetch-depth: 2
      - name: FTP-Deploy-Action
        uses: SamKirkland/FTP-Deploy-Action@master
        with:
          ftp-server: ${{ secrets.ftp_server }}
          ftp-username: ${{ secrets.ftp_username }}
          ftp-password: ${{ secrets.ftp_password }}
          local-dir: dist/
```

In the following, I'll explain every part so you can understand how this works 💡

| Lines | Explanation |
| --- | --- |
| 1–4 | `on: push: paths:` — Only start this action when changes were pushed to the `dist/` directory (the default build folder for the Vue CLI). |
| 5 | `name:` — The name of your GitHub action, shown in the repository's Actions tab on GitHub. |
| 6–15 | `jobs: FTP-Deploy-Action: ...` — The default configuration for this action, according to its documentation. |
| 16 | `with:` — This section allows for further required or optional configuration of the action. |
| 17–19 | `ftp-server:` \| `ftp-username:` \| `ftp-password:` — Obviously, GitHub needs to know your FTP access data: server URL, username and password. Even more obviously, you don't want to store this data in the configuration file itself, but as encrypted secrets. A port number can be appended to the URL if you need it, and you can also specify the security protocol (see the security hint below), e.g. `ftps://your.ftp-server.com:21`. |
| 20 | `local-dir:` — Makes sure that not the whole repository, but only (in my case) the `dist/` directory gets uploaded, where my build files live. |

Bonus: if you want to explicitly exclude some files from being uploaded, you can create a `.git-ftp-ignore` file in the root of your repository, which works the same way as a `.gitignore` file.
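A hypothetical `.git-ftp-ignore` might look like this — the patterns are examples, not taken from my projects:

```
# .git-ftp-ignore — same pattern syntax as .gitignore
*.md
docs/
tests/
dist/**/*.map
```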

Additional Resources / Info

Here are the repositories of the GitHub action and git-ftp:

SamKirkland / FTP-Deploy-Action — Deploys a GitHub project to an FTP server using GitHub Actions

git-ftp / git-ftp — Uses Git to upload only changed files to FTP servers.

Security hint

FTP itself transfers files unencrypted. Therefore it's highly recommended to use FTPS (FTP with TLS) or SFTP (SSH file transfer), which are both supported by git-ftp. Thanks to @lampewebdev for his comment on this topic.
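In the workflow above, switching to an encrypted protocol only means changing the server URL stored in the secret. If you kept the value in plain sight instead (hypothetical host shown here), the relevant fragment would look like this:

```yaml
        with:
          # ftps:// enables FTP over TLS; sftp:// would use SSH file transfer
          ftp-server: ftps://ftp.example.com:21
          ftp-username: ${{ secrets.ftp_username }}
          ftp-password: ${{ secrets.ftp_password }}
```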

Wrap it up

So we've seen that it's fairly simple to let GitHub deploy your build files automatically via FTP. You just need to create one configuration file and set a few repository secrets.

Let me know if you also deploy via FTP and whether this is useful for your own workflows.

Edited: 4th September 2020 (added server URL example and security hint)
Published: 3rd September 2020
Title image: https://codepen.io/devmount/full/qBZPpEM




Just one thing to keep in mind:
FTP is not built to be secure.
FTP sends usernames and passwords in clear text through the network.
FTP is vulnerable to sniffing, spoofing, and brute force attacks, among other basic attack methods.
SFTP or SSH should be used instead.


Totally agree, thank you for this addition. git-ftp supports FTP, FTPS and SFTP; you just have to specify the protocol at the beginning of the URL, e.g. `ftps://your.ftp-server.com`.


Added a corresponding security hint in the article. Thanks again 😊


You should go a step further and let the building (and testing) also be done via GitHub Actions. I've been doing that for a while now and it feels great knowing all I have to do is push my code (and/or release, depending on the repo) for it to spin up, do its thing and deploy it to FTP.


Of course, this would be the logical next step. Do you also use GitHub Actions for that? How do you handle failed tests?


I do use GH Actions for that, yes. And if a test fails I get notified via email and it will simply stop deploying :)

Cool! I looked through your GitHub repos and found one of your yml files - I will take it as a starting point to increase automation of my own workflows - thanks again 😊

Yeah, I think all that is missing from that repo is the testing part. Just run a test command and if it fails, it should stop altogether. The `git reset --hard` command is there because I had an issue earlier with the FTP plugin also uploading files I didn't want to my FTP. Not sure if that is still needed. No problem, and have fun with it!

Thank you for sharing your workflow 👍🏻


Nice 😄 In some cases I upload to GoDaddy Windows hosting manually using the FileZilla FTP client. So you mean using your workflow I can automate that? Also, how can I specify the FTP default port number?


Exactly, you can automate that. The port number is appended to the server URL, separated by a colon, e.g. `ftps://your.ftp-server.com:21`.


I tried to use FTP deploy but it did not fit my needs. I think it is OK for a few files, but when you need to move thousands of small files it is way too slow. FTP's weakness is that it transfers each file individually. Perhaps you could have a process that zips the directory, then FTP transfers the zip, and then some other process on the server unzips it... But then why not just hook directly into Git hooks or something alike?


Oh, I know exactly what you mean! If only FTP had some kind of built-in transport compression/bundling, it would be so much faster.

The good thing about git-ftp is that you only have to upload all your files once. Every following upload will only contain the files that changed since the last upload (which I assume to be not that many files). You can even `git ftp catchup` if your files already exist on your server. But I agree that there are some limitations at some point. So it depends on the project, I guess.
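For anyone who wants to try this outside of GitHub Actions, the relevant git-ftp commands look roughly like this — host, path and credential variables are placeholders:

```shell
# Tell git-ftp the server already has the files of the current commit,
# so only future changes will be uploaded:
git ftp catchup --user "$FTP_USER" --passwd "$FTP_PASS" ftps://ftp.example.com/htdocs/

# Subsequent deploys then transfer only the changed files:
git ftp push --user "$FTP_USER" --passwd "$FTP_PASS" ftps://ftp.example.com/htdocs/
```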


If you don't increase the fetch depth, you'll get screwed here on more than a few commits: you'll end up re-uploading your whole repo, not just the changes.


Yes, you're right. At least the last two commits are required in order to determine the differences, so you can either increase the fetch depth or deploy on every commit. I'm not sure if increasing the fetch depth has any side effects, though...
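One way to sidestep the problem entirely is to fetch the full history in the checkout step. This is a possible variant of the workflow fragment, not what my configuration currently uses:

```yaml
      - uses: actions/checkout@v2.1.0
        with:
          # 0 fetches the full history, so the diff against the last
          # deployed commit can always be computed
          fetch-depth: 0
```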


Whatever they are I'd bet they're worth not risking 2hr deploys by accident if you commit a few too many times before pushing.


Will this approach work for WordPress websites?


It will. However, the first deployment could take some time, since all files have to be uploaded.